Dec 05 16:15:38 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 16:15:38 crc restorecon[4755]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 16:15:38 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 16:15:39 crc 
restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 16:15:39 crc restorecon[4755]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 05 16:15:39 crc kubenswrapper[4756]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 16:15:39 crc kubenswrapper[4756]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 16:15:39 crc kubenswrapper[4756]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 16:15:39 crc kubenswrapper[4756]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 05 16:15:39 crc kubenswrapper[4756]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 16:15:39 crc kubenswrapper[4756]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.876087 4756 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878759 4756 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878776 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878780 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878784 4756 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878787 4756 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878791 4756 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878795 4756 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878800 4756 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878804 4756 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878808 4756 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878813 4756 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878818 4756 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878822 4756 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878827 4756 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878832 4756 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878843 4756 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878849 4756 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878853 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878857 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878861 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878864 4756 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878868 4756 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878872 4756 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878875 4756 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878879 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878882 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878886 4756 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878890 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878893 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878897 4756 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878900 4756 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878904 4756 feature_gate.go:330] unrecognized feature gate: Example Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878908 4756 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878911 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878915 4756 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878918 4756 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878922 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878926 4756 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878929 4756 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878933 4756 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878938 4756 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878943 4756 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878947 4756 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878952 4756 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878956 4756 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878960 4756 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878965 4756 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878969 4756 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878972 4756 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878976 4756 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878979 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878983 4756 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878987 4756 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878991 4756 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878995 4756 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.878999 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879002 4756 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879006 4756 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879009 4756 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879012 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879016 4756 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879019 4756 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879022 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879026 4756 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879029 4756 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879033 4756 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879037 4756 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879041 4756 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879045 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879049 4756 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.879053 4756 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879254 4756 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879263 4756 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879271 4756 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879277 4756 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879282 4756 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879286 4756 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879293 4756 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879298 4756 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879303 4756 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879307 4756 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879311 4756 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879316 4756 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879320 4756 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879324 4756 flags.go:64] FLAG: --cgroup-root="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879328 4756 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879332 4756 flags.go:64] FLAG: --client-ca-file="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879337 4756 flags.go:64] FLAG: --cloud-config="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879341 4756 flags.go:64] FLAG: --cloud-provider="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879345 4756 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879350 4756 flags.go:64] FLAG: --cluster-domain="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879354 4756 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879358 4756 flags.go:64] FLAG: --config-dir="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879362 4756 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879366 4756 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 
16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879372 4756 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879375 4756 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879379 4756 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879384 4756 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879388 4756 flags.go:64] FLAG: --contention-profiling="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879392 4756 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879396 4756 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879400 4756 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879405 4756 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879411 4756 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879430 4756 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879435 4756 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879439 4756 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879443 4756 flags.go:64] FLAG: --enable-server="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879447 4756 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879453 4756 flags.go:64] FLAG: --event-burst="100" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879458 4756 flags.go:64] FLAG: --event-qps="50" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879462 4756 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879466 4756 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879470 4756 flags.go:64] FLAG: --eviction-hard="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879476 4756 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879481 4756 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879485 4756 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879489 4756 flags.go:64] FLAG: --eviction-soft="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879494 4756 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879499 4756 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879503 4756 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879507 4756 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879511 4756 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879515 4756 flags.go:64] FLAG: 
--fail-swap-on="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879519 4756 flags.go:64] FLAG: --feature-gates="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879525 4756 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879529 4756 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879533 4756 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879537 4756 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879542 4756 flags.go:64] FLAG: --healthz-port="10248" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879546 4756 flags.go:64] FLAG: --help="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879552 4756 flags.go:64] FLAG: --hostname-override="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879556 4756 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879560 4756 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879565 4756 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879569 4756 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879574 4756 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879578 4756 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879582 4756 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879590 4756 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879593 4756 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879598 4756 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879604 4756 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879608 4756 flags.go:64] FLAG: --kube-reserved="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879613 4756 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879617 4756 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879622 4756 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879626 4756 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879630 4756 flags.go:64] FLAG: --lock-file="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879634 4756 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879639 4756 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879644 4756 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879652 4756 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879657 4756 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 16:15:39 crc 
kubenswrapper[4756]: I1205 16:15:39.879662 4756 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879667 4756 flags.go:64] FLAG: --logging-format="text" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879672 4756 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879677 4756 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879683 4756 flags.go:64] FLAG: --manifest-url="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879688 4756 flags.go:64] FLAG: --manifest-url-header="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879695 4756 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879701 4756 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879707 4756 flags.go:64] FLAG: --max-pods="110" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879713 4756 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879718 4756 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879723 4756 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879728 4756 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879734 4756 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879738 4756 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879743 4756 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879753 4756 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879760 4756 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879764 4756 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879768 4756 flags.go:64] FLAG: --pod-cidr="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879796 4756 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879803 4756 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879808 4756 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879812 4756 flags.go:64] FLAG: --pods-per-core="0" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879816 4756 flags.go:64] FLAG: --port="10250" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879821 4756 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879825 4756 flags.go:64] FLAG: --provider-id="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879829 4756 flags.go:64] FLAG: --qos-reserved="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879834 4756 flags.go:64] FLAG: --read-only-port="10255" Dec 05 16:15:39 crc 
kubenswrapper[4756]: I1205 16:15:39.879838 4756 flags.go:64] FLAG: --register-node="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879842 4756 flags.go:64] FLAG: --register-schedulable="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879846 4756 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879854 4756 flags.go:64] FLAG: --registry-burst="10" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879858 4756 flags.go:64] FLAG: --registry-qps="5" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879862 4756 flags.go:64] FLAG: --reserved-cpus="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879866 4756 flags.go:64] FLAG: --reserved-memory="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879871 4756 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879876 4756 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879880 4756 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879884 4756 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879889 4756 flags.go:64] FLAG: --runonce="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879893 4756 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879898 4756 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879903 4756 flags.go:64] FLAG: --seccomp-default="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879907 4756 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879911 4756 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879915 4756 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879919 4756 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879924 4756 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879928 4756 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879933 4756 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879937 4756 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879943 4756 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879947 4756 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879952 4756 flags.go:64] FLAG: --system-cgroups="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879956 4756 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879962 4756 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879966 4756 flags.go:64] FLAG: --tls-cert-file="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879971 4756 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 16:15:39 
crc kubenswrapper[4756]: I1205 16:15:39.879976 4756 flags.go:64] FLAG: --tls-min-version="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879980 4756 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879984 4756 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879988 4756 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879992 4756 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.879996 4756 flags.go:64] FLAG: --v="2" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.880002 4756 flags.go:64] FLAG: --version="false" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.880007 4756 flags.go:64] FLAG: --vmodule="" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.880016 4756 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.880020 4756 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880145 4756 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880149 4756 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880154 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880157 4756 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880162 4756 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880166 4756 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880170 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880174 4756 feature_gate.go:330] unrecognized feature gate: Example Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880177 4756 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880182 4756 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
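[editor's note] The flags.go:64 records above dump every effective command-line flag as FLAG: --name="value". Folding that dump into a dictionary makes it easy to diff two boots or two nodes; a sketch under the same journal.log assumption:

```python
import re

with open("journal.log", encoding="utf-8") as fh:  # assumed dump of this journal
    text = fh.read()

# flags.go:64 records look like: FLAG: --cgroup-driver="cgroupfs"
flags = dict(re.findall(r'FLAG: (--[\w-]+)="([^"]*)"', text))

print(flags["--node-ip"])                     # 192.168.126.11 in this boot
print(flags["--container-runtime-endpoint"])  # /var/run/crio/crio.sock
```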
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880187 4756 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880191 4756 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880195 4756 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880200 4756 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880203 4756 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880209 4756 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880212 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880216 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880219 4756 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880223 4756 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880227 4756 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880231 4756 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880235 4756 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880238 4756 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880242 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880245 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880249 4756 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880253 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880256 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880260 4756 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880263 4756 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880267 4756 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880271 4756 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880275 4756 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880280 4756 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880284 4756 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880288 4756 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880292 4756 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880296 4756 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880300 4756 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880303 4756 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880307 4756 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880311 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880315 4756 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880319 4756 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880324 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880328 4756 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880333 4756 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880337 4756 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880340 4756 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880345 4756 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880350 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880354 4756 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880358 4756 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880362 4756 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880365 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880370 4756 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880375 4756 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880378 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880382 4756 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880386 4756 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880390 4756 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880394 4756 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880397 4756 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880401 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880405 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880408 4756 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880412 4756 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880430 4756 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880434 4756 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.880437 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.880450 4756 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.888194 4756 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.888383 4756 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889288 4756 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889348 4756 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889362 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889373 4756 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889385 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889397 4756 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags 
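[editor's note] Each pass closes with a feature_gate.go:386 record carrying the effective gate map in Go's map[...] literal form, as in the summary above. A sketch that lifts the first such literal into a Python dict of booleans (same assumed journal.log):

```python
import re

with open("journal.log", encoding="utf-8") as fh:  # assumed dump of this journal
    text = fh.read()

# feature_gate.go:386 logs: feature gates: {map[Name:true Other:false ...]}
literal = re.search(r"feature gates: \{map\[([^\]]+)\]\}", text).group(1)
gates = {name: value == "true"
         for name, value in (item.split(":") for item in literal.split())}

print(sorted(name for name, on in gates.items() if on))
# ['CloudDualStackNodeIPs', 'DisableKubeletCloudCredentialProviders',
#  'KMSv1', 'ValidatingAdmissionPolicy'] for this boot
```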
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889408 4756 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889457 4756 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889469 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889479 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889491 4756 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889502 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889513 4756 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889523 4756 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889534 4756 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889558 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889570 4756 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889581 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889591 4756 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889601 4756 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889611 4756 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889622 4756 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889633 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889643 4756 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889653 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889665 4756 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889676 4756 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889686 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889697 4756 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889709 4756 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889720 4756 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889731 4756 feature_gate.go:330] unrecognized feature gate: 
VolumeGroupSnapshot Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889743 4756 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889754 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889765 4756 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889775 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889785 4756 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889801 4756 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889814 4756 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889826 4756 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889836 4756 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889847 4756 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889859 4756 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889868 4756 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889880 4756 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889890 4756 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889900 4756 feature_gate.go:330] unrecognized feature gate: Example Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889911 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889921 4756 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889931 4756 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889941 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889956 4756 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889972 4756 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889983 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.889994 4756 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890006 4756 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890019 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890030 4756 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890040 4756 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890055 4756 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890070 4756 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890083 4756 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890097 4756 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890110 4756 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890121 4756 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890131 4756 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890142 4756 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890152 4756 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890162 4756 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890174 4756 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890184 4756 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.890203 4756 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890680 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890702 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 
05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890714 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890725 4756 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890737 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890748 4756 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890764 4756 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890778 4756 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890789 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890801 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890811 4756 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890823 4756 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890834 4756 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890844 4756 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890854 4756 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890865 4756 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890876 4756 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890887 4756 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890898 4756 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890908 4756 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890919 4756 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890929 4756 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890940 4756 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890950 4756 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890961 4756 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890971 4756 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.890981 4756 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 
16:15:39.890992 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891002 4756 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891013 4756 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891022 4756 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891033 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891043 4756 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891053 4756 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891063 4756 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891073 4756 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891084 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891094 4756 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891104 4756 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891114 4756 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891126 4756 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891137 4756 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891147 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891160 4756 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891174 4756 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891186 4756 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891199 4756 feature_gate.go:330] unrecognized feature gate: Example Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891210 4756 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891221 4756 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891231 4756 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891243 4756 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891253 4756 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891263 4756 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891274 4756 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891284 4756 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891294 4756 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891304 4756 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891315 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891327 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891337 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891351 4756 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891363 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891375 4756 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891387 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891399 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891410 4756 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891456 4756 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891468 4756 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891479 4756 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891489 4756 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.891498 4756 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.891513 4756 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.892298 4756 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.898088 4756 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.898277 4756 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
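[editor's note] The records that follow show client certificate rotation starting: the current cert expires 2026-02-24, the rotation deadline already passed on 2025-11-24, so the kubelet rotates immediately, and the first CSR POST fails with connection refused because api-int.crc.testing:6443 is not serving yet this early in boot. A sketch of the date arithmetic, with the two timestamps transcribed from the certificate_manager record below (fractional seconds trimmed):

```python
from datetime import datetime, timezone

fmt = "%Y-%m-%d %H:%M:%S %z"
expiration = datetime.strptime("2026-02-24 05:52:08 +0000", fmt)
deadline = datetime.strptime("2025-11-24 02:43:04 +0000", fmt)
boot = datetime(2025, 12, 5, 16, 15, 39, tzinfo=timezone.utc)  # journal clock

print(deadline < boot)    # True -> "Rotating certificates" fires at once
print(expiration - boot)  # ~80 days of validity left, so the refused CSR
                          # is a retry candidate, not fatal
```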
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.899311 4756 server.go:997] "Starting client certificate rotation"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.899357 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.899582 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-24 02:43:04.089007724 +0000 UTC
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.899681 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.908344 4756 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 16:15:39 crc kubenswrapper[4756]: E1205 16:15:39.910066 4756 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.910971 4756 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.922119 4756 log.go:25] "Validated CRI v1 runtime API"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.946555 4756 log.go:25] "Validated CRI v1 image API"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.949033 4756 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.952013 4756 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-16-10-53-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.952085 4756 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.977143 4756 manager.go:217] Machine: {Timestamp:2025-12-05 16:15:39.974767195 +0000 UTC m=+0.213440797 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6be86e52-5e8b-4c16-8ed7-a526f092135c BootID:9b87c017-f071-4b1f-b157-701524c44075 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:5e:bf:13 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5e:bf:13 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:79:6b:4b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a7:23:00 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6d:22:66 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a2:fe:1a Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:56:17:22 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:86:95:72:cf:70:2e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6e:fc:a2:83:d8:bb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.977611 4756 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.977817 4756 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.978511 4756 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.978895 4756 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.978967 4756 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.979352 4756 topology_manager.go:138] "Creating topology manager with none policy"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.979372 4756 container_manager_linux.go:303] "Creating device plugin manager"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.979745 4756 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.979817 4756 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.980216 4756 state_mem.go:36] "Initialized new in-memory state store"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.980382 4756 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.981703 4756 kubelet.go:418] "Attempting to sync node with API server"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.981740 4756 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.981780 4756 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.981804 4756 kubelet.go:324] "Adding apiserver pod source"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.981828 4756 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.984074 4756 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.984561 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Dec 05 16:15:39 crc kubenswrapper[4756]: E1205 16:15:39.984685 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.984605 4756 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.984553 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Dec 05 16:15:39 crc kubenswrapper[4756]: E1205 16:15:39.984818 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.985686 4756 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986501 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986548 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986564 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986579 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986602 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986617 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986644 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986676 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986694 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986709 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986729 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.986744 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.987072 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.987812 4756 server.go:1280] "Started kubelet"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.987933 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.988279 4756 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.988332 4756 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.989307 4756 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 05 16:15:39 crc systemd[1]: Started Kubernetes Kubelet.
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.990747 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.990804 4756 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.991295 4756 server.go:460] "Adding debug handlers to kubelet server"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.991877 4756 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 05 16:15:39 crc kubenswrapper[4756]: E1205 16:15:39.991889 4756 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.991932 4756 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.991950 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 07:49:03.061161534 +0000 UTC
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.992101 4756 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 05 16:15:39 crc kubenswrapper[4756]: W1205 16:15:39.992661 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Dec 05 16:15:39 crc kubenswrapper[4756]: E1205 16:15:39.992734 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Dec 05 16:15:39 crc kubenswrapper[4756]: E1205 16:15:39.991500 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e5de481d21309 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 16:15:39.987768073 +0000 UTC m=+0.226441655,LastTimestamp:2025-12-05 16:15:39.987768073 +0000 UTC m=+0.226441655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.992991 4756 factory.go:55] Registering systemd factory
Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.993028 4756 factory.go:221] Registration of the systemd container factory successfully
refused" interval="200ms" Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.996705 4756 factory.go:153] Registering CRI-O factory Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.996739 4756 factory.go:221] Registration of the crio container factory successfully Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.996801 4756 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.996822 4756 factory.go:103] Registering Raw factory Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.996838 4756 manager.go:1196] Started watching for new ooms in manager Dec 05 16:15:39 crc kubenswrapper[4756]: I1205 16:15:39.998010 4756 manager.go:319] Starting recovery of all containers Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003233 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003282 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003293 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003305 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003315 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003327 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003337 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003346 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 
16:15:40.003357 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003367 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003405 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003429 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003440 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003452 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003470 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003479 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003490 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003500 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003509 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003518 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003529 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003539 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003550 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003561 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003570 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003580 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003635 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003647 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003657 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003666 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003676 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003736 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003746 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003755 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003786 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003796 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003825 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003837 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003846 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003855 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003866 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003877 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003886 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003896 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003906 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003919 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003929 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003940 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003972 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003982 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.003993 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004004 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004020 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004032 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004043 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004054 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004064 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004074 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004762 4756 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004794 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004811 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004828 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004841 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004853 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004866 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004878 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004888 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004901 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004911 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004922 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004939 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004952 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004964 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004976 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004986 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.004999 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005011 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005023 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005034 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005045 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005057 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005068 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005081 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005094 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005105 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005118 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005132 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005146 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005159 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005171 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005182 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005195 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005210 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005229 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005243 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005255 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005268 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005281 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005292 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005306 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005319 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005331 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005343 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005357 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005371 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005403 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005436 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005468 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005481 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005494 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005508 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005522 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005534 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005549 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005562 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005575 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005586 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005601 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005613 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005626 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005637 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005649 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005660 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005671 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005689 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005701 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005713 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005729 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005740 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" 
seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005755 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005770 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005780 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005813 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005826 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005840 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005853 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005865 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005881 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005895 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005906 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" 
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005930 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005943 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005956 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005969 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005982 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.005996 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006008 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006019 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006032 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006046 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006060 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 
16:15:40.006073 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006085 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006100 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006112 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006125 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006140 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006154 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006169 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006182 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006197 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006211 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006224 
4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006242 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006257 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006272 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006286 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006300 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006315 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006329 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006341 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006355 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006371 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006385 4756 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006399 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006446 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006462 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006476 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006488 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006501 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006516 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006527 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006540 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006552 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006565 4756 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006578 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006592 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006605 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006620 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006632 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006644 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006657 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006669 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006682 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006695 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006707 4756 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006720 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006730 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006740 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006750 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006759 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006768 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006777 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006786 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006798 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006807 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006817 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006826 4756 reconstruct.go:97] "Volume reconstruction finished" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.006833 4756 reconciler.go:26] "Reconciler: start to sync state" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.016953 4756 manager.go:324] Recovery completed Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.026954 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.029659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.029714 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.029732 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.033965 4756 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.033991 4756 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.034019 4756 state_mem.go:36] "Initialized new in-memory state store" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.041067 4756 policy_none.go:49] "None policy: Start" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.042088 4756 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.042156 4756 state_mem.go:35] "Initializing new in-memory state store" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.045685 4756 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.048255 4756 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.048303 4756 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.048354 4756 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.048451 4756 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 16:15:40 crc kubenswrapper[4756]: W1205 16:15:40.049843 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.049918 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError" Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.092924 4756 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.115583 4756 manager.go:334] "Starting Device Plugin manager" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.115657 4756 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.115671 4756 server.go:79] "Starting device plugin registration server" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.116077 4756 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.116098 4756 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.116311 4756 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.116391 4756 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.116404 4756 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.128781 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.149499 4756 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.149618 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.150392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.150455 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.150468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.150585 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.150796 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.150823 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.151248 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.151302 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.151314 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.151510 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.151525 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.151534 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.151648 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.151682 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.152184 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.153700 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.153751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.153767 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.153730 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.153863 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.153875 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.153977 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.154067 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.154096 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.154876 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.154896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.154905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.154978 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.155106 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.155131 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.155405 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.155440 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.155452 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.155886 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.155908 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.155915 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.156010 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.156030 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.156620 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.156664 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.156676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.157219 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.157249 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.157259 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.195615 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="400ms" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.209645 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.209852 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.209937 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210009 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210079 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210172 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210251 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210393 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210548 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210578 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210627 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210659 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210698 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210722 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.210745 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.216485 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.217796 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.217845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.217877 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.217912 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.218604 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.312044 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.312600 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.312786 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.312701 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.312359 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.312867 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.313352 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.313752 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.313983 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314154 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314353 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314616 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314925 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315146 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315335 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315714 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.313847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314854 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314223 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315011 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" 
(UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.313466 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315200 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314432 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315438 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314056 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315631 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.314697 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.315792 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.418801 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.421485 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.421558 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.421577 
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.421624 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.422288 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.472785 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.491047 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.499615 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: W1205 16:15:40.502779 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-00697704014c9b3585f6e1021a89b886c310a90174dd605087672c667e6d4ecf WatchSource:0}: Error finding container 00697704014c9b3585f6e1021a89b886c310a90174dd605087672c667e6d4ecf: Status 404 returned error can't find the container with id 00697704014c9b3585f6e1021a89b886c310a90174dd605087672c667e6d4ecf
Dec 05 16:15:40 crc kubenswrapper[4756]: W1205 16:15:40.511529 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7060229ecdec67d83aaf8c7ec3e2e46372a2a7b22dffdc0a9f192ddb05fdc060 WatchSource:0}: Error finding container 7060229ecdec67d83aaf8c7ec3e2e46372a2a7b22dffdc0a9f192ddb05fdc060: Status 404 returned error can't find the container with id 7060229ecdec67d83aaf8c7ec3e2e46372a2a7b22dffdc0a9f192ddb05fdc060
Dec 05 16:15:40 crc kubenswrapper[4756]: W1205 16:15:40.518149 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7554736282f677ea16b20a01759707e0032ecea19eecfe759ef0074827a50ec0 WatchSource:0}: Error finding container 7554736282f677ea16b20a01759707e0032ecea19eecfe759ef0074827a50ec0: Status 404 returned error can't find the container with id 7554736282f677ea16b20a01759707e0032ecea19eecfe759ef0074827a50ec0
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.523441 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.529749 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: W1205 16:15:40.545197 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e6243c06a23190ab480f95fb5bdce51a4d5389ca16a65de2112c6fb42d250fff WatchSource:0}: Error finding container e6243c06a23190ab480f95fb5bdce51a4d5389ca16a65de2112c6fb42d250fff: Status 404 returned error can't find the container with id e6243c06a23190ab480f95fb5bdce51a4d5389ca16a65de2112c6fb42d250fff
Dec 05 16:15:40 crc kubenswrapper[4756]: W1205 16:15:40.549095 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-67292c8bfd41f131cfa672aa0fde5b58714caaf0a85a253eeaaa226c8cb49eda WatchSource:0}: Error finding container 67292c8bfd41f131cfa672aa0fde5b58714caaf0a85a253eeaaa226c8cb49eda: Status 404 returned error can't find the container with id 67292c8bfd41f131cfa672aa0fde5b58714caaf0a85a253eeaaa226c8cb49eda
Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.597030 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="800ms"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.823248 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.825435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.825484 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.825502 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.825536 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.825967 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc"
Dec 05 16:15:40 crc kubenswrapper[4756]: W1205 16:15:40.952896 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Dec 05 16:15:40 crc kubenswrapper[4756]: E1205 16:15:40.953031 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.988917 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
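The register/refuse pairs above repeat until the API server comes up: the kubelet simply POSTs its Node object and tries again on the next sync. A rough equivalent with client-go (the kubeconfig path and fixed retry delay are assumptions, not the kubelet's exact values):

```go
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// registerNode keeps POSTing the Node object until the API server
// accepts it, mirroring the "Attempting to register node" /
// "Unable to register node" pairs in the log.
func registerNode(cs kubernetes.Interface, name string) {
	node := &corev1.Node{ObjectMeta: metav1.ObjectMeta{Name: name}}
	for {
		_, err := cs.CoreV1().Nodes().Create(context.TODO(), node, metav1.CreateOptions{})
		if err == nil || apierrors.IsAlreadyExists(err) {
			fmt.Println("node registered")
			return
		}
		fmt.Println("Unable to register node with API server:", err)
		time.Sleep(time.Second) // wait for the next sync, then retry
	}
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	registerNode(kubernetes.NewForConfigOrDie(cfg), "crc")
}
```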
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.992488 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 02:22:54.898785282 +0000 UTC
Dec 05 16:15:40 crc kubenswrapper[4756]: I1205 16:15:40.992541 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 538h7m13.906246802s for next certificate rotation
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.056049 4756 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1" exitCode=0
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.056117 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.056191 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"00697704014c9b3585f6e1021a89b886c310a90174dd605087672c667e6d4ecf"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.056281 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.057439 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.057468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.057477 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.059632 4756 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6" exitCode=0
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.059743 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.059829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"67292c8bfd41f131cfa672aa0fde5b58714caaf0a85a253eeaaa226c8cb49eda"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.059955 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.060959 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.060993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.061002 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.061777 4756 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1b76cab0eed4b440c96aefe25fa6077a101ba03b3b6df2d337a94152f55d4d0f" exitCode=0
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.061829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1b76cab0eed4b440c96aefe25fa6077a101ba03b3b6df2d337a94152f55d4d0f"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.061851 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e6243c06a23190ab480f95fb5bdce51a4d5389ca16a65de2112c6fb42d250fff"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.061906 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.066108 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.066165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.066184 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.066503 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.066535 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7554736282f677ea16b20a01759707e0032ecea19eecfe759ef0074827a50ec0"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.069015 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613" exitCode=0
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.069040 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.069055 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7060229ecdec67d83aaf8c7ec3e2e46372a2a7b22dffdc0a9f192ddb05fdc060"}
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.069127 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.069864 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.069902 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.069911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.073075 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.073908 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.073935 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.073943 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:41 crc kubenswrapper[4756]: W1205 16:15:41.133326 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Dec 05 16:15:41 crc kubenswrapper[4756]: E1205 16:15:41.133448 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Dec 05 16:15:41 crc kubenswrapper[4756]: W1205 16:15:41.171582 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Dec 05 16:15:41 crc kubenswrapper[4756]: E1205 16:15:41.171687 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Dec 05 16:15:41 crc kubenswrapper[4756]: E1205 16:15:41.242990 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e5de481d21309 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 16:15:39.987768073 +0000 UTC m=+0.226441655,LastTimestamp:2025-12-05 16:15:39.987768073 +0000 UTC m=+0.226441655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 16:15:41 crc kubenswrapper[4756]: E1205 16:15:41.398279 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="1.6s"
Dec 05 16:15:41 crc kubenswrapper[4756]: W1205 16:15:41.535487 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Dec 05 16:15:41 crc kubenswrapper[4756]: E1205 16:15:41.535674 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.626987 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.628178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.628203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.628211 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:41 crc kubenswrapper[4756]: I1205 16:15:41.628395 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 16:15:41 crc kubenswrapper[4756]: E1205 16:15:41.629400 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc"
Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.021108 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.081191 4756 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128" exitCode=0
Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.081269 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128"}
Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.081463 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.085175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.085210 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.085222 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.087429 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.087444 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.087511 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.088331 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.088351 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.088359 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.089456 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a967f6919586e3fb43b7e4b1236d537baffff065430838b580204b39243ef564"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.089540 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.090196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.090242 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.090281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.091847 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.091883 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.091899 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a"} 
Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.091885 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.092758 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.092800 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.092817 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.093922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.093946 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.093955 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29"} Dec 05 16:15:42 crc kubenswrapper[4756]: I1205 16:15:42.093964 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e"} Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.102393 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961"} Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.102542 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.104048 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.104096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.104116 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.106030 4756 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9" exitCode=0 Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.106087 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9"} Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.106171 4756 
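Every pod started so far is a static pod: etcd, kube-apiserver, kube-controller-manager, the scheduler and kube-rbac-proxy-crio all come from manifests on local disk, which is why the kubelet can run them while the API server is still unreachable. A quick way to see what is defined, assuming the conventional staticPodPath (a CRC node may configure a different directory):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Static pods come from manifest files on local disk, not from the
// API server. /etc/kubernetes/manifests is the conventional
// staticPodPath; the exact path is an assumption here.
func main() {
	dir := "/etc/kubernetes/manifests"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read staticPodPath:", err)
		return
	}
	for _, e := range entries {
		fmt.Println("static pod manifest:", filepath.Join(dir, e.Name()))
	}
}
```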
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.106171 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.106204 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.106220 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.106212 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107694 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107726 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107815 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107838 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107849 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.107864 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.230044 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.231324 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.231356 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.231367 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.231391 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 05 16:15:43 crc kubenswrapper[4756]: I1205 16:15:43.243034 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.113730 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e"}
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.113825 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22"}
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.113840 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b"}
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.113887 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.113924 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.113983 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.115406 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.115466 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.115479 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.116110 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.116151 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:44 crc kubenswrapper[4756]: I1205 16:15:44.116164 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.100058 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.123119 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf"}
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.123200 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.123265 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.123328 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.123197 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4"}
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124378 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124410 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124432 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124733 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124753 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124758 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124779 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.124836 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.295334 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.295599 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.296926 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.296966 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.296977 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:45 crc kubenswrapper[4756]: I1205 16:15:45.790818 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.126271 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.126271 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.128038 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.128094 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.128117 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.128228 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.128285 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.128303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.140258 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.140341 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.141397 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.141477 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:46 crc kubenswrapper[4756]: I1205 16:15:46.141495 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:47 crc kubenswrapper[4756]: I1205 16:15:47.095544 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:15:47 crc kubenswrapper[4756]: I1205 16:15:47.131061 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:47 crc kubenswrapper[4756]: I1205 16:15:47.132207 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:47 crc kubenswrapper[4756]: I1205 16:15:47.132241 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:47 crc kubenswrapper[4756]: I1205 16:15:47.132250 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:48 crc kubenswrapper[4756]: I1205 16:15:48.100918 4756 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 05 16:15:48 crc kubenswrapper[4756]: I1205 16:15:48.101011 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.116148 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.116348 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.117676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.117716 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.117728 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.124051 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.136147 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.137103 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.137152 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.137168 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.492556 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.492734 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.493970 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.494000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:49 crc kubenswrapper[4756]: I1205 16:15:49.494008 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:50 crc kubenswrapper[4756]: E1205 16:15:50.128934 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 05 16:15:51 crc kubenswrapper[4756]: I1205 16:15:51.990641 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 05 16:15:52 crc kubenswrapper[4756]: E1205 16:15:52.023258 4756 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.744324 4756 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.744733 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
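The start-of-body in the 403 probe failures above is a standard Kubernetes Status object: the probe runs unauthenticated, and system:anonymous is forbidden from /livez until the apiserver has finished bootstrapping the RBAC that normally permits it. Decoding just the useful fields from that exact body:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// status captures the subset of a Kubernetes Status object that the
// probe failure body contains.
type status struct {
	Kind    string `json:"kind"`
	Status  string `json:"status"`
	Message string `json:"message"`
	Reason  string `json:"reason"`
	Code    int    `json:"code"`
}

func main() {
	// The body verbatim from the log entry above.
	body := `{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}`
	var s status
	if err := json.Unmarshal([]byte(body), &s); err != nil {
		panic(err)
	}
	fmt.Printf("%s %d: %s\n", s.Reason, s.Code, s.Message)
}
```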
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.750935 4756 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.751071 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.769918 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.770116 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.771526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.771597 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:52 crc kubenswrapper[4756]: I1205 16:15:52.771618 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:53 crc kubenswrapper[4756]: I1205 16:15:53.252862 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 05 16:15:53 crc kubenswrapper[4756]: I1205 16:15:53.252985 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:53 crc kubenswrapper[4756]: I1205 16:15:53.254316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:53 crc kubenswrapper[4756]: I1205 16:15:53.254358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:53 crc kubenswrapper[4756]: I1205 16:15:53.254375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:55 crc kubenswrapper[4756]: I1205 16:15:55.798391 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:15:55 crc kubenswrapper[4756]: I1205 16:15:55.798692 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:55 crc kubenswrapper[4756]: I1205 16:15:55.800034 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:55 crc kubenswrapper[4756]: I1205 16:15:55.800079 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:55 crc kubenswrapper[4756]: I1205 16:15:55.800090 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:55 crc kubenswrapper[4756]: I1205 16:15:55.803328 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:15:56 crc kubenswrapper[4756]: I1205 16:15:56.151022 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 05 16:15:56 crc kubenswrapper[4756]: I1205 16:15:56.153563 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 05 16:15:56 crc kubenswrapper[4756]: I1205 16:15:56.155153 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:15:56 crc kubenswrapper[4756]: I1205 16:15:56.155224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:15:56 crc kubenswrapper[4756]: I1205 16:15:56.155261 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:15:56 crc kubenswrapper[4756]: I1205 16:15:56.173243 4756 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Dec 05 16:15:57 crc kubenswrapper[4756]: E1205 16:15:57.740100 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.743195 4756 trace.go:236] Trace[31418025]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 16:15:42.813) (total time: 14929ms):
Dec 05 16:15:57 crc kubenswrapper[4756]: Trace[31418025]: ---"Objects listed" error: 14929ms (16:15:57.743)
Dec 05 16:15:57 crc kubenswrapper[4756]: Trace[31418025]: [14.929224645s] [14.929224645s] END
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.743248 4756 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.743587 4756 trace.go:236] Trace[32257804]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 16:15:43.452) (total time: 14290ms):
Dec 05 16:15:57 crc kubenswrapper[4756]: Trace[32257804]: ---"Objects listed" error: 14290ms (16:15:57.743)
Dec 05 16:15:57 crc kubenswrapper[4756]: Trace[32257804]: [14.290958903s] [14.290958903s] END
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.743628 4756 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.744876 4756 trace.go:236] Trace[283467167]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 16:15:44.675) (total time: 13069ms):
Dec 05 16:15:57 crc kubenswrapper[4756]: Trace[283467167]: ---"Objects listed" error: 13069ms (16:15:57.744)
Dec 05 16:15:57 crc kubenswrapper[4756]: Trace[283467167]: [13.069577151s] [13.069577151s] END
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.744912 4756 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.746098 4756 trace.go:236] Trace[780774477]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 16:15:42.947) (total time: 14798ms):
Dec 05 16:15:57 crc kubenswrapper[4756]: Trace[780774477]: ---"Objects listed" error: 14798ms (16:15:57.745)
Dec 05 16:15:57 crc kubenswrapper[4756]: Trace[780774477]: [14.798332018s] [14.798332018s] END
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.746141 4756 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.746149 4756 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 05 16:15:57 crc kubenswrapper[4756]: E1205 16:15:57.747529 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.796714 4756 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55174->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.796764 4756 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55180->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.796802 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55174->192.168.126.11:17697: read: connection reset by peer"
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.796835 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55180->192.168.126.11:17697: read: connection reset by peer"
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.797230 4756 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.797268 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.989909 4756 apiserver.go:52] "Watching apiserver"
Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.993922 4756 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
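The Trace/"Caches populated" entries are client-go reflectors completing their initial LIST after roughly 14 seconds of retries against the recovering apiserver; from this point the kubelet's informer caches are warm. A minimal informer that blocks on the same sync point (the kubeconfig path is illustrative):

```go
package main

import (
	"fmt"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

// List+watch Nodes and block until the initial LIST has populated the
// cache, the moment logged as "Caches populated for *v1.Node".
func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactory(cs, 0)
	nodes := factory.Core().V1().Nodes().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)

	if !cache.WaitForCacheSync(stop, nodes.HasSynced) {
		panic("cache never synced")
	}
	fmt.Println("Caches populated for *v1.Node")
}
```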
pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.994538 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.994632 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.994730 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.994784 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:57 crc kubenswrapper[4756]: E1205 16:15:57.994927 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.994955 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.995019 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:15:57 crc kubenswrapper[4756]: E1205 16:15:57.995092 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:15:57 crc kubenswrapper[4756]: E1205 16:15:57.995169 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.996391 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.997066 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.997387 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.997576 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.998237 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 16:15:57 crc kubenswrapper[4756]: I1205 16:15:57.998542 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.006825 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.006829 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.007203 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.034911 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
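The status patch in the entry above is quoted twice over by the journal, which makes the embedded JSON nearly unreadable in place; the patch itself is rejected because the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 is not listening, consistent with the network-node-identity pod that serves it still being recreated at this point in the boot. A reader-side helper to unescape and pretty-print such a payload (the sample string is an abridged fragment of the patch above; paste a full one in its place):

```go
// patchdump.go - sketch: pretty-print the escaped status patch embedded in
// the "Failed to update status for pod" entries. Reader-side tooling only.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strings"
)

func main() {
	// Abridged fragment of the patch above, with the journal's \\\"
	// escaping still in place (three backslashes per JSON quote).
	raw := `{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"type\\\":\\\"Ready\\\",\\\"status\\\":\\\"False\\\"}]}}`
	// Undo the layered quoting the journal adds on top of the JSON.
	cooked := strings.ReplaceAll(raw, `\\\"`, `"`)
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(cooked), "", "  "); err != nil {
		fmt.Println("not valid JSON after unescaping:", err)
		return
	}
	fmt.Println(pretty.String())
}
```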
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.046089 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.057662 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.067614 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.083443 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.093518 4756 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.097401 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.100894 4756 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.100996 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.108522 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.117956 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148473 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148513 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148531 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148555 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148571 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148585 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148599 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148636 4756 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148654 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148669 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148686 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148703 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148720 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148734 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148752 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148767 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148781 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 
16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148796 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148828 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148851 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148869 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148942 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148957 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.148993 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149008 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149040 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149057 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149077 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149095 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149091 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149110 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149188 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149222 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149253 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149236 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149282 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149298 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149325 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149308 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149363 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149386 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149402 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149437 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149455 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149473 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149484 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149490 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149518 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149530 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149547 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149567 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149568 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149584 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149600 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149643 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149653 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149697 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149731 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149764 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149797 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149830 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149863 4756 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149907 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149952 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149998 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150031 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150063 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150095 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150126 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150173 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150214 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150247 4756 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150283 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150317 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150347 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150381 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150440 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149645 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149720 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149742 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149843 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.149866 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150028 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150124 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150094 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150224 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150258 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150341 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150348 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150450 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151198 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150463 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150599 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151222 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151264 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151298 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151330 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151361 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151391 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151450 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151482 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151513 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151546 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150654 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150745 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.150793 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151033 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151052 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151172 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151385 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151400 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151534 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151543 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151595 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151632 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151721 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151770 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.151820 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.152025 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.152219 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.152298 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.152361 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.152668 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.152921 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.153055 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.153702 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154168 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154278 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154300 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151718 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154313 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.151818 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154397 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154438 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154465 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154496 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154509 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154520 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154572 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154597 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154621 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154643 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154664 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154745 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154773 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154796 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154818 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154828 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154844 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154874 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154896 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154899 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154922 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154947 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154969 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.155915 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.155963 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156001 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156032 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156097 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156167 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156249 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156352 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156534 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156600 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156664 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156737 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156795 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156862 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156912 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156974 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157050 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157107 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157193 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157270 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157301 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157450 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157530 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157557 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157645 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157754 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157775 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
(UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157797 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157853 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157910 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158142 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158167 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158224 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158301 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158410 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158450 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158477 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158505 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158577 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158691 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158724 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158779 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158828 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158860 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.159093 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.159151 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 
16:15:58.159184 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.160033 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.160096 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.160157 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.154917 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.155574 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.155888 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.155950 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156137 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156235 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156323 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156745 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.156902 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157092 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157603 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157905 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.157919 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158035 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158259 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158538 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158568 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.158913 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.160785 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.159406 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.159851 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.159983 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.160855 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.160927 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.160972 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.160993 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.161006 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.161055 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.161032 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.161208 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.161331 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162164 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162203 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162224 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162249 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162276 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162298 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162316 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162333 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162291 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162349 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162470 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162528 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162564 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162599 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162635 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162671 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162706 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162740 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162774 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162805 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162838 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163135 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163203 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163402 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163549 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163688 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164475 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164525 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164554 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164581 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164606 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164629 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164654 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164678 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164703 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164725 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164751 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164777 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164800 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" 
(UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164824 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164848 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164879 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164903 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164930 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164982 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165011 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165035 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165060 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165093 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165124 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165149 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165178 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165207 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165236 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165271 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165295 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 
16:15:58.165322 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165630 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162472 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162695 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.162832 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163014 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163053 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163791 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.163903 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164681 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.164917 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165134 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165172 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165217 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165358 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165430 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165661 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.165758 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.166085 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.166275 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.167389 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.167536 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.167549 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.168160 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.168190 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.169556 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.169586 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.170657 4756 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.170958 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.170965 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.171215 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.171312 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.171702 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.171779 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:15:58.671755961 +0000 UTC m=+18.910429493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.171495 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.171829 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.171951 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.172084 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.172112 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.172224 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.172350 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.172942 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.172975 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.173294 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.173386 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.173779 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.173863 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.173926 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.174648 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:15:58.674625555 +0000 UTC m=+18.913299087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.172104 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.173719 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174266 4756 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174735 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174751 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174768 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174783 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174797 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174812 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174826 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174841 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174855 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174868 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174881 4756 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174894 4756 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174908 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174921 4756 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174933 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174946 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174329 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174973 4756 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174986 4756 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.174300 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175035 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175059 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175074 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175094 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175109 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175123 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175136 4756 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175151 4756 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175165 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175179 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175192 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175207 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175222 4756 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175236 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175250 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175267 4756 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.175280 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.176341 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.176588 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.177954 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.178509 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961" exitCode=255 Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.178554 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961"} Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.181139 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.181231 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.181503 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.181642 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.182175 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.184071 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.184276 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.184971 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185223 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185331 4756 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185407 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185547 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185653 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185715 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185809 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185870 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.185961 4756 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186114 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186217 4756 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186298 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186365 4756 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186633 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186691 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186755 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186860 4756 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186926 4756 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.186982 4756 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.187040 4756 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.187097 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.187153 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.187207 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.187382 4756 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.188181 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.188280 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.188399 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189201 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.187428 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.187805 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.189275 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.189293 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189242 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189347 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.187832 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.189361 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:15:58.689342775 +0000 UTC m=+18.928016287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189586 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.189597 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189597 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.189617 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.188005 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.188309 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.187899 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189600 4756 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189189 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.188004 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.189691 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 16:15:58.689665872 +0000 UTC m=+18.928339384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189718 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189779 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189798 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189814 4756 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189826 4756 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189837 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189850 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189862 4756 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189873 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189885 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189897 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189910 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") 
on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189922 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189935 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189949 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189962 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.189973 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.190235 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.191051 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.191542 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.191556 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.191729 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.191821 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.197594 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.204804 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.206916 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.211539 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.218829 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.219399 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.221059 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.221876 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.221973 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.221987 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.224963 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.225382 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.225765 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.226131 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.226251 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.226476 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.226842 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.226964 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.227287 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.227584 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.228171 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.228357 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.229019 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.229033 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.229526 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.229577 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.229707 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.229922 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.229569 4756 scope.go:117] "RemoveContainer" containerID="fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.230259 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.230286 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.230343 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.231231 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.232201 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.233208 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.233256 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.233611 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.233763 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.233763 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.234115 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.234165 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.234607 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.234751 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.234925 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.234943 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.235459 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.235849 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.238767 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.241065 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.250135 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.251716 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.251694 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.264002 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.264925 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.275518 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.285327 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290525 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290582 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290618 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290628 4756 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290637 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290646 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290654 4756 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290663 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290673 4756 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc 
kubenswrapper[4756]: I1205 16:15:58.290682 4756 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290690 4756 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290706 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290716 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290724 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290732 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290739 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290747 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290756 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290765 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290774 4756 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290693 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290783 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290843 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290856 4756 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290869 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290884 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290897 4756 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290908 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290919 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290816 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290931 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290971 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290982 4756 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.290994 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291002 4756 
reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291010 4756 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291019 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291027 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291036 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291044 4756 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291052 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291061 4756 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291069 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291077 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291084 4756 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291093 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291101 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291110 4756 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291119 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291129 4756 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291137 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291147 4756 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291158 4756 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291166 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291175 4756 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291183 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291191 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291200 4756 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291209 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291217 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291226 4756 
reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291236 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291248 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291259 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291268 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291280 4756 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291290 4756 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291301 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291312 4756 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291322 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291330 4756 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291338 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291346 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291354 4756 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291362 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291370 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291378 4756 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291386 4756 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291445 4756 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291456 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291464 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291474 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291482 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291492 4756 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291501 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291524 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291533 4756 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291542 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291550 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291559 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291567 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291576 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291584 4756 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291592 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291599 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291607 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291615 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291623 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291631 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291639 4756 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291647 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291655 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291663 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291671 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291680 4756 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291688 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291696 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291704 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291712 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291720 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291728 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291736 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.291746 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.309886 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.314962 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.321031 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 16:15:58 crc kubenswrapper[4756]: W1205 16:15:58.330870 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-d9899d4da9e93248825e0dd93dd480250a584934923de62bf2d9af80e1c1589b WatchSource:0}: Error finding container d9899d4da9e93248825e0dd93dd480250a584934923de62bf2d9af80e1c1589b: Status 404 returned error can't find the container with id d9899d4da9e93248825e0dd93dd480250a584934923de62bf2d9af80e1c1589b Dec 05 16:15:58 crc kubenswrapper[4756]: W1205 16:15:58.341242 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ed7dd8b47c5edb0fe98a7ae306fde3e8aee64c76eeb278e5df747050befed389 WatchSource:0}: Error finding container ed7dd8b47c5edb0fe98a7ae306fde3e8aee64c76eeb278e5df747050befed389: Status 404 returned error can't find the container with id ed7dd8b47c5edb0fe98a7ae306fde3e8aee64c76eeb278e5df747050befed389 Dec 05 16:15:58 crc kubenswrapper[4756]: W1205 16:15:58.349845 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-94837feb7e380ed10093924eb5227d231757c0d519dc7af51588615f937246c1 WatchSource:0}: Error finding container 94837feb7e380ed10093924eb5227d231757c0d519dc7af51588615f937246c1: Status 404 returned error can't find the container with id 94837feb7e380ed10093924eb5227d231757c0d519dc7af51588615f937246c1 Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.694701 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.695050 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.695073 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.695098 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:15:58 crc kubenswrapper[4756]: I1205 16:15:58.695117 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695210 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695257 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:15:59.695244357 +0000 UTC m=+19.933917869 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695654 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:15:59.695646196 +0000 UTC m=+19.934319708 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695685 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695704 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:15:59.695699267 +0000 UTC m=+19.934372779 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695752 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695763 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695772 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695792 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:15:59.695786809 +0000 UTC m=+19.934460321 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695825 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695834 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695841 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:58 crc kubenswrapper[4756]: E1205 16:15:58.695857 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 16:15:59.695852601 +0000 UTC m=+19.934526103 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.208613 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85"} Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.208790 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d9899d4da9e93248825e0dd93dd480250a584934923de62bf2d9af80e1c1589b"} Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.217712 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.222143 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723"} Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.223450 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.240889 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"94837feb7e380ed10093924eb5227d231757c0d519dc7af51588615f937246c1"} Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.244399 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c"} Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.244465 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe"} Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.244475 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ed7dd8b47c5edb0fe98a7ae306fde3e8aee64c76eeb278e5df747050befed389"} Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.247175 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.262517 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.298591 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.313783 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.329836 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.344177 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05
T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.355959 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.370573 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.382717 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.399546 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.422925 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.460335 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.483646 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.497515 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:15:59Z is after 2025-08-24T17:21:41Z" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.704723 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.704875 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.704950 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:16:01.704911605 +0000 UTC m=+21.943585157 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705002 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.705020 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705079 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:01.705056858 +0000 UTC m=+21.943730410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.705112 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:15:59 crc kubenswrapper[4756]: I1205 16:15:59.705157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705219 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705255 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705276 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705290 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705320 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705333 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:01.705315164 +0000 UTC m=+21.943988706 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705341 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705392 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:01.705379875 +0000 UTC m=+21.944053427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705292 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:15:59 crc kubenswrapper[4756]: E1205 16:15:59.705476 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:01.705460157 +0000 UTC m=+21.944133709 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.048960 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.049012 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:00 crc kubenswrapper[4756]: E1205 16:16:00.049086 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.048959 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:00 crc kubenswrapper[4756]: E1205 16:16:00.049220 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:00 crc kubenswrapper[4756]: E1205 16:16:00.049325 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.053850 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.055040 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.057305 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.058353 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.059040 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.059690 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.060399 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.060991 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.061721 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.062337 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.063035 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.063887 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.064573 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.065244 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.065244 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-synce
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.065936 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.066636 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.067402 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.069305 4756 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.070303 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.071006 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.071619 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.072239 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.072778 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.073589 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.074044 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.074767 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.075531 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.076040 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.076812 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.077467 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.077999 4756 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.078139 4756 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.083149 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.083788 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.084253 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.086112 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.086816 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.087306 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.088755 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.089843 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.090611 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.091560 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.092244 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.093305 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.095039 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.095558 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.096146 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.097255 4756 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.098640 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.099158 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.099696 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.100616 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.101165 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.102227 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.102956 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.110293 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.127900 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.149295 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.166126 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.208151 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.947923 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.950580 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.950636 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.950648 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.950717 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.960260 4756 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.960575 4756 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.961976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.962018 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.962036 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.962059 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.962077 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:00Z","lastTransitionTime":"2025-12-05T16:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:00 crc kubenswrapper[4756]: E1205 16:16:00.983298 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.988176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.988214 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.988224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.988243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:00 crc kubenswrapper[4756]: I1205 16:16:00.988256 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:00Z","lastTransitionTime":"2025-12-05T16:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.006480 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.010516 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.010560 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.010571 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.010589 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.010602 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.028330 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.032686 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.032894 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.032986 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.033101 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.033180 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.051257 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.055547 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.055582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.055593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.055609 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.055623 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.070079 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.070240 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.072363 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.072407 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.072452 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.072484 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.072501 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.175385 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.175452 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.175465 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.175481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.175492 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.252411 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f"} Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.271188 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.278827 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.278881 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.278900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.278922 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.278938 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.289958 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.302448 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.316226 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.331361 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.351546 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.368527 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:01Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.382002 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.382056 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.382074 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.382097 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.382115 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.484796 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.484853 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.484869 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.484896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.484914 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.587636 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.587704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.587729 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.587758 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.587781 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.690602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.690648 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.690670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.690698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.690718 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.722262 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.722344 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.722377 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.722402 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.722465 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722528 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722593 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722608 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722607 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722686 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722698 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722710 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722619 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722611 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:16:05.722573089 +0000 UTC m=+25.961246641 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722917 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:05.722885657 +0000 UTC m=+25.961559209 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722947 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:05.722929709 +0000 UTC m=+25.961603261 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.722989 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:05.72297276 +0000 UTC m=+25.961646312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:01 crc kubenswrapper[4756]: E1205 16:16:01.723025 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:05.723009571 +0000 UTC m=+25.961683183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.793954 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.793998 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.794009 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.794025 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.794038 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.897466 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.897509 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.897521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.897537 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:01 crc kubenswrapper[4756]: I1205 16:16:01.897549 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:01Z","lastTransitionTime":"2025-12-05T16:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.000375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.000455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.000470 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.000488 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.000499 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.048638 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.048693 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:16:02 crc kubenswrapper[4756]: E1205 16:16:02.048824 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.048855 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:02 crc kubenswrapper[4756]: E1205 16:16:02.048986 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:16:02 crc kubenswrapper[4756]: E1205 16:16:02.049126 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.102931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.102974 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.102985 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.103000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.103013 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.205392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.205458 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.205469 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.205487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.205498 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.308455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.308509 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.308523 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.308543 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.308579 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.412034 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.412070 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.412079 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.412096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.412105 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.514782 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.514829 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.514845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.514872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.514916 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.617676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.617758 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.617780 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.617806 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.617823 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.720317 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.720354 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.720364 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.720379 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.720396 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.796043 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.808248 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.815642 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.817539 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.822999 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.823065 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.823083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.823109 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:02 crc 
kubenswrapper[4756]: I1205 16:16:02.823123 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.832667 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.849216 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.872398 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.890473 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.917451 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.925944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.926018 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.926031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.926052 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.926064 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:02Z","lastTransitionTime":"2025-12-05T16:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.935386 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.952702 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.968627 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.983218 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:02 crc kubenswrapper[4756]: I1205 16:16:02.997106 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.015176 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.029575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.029617 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.029628 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.029645 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.029655 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.030227 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.047243 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.071752 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.132023 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.132056 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.132067 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.132083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.132117 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.234823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.234889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.234927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.234960 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.234984 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: E1205 16:16:03.270891 4756 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.337654 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.337690 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.337699 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.337713 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.337721 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.440941 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.441009 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.441035 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.441067 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.441087 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.544533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.544581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.544595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.544616 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.544632 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.646944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.646991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.647003 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.647018 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.647028 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.749532 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.749592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.749605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.749629 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.749644 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.851659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.851877 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.851969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.852069 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.852170 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.954564 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.954867 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.954939 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.955000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:03 crc kubenswrapper[4756]: I1205 16:16:03.955066 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:03Z","lastTransitionTime":"2025-12-05T16:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.040400 4756 csr.go:261] certificate signing request csr-jxvd4 is approved, waiting to be issued Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.048857 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.048930 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:04 crc kubenswrapper[4756]: E1205 16:16:04.049027 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:04 crc kubenswrapper[4756]: E1205 16:16:04.049227 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.048857 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:04 crc kubenswrapper[4756]: E1205 16:16:04.049401 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.057440 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.057474 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.057483 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.057499 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.057511 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.068621 4756 csr.go:257] certificate signing request csr-jxvd4 is issued Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.161145 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.161183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.161193 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.161213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.161227 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.263351 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.263390 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.263399 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.263500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.263514 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.366141 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.366370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.366480 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.366547 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.366621 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.469320 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.469356 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.469370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.469394 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.469406 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.571677 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.571766 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.571778 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.571798 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.571810 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.674689 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.674737 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.674749 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.674768 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.674780 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.777513 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.777581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.777595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.777620 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.777702 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.880035 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.880083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.880096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.880114 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.880125 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
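Has your network provider started?"}

Each of these ~100 ms bursts is the node-status update loop re-recording the same four node events and re-setting the Ready condition to False with reason KubeletNotReady. The condition={...} payload is a serialized corev1.NodeCondition; the sketch below rebuilds one with the same shape (illustrative only, not the kubelet's setter):

```go
// Sketch: reconstruct the Ready=False condition object behind the
// setters.go:603 "Node became not ready" entries, using the public
// k8s.io/api/core/v1 types.
package main

import (
	"encoding/json"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	now := metav1.NewTime(time.Now())
	cond := corev1.NodeCondition{
		Type:               corev1.NodeReady,
		Status:             corev1.ConditionFalse,
		LastHeartbeatTime:  now,
		LastTransitionTime: now,
		Reason:             "KubeletNotReady",
		Message:            "container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady",
	}
	b, _ := json.Marshal(cond)
	fmt.Println(string(b))
}
```

Marshaling it yields the same {"type":"Ready","status":"False",...} JSON seen in the log lines.
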
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.982380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.982444 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.982455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.982473 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.982484 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:04Z","lastTransitionTime":"2025-12-05T16:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.985770 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jdgc8"]
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.986253 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zrvcn"]
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.986405 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-fcqf4"]
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.986470 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdgc8"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.986513 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.986576 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fcqf4"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.987983 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.988352 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.988685 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.989295 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.991067 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rr8pw"]
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.991402 4756 util.go:30] "No sandbox for pod can be found. 
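Need to start a new one" pod="openshift-dns/node-resolver-rr8pw"

The "SyncLoop ADD" entries hand four freshly watched pods to the sync loop, and the reflector.go:368 lines show informer caches filling with the ConfigMaps and Secrets those pods mount. A sketch of broadly similar client-go machinery, scoped to openshift-multus as in the log; the resync period and event handler are assumptions:

```go
// Sketch: a shared informer whose cache-sync milestone corresponds to the
// "Caches populated" reflector lines.
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// Watch ConfigMaps in openshift-multus, the namespace of the
	// "cni-copy-resources" and "multus-daemon-config" objects above.
	factory := informers.NewSharedInformerFactoryWithOptions(cs, 10*time.Minute,
		informers.WithNamespace("openshift-multus"))
	inf := factory.Core().V1().ConfigMaps().Informer()
	inf.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			cm := obj.(*corev1.ConfigMap)
			fmt.Println("cache populated with ConfigMap:", cm.Name)
		},
	})

	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()
	factory.Start(ctx.Done())
	cache.WaitForCacheSync(ctx.Done(), inf.HasSynced)
	time.Sleep(time.Second) // let the async Add handlers print before exit
}
```
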
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.991792 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.992518 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.992625 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.992666 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.992814 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.992993 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.993001 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.993215 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.993569 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.993869 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 05 16:16:04 crc kubenswrapper[4756]: I1205 16:16:04.994540 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.006610 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.029390 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.049301 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.053704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-cni-multus\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.053937 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-kubelet\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054062 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-conf-dir\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054166 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/c0cb6a2e-b002-460c-8633-46ab5fbe9668-hosts-file\") pod \"node-resolver-rr8pw\" (UID: \"c0cb6a2e-b002-460c-8633-46ab5fbe9668\") " pod="openshift-dns/node-resolver-rr8pw" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054251 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-system-cni-dir\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054381 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb26e42b-df49-418d-8be1-8a0704cd5dfd-cni-binary-copy\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054495 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-k8s-cni-cncf-io\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054591 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzp58\" (UniqueName: \"kubernetes.io/projected/2067f3da-c088-40d8-bc08-1eb971992c56-kube-api-access-nzp58\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054786 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054891 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-socket-dir-parent\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.054985 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-cni-bin\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2067f3da-c088-40d8-bc08-1eb971992c56-proxy-tls\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055166 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-netns\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055252 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-etc-kubernetes\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055345 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055462 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2067f3da-c088-40d8-bc08-1eb971992c56-rootfs\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055548 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-os-release\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055626 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-hostroot\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055699 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-multus-certs\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055779 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2067f3da-c088-40d8-bc08-1eb971992c56-mcd-auth-proxy-config\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055849 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-system-cni-dir\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc 
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.055992 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-daemon-config\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.056070 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-cnibin\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.056140 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlbvq\" (UniqueName: \"kubernetes.io/projected/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-kube-api-access-tlbvq\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.056214 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqdz\" (UniqueName: \"kubernetes.io/projected/fb26e42b-df49-418d-8be1-8a0704cd5dfd-kube-api-access-qbqdz\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.056289 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cnibin\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.056363 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6nwc\" (UniqueName: \"kubernetes.io/projected/c0cb6a2e-b002-460c-8633-46ab5fbe9668-kube-api-access-s6nwc\") pod \"node-resolver-rr8pw\" (UID: \"c0cb6a2e-b002-460c-8633-46ab5fbe9668\") " pod="openshift-dns/node-resolver-rr8pw"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.056467 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-os-release\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.056559 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: 
\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.065910 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.069799 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-05 16:11:04 +0000 UTC, rotation deadline is 2026-09-17 10:18:35.407464592 +0000 UTC Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.069836 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6858h2m30.337631011s for next certificate rotation Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.077178 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.085811 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.085854 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.085864 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.085883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.085897 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.091025 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.103215 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.104788 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.108200 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.112816 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.121633 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.134613 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.151959 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157702 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2067f3da-c088-40d8-bc08-1eb971992c56-rootfs\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157738 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-os-release\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157765 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-hostroot\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157787 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-multus-certs\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157825 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2067f3da-c088-40d8-bc08-1eb971992c56-mcd-auth-proxy-config\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157850 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-system-cni-dir\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157872 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-cni-dir\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157896 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-daemon-config\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157919 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlbvq\" (UniqueName: \"kubernetes.io/projected/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-kube-api-access-tlbvq\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157925 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-os-release\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-cnibin\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157971 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqdz\" (UniqueName: 
\"kubernetes.io/projected/fb26e42b-df49-418d-8be1-8a0704cd5dfd-kube-api-access-qbqdz\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.157993 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cnibin\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158001 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2067f3da-c088-40d8-bc08-1eb971992c56-rootfs\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158012 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-os-release\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158034 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158058 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6nwc\" (UniqueName: \"kubernetes.io/projected/c0cb6a2e-b002-460c-8633-46ab5fbe9668-kube-api-access-s6nwc\") pod \"node-resolver-rr8pw\" (UID: \"c0cb6a2e-b002-460c-8633-46ab5fbe9668\") " pod="openshift-dns/node-resolver-rr8pw" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158082 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-cni-multus\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158119 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-kubelet\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158142 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-conf-dir\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158156 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-cni-dir\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158163 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c0cb6a2e-b002-460c-8633-46ab5fbe9668-hosts-file\") pod \"node-resolver-rr8pw\" (UID: \"c0cb6a2e-b002-460c-8633-46ab5fbe9668\") " pod="openshift-dns/node-resolver-rr8pw" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158187 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-system-cni-dir\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158216 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-multus-certs\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158220 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb26e42b-df49-418d-8be1-8a0704cd5dfd-cni-binary-copy\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158266 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-k8s-cni-cncf-io\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158294 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzp58\" (UniqueName: \"kubernetes.io/projected/2067f3da-c088-40d8-bc08-1eb971992c56-kube-api-access-nzp58\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158317 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-cni-bin\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158340 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2067f3da-c088-40d8-bc08-1eb971992c56-proxy-tls\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158362 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: 
\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158400 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-socket-dir-parent\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158447 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-netns\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158472 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-etc-kubernetes\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158495 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158589 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-cnibin\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158628 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cnibin\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158725 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-cni-multus\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158786 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-kubelet\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158713 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-conf-dir\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158929 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-system-cni-dir\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158987 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c0cb6a2e-b002-460c-8633-46ab5fbe9668-hosts-file\") pod \"node-resolver-rr8pw\" (UID: \"c0cb6a2e-b002-460c-8633-46ab5fbe9668\") " pod="openshift-dns/node-resolver-rr8pw" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158995 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-os-release\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.158188 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-hostroot\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159032 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-var-lib-cni-bin\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159044 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-system-cni-dir\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159060 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-k8s-cni-cncf-io\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159119 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-host-run-netns\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159128 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-socket-dir-parent\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159176 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb26e42b-df49-418d-8be1-8a0704cd5dfd-etc-kubernetes\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " 
pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159169 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159830 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159835 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fb26e42b-df49-418d-8be1-8a0704cd5dfd-multus-daemon-config\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159890 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2067f3da-c088-40d8-bc08-1eb971992c56-mcd-auth-proxy-config\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.159923 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb26e42b-df49-418d-8be1-8a0704cd5dfd-cni-binary-copy\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.160027 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.165812 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2067f3da-c088-40d8-bc08-1eb971992c56-proxy-tls\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.167690 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.173948 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzp58\" (UniqueName: \"kubernetes.io/projected/2067f3da-c088-40d8-bc08-1eb971992c56-kube-api-access-nzp58\") pod \"machine-config-daemon-zrvcn\" (UID: \"2067f3da-c088-40d8-bc08-1eb971992c56\") " pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.176352 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlbvq\" (UniqueName: \"kubernetes.io/projected/1d6a0ce5-2436-4b0a-859e-b6a6bb896271-kube-api-access-tlbvq\") pod \"multus-additional-cni-plugins-jdgc8\" (UID: \"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\") " pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.179478 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6nwc\" (UniqueName: \"kubernetes.io/projected/c0cb6a2e-b002-460c-8633-46ab5fbe9668-kube-api-access-s6nwc\") pod \"node-resolver-rr8pw\" (UID: \"c0cb6a2e-b002-460c-8633-46ab5fbe9668\") " pod="openshift-dns/node-resolver-rr8pw" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.182178 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqdz\" (UniqueName: \"kubernetes.io/projected/fb26e42b-df49-418d-8be1-8a0704cd5dfd-kube-api-access-qbqdz\") pod \"multus-fcqf4\" (UID: \"fb26e42b-df49-418d-8be1-8a0704cd5dfd\") " pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.183168 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.188851 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.188903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.188917 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.188937 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.188949 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.196092 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.206804 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.218441 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.234045 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.243457 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.261527 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.268737 4756 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.274483 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.287242 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.290842 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.290879 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.290890 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.290906 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.290917 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.299168 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.303363 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.312630 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-pl
ugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a7
14c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: W1205 16:16:05.313804 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6a0ce5_2436_4b0a_859e_b6a6bb896271.slice/crio-a07298a09cc989178449754437add2bc7cd3fadd6906e72c99d74c84cb7b2b80 WatchSource:0}: Error finding container a07298a09cc989178449754437add2bc7cd3fadd6906e72c99d74c84cb7b2b80: Status 404 returned error can't find the container with id a07298a09cc989178449754437add2bc7cd3fadd6906e72c99d74c84cb7b2b80 Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.314066 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fcqf4" Dec 05 16:16:05 crc kubenswrapper[4756]: W1205 16:16:05.325591 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb26e42b_df49_418d_8be1_8a0704cd5dfd.slice/crio-bd95759c4e067196d0b7b7915cfe2c9f45f12d99a16f3d26a0c986bf99cbdc0f WatchSource:0}: Error finding container bd95759c4e067196d0b7b7915cfe2c9f45f12d99a16f3d26a0c986bf99cbdc0f: Status 404 returned error can't find the container with id bd95759c4e067196d0b7b7915cfe2c9f45f12d99a16f3d26a0c986bf99cbdc0f Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.327235 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.336783 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rr8pw" Dec 05 16:16:05 crc kubenswrapper[4756]: W1205 16:16:05.360398 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0cb6a2e_b002_460c_8633_46ab5fbe9668.slice/crio-ce122f51a27194e2a37fef67bbb83a4f839ee6beedfd03b80e163f44abda43cd WatchSource:0}: Error finding container ce122f51a27194e2a37fef67bbb83a4f839ee6beedfd03b80e163f44abda43cd: Status 404 returned error can't find the container with id ce122f51a27194e2a37fef67bbb83a4f839ee6beedfd03b80e163f44abda43cd Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.361068 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pz5rs"] Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.362274 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.365070 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.365710 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.365766 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.365894 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.365913 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.365970 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.368381 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.379122 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.392991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.393014 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.393022 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.393038 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.393048 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.394927 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.407294 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.419644 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.432089 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460293 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460365 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-systemd-units\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460468 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-etc-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460484 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-log-socket\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460500 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-bin\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460528 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc 
kubenswrapper[4756]: I1205 16:16:05.460550 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-ovn\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460565 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-config\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460591 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l5kc\" (UniqueName: \"kubernetes.io/projected/b75aef14-f06d-44ec-8a2d-28d9661b1001-kube-api-access-6l5kc\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460613 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-netd\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460702 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-kubelet\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460765 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-netns\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460782 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-systemd\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460830 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-node-log\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460849 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-script-lib\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460868 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-var-lib-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460926 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.460998 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-env-overrides\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.461019 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-slash\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.461109 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-ovn-kubernetes\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.461156 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovn-node-metrics-cert\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.477875 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.504144 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.504182 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.504195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.504213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.504224 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.516766 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.543802 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564666 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564700 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-ovn\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564721 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-config\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564756 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l5kc\" (UniqueName: \"kubernetes.io/projected/b75aef14-f06d-44ec-8a2d-28d9661b1001-kube-api-access-6l5kc\") pod \"ovnkube-node-pz5rs\" (UID: 
\"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564786 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-netd\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564790 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564804 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-node-log\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564843 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-node-log\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564856 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-kubelet\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564873 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-netns\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564884 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-ovn\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564909 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-kubelet\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564911 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-systemd\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564945 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-netns\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564966 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-netd\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.564887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-systemd\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565198 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-script-lib\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565217 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-var-lib-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565233 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565267 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-env-overrides\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565281 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-slash\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565308 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovn-node-metrics-cert\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565326 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-ovn-kubernetes\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565348 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-systemd-units\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565367 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-etc-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565381 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-log-socket\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565397 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-bin\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565459 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-bin\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565628 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-config\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565941 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-script-lib\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565974 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-ovn-kubernetes\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.565996 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-systemd-units\") 
pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.566018 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-etc-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.566037 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-log-socket\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.566057 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.566077 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-var-lib-openvswitch\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.566391 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-env-overrides\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.566437 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-slash\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.571022 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.576621 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovn-node-metrics-cert\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.585813 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.586362 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l5kc\" (UniqueName: \"kubernetes.io/projected/b75aef14-f06d-44ec-8a2d-28d9661b1001-kube-api-access-6l5kc\") pod \"ovnkube-node-pz5rs\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:05 crc kubenswrapper[4756]: 
I1205 16:16:05.599145 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.607004 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.607036 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.607044 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.607059 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.607069 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.612441 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.633763 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:05Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.693038 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.709813 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.709861 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.709872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.709892 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.709908 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:05 crc kubenswrapper[4756]: W1205 16:16:05.760126 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb75aef14_f06d_44ec_8a2d_28d9661b1001.slice/crio-a1ecf2925c03484c8de154e4d997146fbca06b5be684279eeb3d456cc0c05cfc WatchSource:0}: Error finding container a1ecf2925c03484c8de154e4d997146fbca06b5be684279eeb3d456cc0c05cfc: Status 404 returned error can't find the container with id a1ecf2925c03484c8de154e4d997146fbca06b5be684279eeb3d456cc0c05cfc
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.767408 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767483 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:16:13.767467434 +0000 UTC m=+34.006140946 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.767577 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.767611 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.767644 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.767675 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767722 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767734 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767744 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767755 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767776 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:13.767766491 +0000 UTC m=+34.006440003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767793 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:13.767781892 +0000 UTC m=+34.006455404 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767852 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767866 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767875 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767903 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:13.767893674 +0000 UTC m=+34.006567186 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767955 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 16:16:05 crc kubenswrapper[4756]: E1205 16:16:05.767982 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:13.767974067 +0000 UTC m=+34.006647579 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.812048 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.812082 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.812092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.812107 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.812118 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.914681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.915315 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.915334 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.915358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:05 crc kubenswrapper[4756]: I1205 16:16:05.915372 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:05Z","lastTransitionTime":"2025-12-05T16:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.019580 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.019607 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.019615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.019635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.019643 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.048917 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.049014 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.049102 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:06 crc kubenswrapper[4756]: E1205 16:16:06.049151 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:16:06 crc kubenswrapper[4756]: E1205 16:16:06.049229 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:16:06 crc kubenswrapper[4756]: E1205 16:16:06.049339 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.122570 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.122609 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.122621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.122639 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.122654 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.225757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.225812 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.225827 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.225849 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.225861 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.267922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.267976 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.267990 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"3463dc49f9e00b84e4bdc7ddabcd4fad41d0e453d4c56383afdcbec3c065c08b"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.269475 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcqf4" event={"ID":"fb26e42b-df49-418d-8be1-8a0704cd5dfd","Type":"ContainerStarted","Data":"91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.269527 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcqf4" event={"ID":"fb26e42b-df49-418d-8be1-8a0704cd5dfd","Type":"ContainerStarted","Data":"bd95759c4e067196d0b7b7915cfe2c9f45f12d99a16f3d26a0c986bf99cbdc0f"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.271094 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d6a0ce5-2436-4b0a-859e-b6a6bb896271" containerID="a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0" exitCode=0
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.271180 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerDied","Data":"a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.271209 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerStarted","Data":"a07298a09cc989178449754437add2bc7cd3fadd6906e72c99d74c84cb7b2b80"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.272622 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a" exitCode=0
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.272683 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.272704 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"a1ecf2925c03484c8de154e4d997146fbca06b5be684279eeb3d456cc0c05cfc"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.274812 4756 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rr8pw" event={"ID":"c0cb6a2e-b002-460c-8633-46ab5fbe9668","Type":"ContainerStarted","Data":"b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f"} Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.274846 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rr8pw" event={"ID":"c0cb6a2e-b002-460c-8633-46ab5fbe9668","Type":"ContainerStarted","Data":"ce122f51a27194e2a37fef67bbb83a4f839ee6beedfd03b80e163f44abda43cd"} Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.298088 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.315399 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.329159 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.329197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.329209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.329226 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.329238 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.338886 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.353360 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.367361 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.382969 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.396216 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.411572 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.425803 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.431371 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.431408 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.431433 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.431447 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.431458 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.448019 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.460926 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.480691 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.492210 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.518033 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.532802 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.534806 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.534837 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.534846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.534864 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.534876 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.546160 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.558129 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.571593 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc 
kubenswrapper[4756]: I1205 16:16:06.581255 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.600479 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.611406 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.629797 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z 
is after 2025-08-24T17:21:41Z"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.637001 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.637024 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.637031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.637044 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.637052 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.644694 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.656179 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.666457 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.678637 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.690802 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.702147 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:06Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.738883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.738915 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.738924 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.738937 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.738946 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.841180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.841294 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.841304 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.841323 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.841338 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.943565 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.943615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.943624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.943641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:06 crc kubenswrapper[4756]: I1205 16:16:06.943652 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:06Z","lastTransitionTime":"2025-12-05T16:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.046873 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.046920 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.046960 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.046979 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.046993 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.149600 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.149647 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.149660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.149679 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.149691 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.254576 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.254930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.254942 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.254960 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.254971 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.285621 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.285670 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.285686 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.291346 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerStarted","Data":"0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.304605 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.319358 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.339777 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.354997 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.356322 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.356367 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.356382 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.356400 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.356441 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.370788 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.408045 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.445338 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.459201 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.459452 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.459531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.459598 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.459657 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.466390 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.479106 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.495889 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z 
is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.511319 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.525456 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.537928 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.554907 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.561638 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bz9nh"] Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.562181 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.562477 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.562522 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.562532 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.562555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.562568 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.565103 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.565240 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.565558 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.566193 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.583071 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.584441 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzf9z\" (UniqueName: \"kubernetes.io/projected/13fd360e-8db2-431e-bc4a-cdd7938b0da6-kube-api-access-nzf9z\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.584506 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13fd360e-8db2-431e-bc4a-cdd7938b0da6-host\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.584534 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/13fd360e-8db2-431e-bc4a-cdd7938b0da6-serviceca\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.596051 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.608160 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.621387 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.632059 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.646102 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.665757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.665803 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.665813 4756 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.665831 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.665842 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.668377 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.679363 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.685577 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzf9z\" (UniqueName: \"kubernetes.io/projected/13fd360e-8db2-431e-bc4a-cdd7938b0da6-kube-api-access-nzf9z\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.685639 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13fd360e-8db2-431e-bc4a-cdd7938b0da6-host\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.685668 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/13fd360e-8db2-431e-bc4a-cdd7938b0da6-serviceca\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.685769 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13fd360e-8db2-431e-bc4a-cdd7938b0da6-host\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.687103 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/13fd360e-8db2-431e-bc4a-cdd7938b0da6-serviceca\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.691721 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.702365 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.704838 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzf9z\" (UniqueName: \"kubernetes.io/projected/13fd360e-8db2-431e-bc4a-cdd7938b0da6-kube-api-access-nzf9z\") pod \"node-ca-bz9nh\" (UID: \"13fd360e-8db2-431e-bc4a-cdd7938b0da6\") " pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.718744 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\
\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\
\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.729854 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.742048 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.752821 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.768125 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.768154 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.768163 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.768175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.768184 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.773035 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d906
7671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:07Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.870310 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.870346 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.870357 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.870376 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.870389 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.881669 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bz9nh" Dec 05 16:16:07 crc kubenswrapper[4756]: W1205 16:16:07.892495 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13fd360e_8db2_431e_bc4a_cdd7938b0da6.slice/crio-7a324eec56adde4d675d3688ef0a14a14494603ce930e8b16720a0dbde153927 WatchSource:0}: Error finding container 7a324eec56adde4d675d3688ef0a14a14494603ce930e8b16720a0dbde153927: Status 404 returned error can't find the container with id 7a324eec56adde4d675d3688ef0a14a14494603ce930e8b16720a0dbde153927 Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.972519 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.972559 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.972568 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.972582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:07 crc kubenswrapper[4756]: I1205 16:16:07.972595 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:07Z","lastTransitionTime":"2025-12-05T16:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.048911 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.048911 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.048927 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:08 crc kubenswrapper[4756]: E1205 16:16:08.049112 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:08 crc kubenswrapper[4756]: E1205 16:16:08.049134 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:08 crc kubenswrapper[4756]: E1205 16:16:08.049160 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.074971 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.075012 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.075021 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.075037 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.075050 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.178969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.179019 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.179034 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.179053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.179068 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.282449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.282519 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.282530 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.282547 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.282557 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.296483 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bz9nh" event={"ID":"13fd360e-8db2-431e-bc4a-cdd7938b0da6","Type":"ContainerStarted","Data":"4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.296544 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bz9nh" event={"ID":"13fd360e-8db2-431e-bc4a-cdd7938b0da6","Type":"ContainerStarted","Data":"7a324eec56adde4d675d3688ef0a14a14494603ce930e8b16720a0dbde153927"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.300762 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.300815 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.300825 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.302788 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d6a0ce5-2436-4b0a-859e-b6a6bb896271" containerID="0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076" exitCode=0 Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.302829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerDied","Data":"0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.313898 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.328529 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.345283 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z 
is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.360536 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.374087 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.385357 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.385401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.385413 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.385457 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.385471 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.389154 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.405458 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.422502 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.436873 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.457536 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.473673 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.488354 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.488386 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.488394 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.488408 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.488435 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.492925 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.508536 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.530615 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.545477 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.562190 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 
16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.576026 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.590761 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.590805 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.590817 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.590838 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.590850 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.599937 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d906
7671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.614206 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.627057 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 
2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.638747 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.651437 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.663158 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.671852 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.693243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.693286 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.693296 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.693338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.693350 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.694693 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.709881 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.725656 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.741274 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.761476 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.774757 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:08Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.795652 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.795690 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.795701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.795717 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.795730 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.898107 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.898152 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.898167 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.898191 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:08 crc kubenswrapper[4756]: I1205 16:16:08.898207 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:08Z","lastTransitionTime":"2025-12-05T16:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.000708 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.000740 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.000749 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.000762 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.000771 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.103358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.103450 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.103481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.103504 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.103522 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.206209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.206271 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.206293 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.206321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.206341 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.307701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.307734 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.307745 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.307760 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.307770 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.308505 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d6a0ce5-2436-4b0a-859e-b6a6bb896271" containerID="66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b" exitCode=0 Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.308538 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerDied","Data":"66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.325912 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nz
p58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.344836 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.362751 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.381298 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.392529 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.410526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.410565 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.410575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.410589 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.410600 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.414388 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.427322 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.442004 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.458853 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.477662 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.503133 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z 
is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.513650 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.513703 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.513715 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.513735 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.513749 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.519958 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.534699 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.551589 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.567552 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:09Z is after 
2025-08-24T17:21:41Z" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.617657 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.617699 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.617710 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.617727 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.617738 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.720215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.720270 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.720283 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.720301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.720313 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.822546 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.822582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.822593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.822609 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.822621 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.900904 4756 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.928506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.928646 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.928670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.928700 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:09 crc kubenswrapper[4756]: I1205 16:16:09.928722 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:09Z","lastTransitionTime":"2025-12-05T16:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.031513 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.031556 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.031571 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.031592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.031608 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.050648 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.050662 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.050738 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:10 crc kubenswrapper[4756]: E1205 16:16:10.050815 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:10 crc kubenswrapper[4756]: E1205 16:16:10.050943 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:10 crc kubenswrapper[4756]: E1205 16:16:10.051015 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.084922 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.105582 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.130234 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.134205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.134382 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.134521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.134619 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.134721 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.146773 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.167720 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.194232 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.230457 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator
@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.237277 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.237321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.237333 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.237348 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.237360 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.246143 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.264464 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z 
is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.276023 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.286829 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
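The err="..." payloads in these entries (including the escaped block that follows) are kubelet strategic-merge status patches, serialized with Go %q quoting: each interior quote appears as \" (the triple-backslash form seen here suggests one more quoting layer added by whatever captured this log). To read one, undo the escaping and re-indent the JSON. A minimal Go sketch, assuming a payload captured with single-level \" escapes; the sample string is a hypothetical truncation of the network-check-target-xd92c patch above:

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strings"
)

func main() {
	// Hypothetical, truncated excerpt of one err="..." payload from the
	// kubelet log above, with single-level \" escapes.
	raw := `{\"metadata\":{\"uid\":\"3b6479f0-333b-4a96-9adf-2099afdc2447\"},\"status\":{\"podIP\":null,\"podIPs\":null}}`

	// Undo the quote escaping, then pretty-print the JSON patch.
	unescaped := strings.ReplaceAll(raw, `\"`, `"`)
	var buf bytes.Buffer
	if err := json.Indent(&buf, []byte(unescaped), "", "  "); err != nil {
		fmt.Println("not valid JSON:", err)
		return
	}
	fmt.Println(buf.String())
}

Fed a full payload instead of this excerpt, the same sketch prints the entire conditions/containerStatuses patch the kubelet was trying to apply.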
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.298383 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.309567 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.315711 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.318919 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d6a0ce5-2436-4b0a-859e-b6a6bb896271" containerID="9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05" exitCode=0 Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.318987 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerDied","Data":"9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.321286 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0
a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.330104 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.341399 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.341490 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.341505 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.341567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.341586 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.344290 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.359376 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.372385 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.382450 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.391067 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.401824 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.417378 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.431280 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.444175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.444207 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.444216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.444229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.444238 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.450316 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
5T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681
bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.462404 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.474295 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.485700 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.498693 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.544765 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.546252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.546282 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.546293 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.546308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.546319 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.563582 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d906
7671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.648948 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.648984 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.649007 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.649023 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.649034 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.751846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.751922 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.751946 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.751977 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.752002 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.855000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.855063 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.855080 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.855108 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.855127 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.957355 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.957401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.957411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.957449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:10 crc kubenswrapper[4756]: I1205 16:16:10.957461 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:10Z","lastTransitionTime":"2025-12-05T16:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.028378 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.051119 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPa
th\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"
192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.059434 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.059480 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.059496 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.059515 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.059531 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.063813 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.075427 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.089913 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.099163 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.108674 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.120396 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.129821 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.142903 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.156995 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.161596 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.161629 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.161641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.161658 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.161668 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.169649 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.182878 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.191625 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.208143 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de
681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.222944 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.230409 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.230489 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.230505 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.230526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.230542 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: E1205 16:16:11.245750 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.250030 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.250160 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.250242 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.250323 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.250397 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.269487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.269564 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.269584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.269610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.269631 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.292793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.292837 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.292846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.292861 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.292870 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: E1205 16:16:11.306677 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.310253 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.310316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.310342 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.310373 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.310406 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: E1205 16:16:11.326597 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: E1205 16:16:11.326752 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.327405 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
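Every retry in the loop above fails the same way until the kubelet gives up ("update node status exceeds retry count"): the serving certificate behind the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 expired on 2025-08-24, months before the node's clock of 2025-12-05, and the node independently reports no CNI config in /etc/kubernetes/cni/net.d/, which keeps Ready=False. A minimal way to confirm both findings from a shell on the node, assuming openssl is installed there; these commands are illustrative and are not part of the log:

  # Print the validity window of the webhook's serving certificate;
  # notAfter should read Aug 24 17:21:41 2025 GMT, matching the x509 error above.
  openssl s_client -connect 127.0.0.1:9743 </dev/null 2>/dev/null | openssl x509 -noout -dates

  # List the CNI config directory the kubelet reports as empty; it plausibly stays
  # empty while the ovnkube-node containers are still PodInitializing (see below).
  ls -l /etc/kubernetes/cni/net.d/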
pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerStarted","Data":"e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.328344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.328398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.328446 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.328470 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.328487 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.344973 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256
:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.364771 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.390808 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.408214 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.424797 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.436876 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.436914 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.436924 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.436939 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.436948 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.440395 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.452264 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.463918 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.475864 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.497575 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.512203 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.526127 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.539491 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.539528 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.539541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.539559 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.539571 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.540955 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.560158 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16
:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.595354 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:11Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.642117 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.642150 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.642158 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.642170 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.642179 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.744174 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.744225 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.744240 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.744262 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.744277 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.847304 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.847340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.847350 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.847366 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.847376 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.950323 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.950595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.950698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.950800 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:11 crc kubenswrapper[4756]: I1205 16:16:11.950873 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:11Z","lastTransitionTime":"2025-12-05T16:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.049865 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:12 crc kubenswrapper[4756]: E1205 16:16:12.049999 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.050381 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:12 crc kubenswrapper[4756]: E1205 16:16:12.050502 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.050526 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:12 crc kubenswrapper[4756]: E1205 16:16:12.050614 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.053797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.053885 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.053896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.053910 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.053919 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.156301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.156355 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.156366 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.156383 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.156395 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.259837 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.260224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.260238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.260251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.260260 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.338058 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.338828 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.338948 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.353252 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.354205 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d6a0ce5-2436-4b0a-859e-b6a6bb896271" containerID="e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058" exitCode=0 Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.354242 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerDied","Data":"e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.362487 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.364080 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.364154 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.364170 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.364201 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.364216 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.364100 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.368204 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.387851 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.398711 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.410055 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.421661 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.432681 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.443523 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.460271 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0
b092ce91c00322d8b890434e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.467210 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.467247 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.467256 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.467271 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.467280 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.474001 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.486527 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.497677 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.509393 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.521624 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.529863 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.538720 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.539137 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.549012 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.562236 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.569688 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.569719 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.569730 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.569747 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.569760 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.572079 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.592320 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
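
The "Node became not ready" condition a few entries above is a separate problem from the webhook failures: the kubelet reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/. Below is a rough Go sketch of that existence check, assuming the usual libcni file-extension convention (.conf, .conflist, .json); the kubelet's real readiness probe goes through the CRI runtime and is more involved.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path quoted in the NodeNotReady message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI dir:", err)
		return
	}
	found := 0
	for _, e := range entries {
		// libcni convention: network configs end in .conf, .conflist, or .json
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", e.Name())
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration file in", dir, "- network plugin not ready")
	}
}

On this node the directory stays empty until the network provider (here, ovnkube-node/multus) writes its config, so the condition clears on its own once those pods come up.
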
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.603791 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.616930 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.628517 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.645185 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.659959 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.672137 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.672179 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.672193 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.672213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.672228 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.679167 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.692137 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.715855 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.756992 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.774910 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.774951 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.774966 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.775031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.775048 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.798232 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:12Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.878010 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.878061 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.878076 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.878098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.878112 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.981096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.981175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.981196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.981227 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:12 crc kubenswrapper[4756]: I1205 16:16:12.981248 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:12Z","lastTransitionTime":"2025-12-05T16:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.084778 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.084850 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.084873 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.084899 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.084916 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.188309 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.188377 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.188402 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.188468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.188493 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.292121 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.292197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.292238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.292271 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.292294 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.363470 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d6a0ce5-2436-4b0a-859e-b6a6bb896271" containerID="2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51" exitCode=0 Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.363568 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerDied","Data":"2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51"} Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.398968 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.398994 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.399003 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.399017 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.399026 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.402029 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0
b092ce91c00322d8b890434e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.421731 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.439881 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.452494 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.466829 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.489668 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.502141 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.502182 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.502193 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.502211 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.502224 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.507746 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.521404 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.536800 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.551542 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.564877 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.585024 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.595571 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.605057 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.605101 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.605109 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.605122 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.605131 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.618070 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
5T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.632284 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:13Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.708022 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.708069 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.708083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.708100 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.708112 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.777752 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.777987 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:16:29.77795635 +0000 UTC m=+50.016629872 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.778390 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.778521 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.778773 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:29.778761101 +0000 UTC m=+50.017434713 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.778697 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.778893 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.778941 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779068 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779083 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779096 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779102 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779131 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:29.77912296 +0000 UTC m=+50.017796472 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779134 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779158 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779226 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:29.779203612 +0000 UTC m=+50.017877224 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.779943 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 16:16:13 crc kubenswrapper[4756]: E1205 16:16:13.780136 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:16:29.780114846 +0000 UTC m=+50.018788398 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.813645 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.813980 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.814142 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.814381 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.814668 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.917859 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.917898 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.917909 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.917925 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:13 crc kubenswrapper[4756]: I1205 16:16:13.917937 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:13Z","lastTransitionTime":"2025-12-05T16:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.020246 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.020283 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.020297 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.020317 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.020332 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.048855 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.048904 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.048872 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:14 crc kubenswrapper[4756]: E1205 16:16:14.049032 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:16:14 crc kubenswrapper[4756]: E1205 16:16:14.049088 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:16:14 crc kubenswrapper[4756]: E1205 16:16:14.049136 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.123428 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.123460 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.123468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.123481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.123490 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.226349 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.226381 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.226389 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.226403 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.226440 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.328053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.328108 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.328120 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.328138 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.328152 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.371283 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" event={"ID":"1d6a0ce5-2436-4b0a-859e-b6a6bb896271","Type":"ContainerStarted","Data":"c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63"}
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.385054 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.399541 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.421835 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.431345 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.431391 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.431401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.431435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.431445 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.437576 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.454590 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.471997 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.485497 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.497278 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.508354 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.520503 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.530231 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.537473 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.537754 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.537826 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.537889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.538158 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.550229 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
5T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681
bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.561542 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.574274 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.584934 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:14Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.640766 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.640981 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.641072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.641183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.641243 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.743739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.743950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.744011 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.744073 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.744155 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.847238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.847302 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.847322 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.847346 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.847363 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.950176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.950236 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.950258 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.950282 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:14 crc kubenswrapper[4756]: I1205 16:16:14.950299 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:14Z","lastTransitionTime":"2025-12-05T16:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.052196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.052224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.052234 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.052247 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.052257 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.154579 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.154661 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.154682 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.154707 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.154723 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.256732 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.256787 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.256802 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.256822 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.256839 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.359314 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.359363 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.359378 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.359395 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.359409 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.375499 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/0.log" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.378050 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e" exitCode=1 Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.378092 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.378731 4756 scope.go:117] "RemoveContainer" containerID="2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.392550 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.406036 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.420521 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.434903 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.447001 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.458546 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.461113 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.461137 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.461146 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.461160 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.461172 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.477595 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.491907 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.505988 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.516431 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.533226 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.543537 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.556609 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.563720 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.563768 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.563782 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.563802 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.563822 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.569122 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.592944 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0
b092ce91c00322d8b890434e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545505 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545962 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 16:16:14.545982 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 16:16:14.546018 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:14.546024 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 16:16:14.546035 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 16:16:14.546053 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 16:16:14.546060 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:14.546069 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 16:16:14.546070 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 16:16:14.546082 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 16:16:14.546106 6044 factory.go:656] Stopping watch factory\\\\nI1205 16:16:14.546119 6044 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:14.546121 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 
16:16:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:15Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.667174 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.667473 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.667558 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.667624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.667686 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.771522 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.771584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.771594 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.771613 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.771628 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.876570 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.876647 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.876672 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.876694 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.876709 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.980158 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.980223 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.980241 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.980264 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:15 crc kubenswrapper[4756]: I1205 16:16:15.980280 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:15Z","lastTransitionTime":"2025-12-05T16:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.048986 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.049192 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:16 crc kubenswrapper[4756]: E1205 16:16:16.049212 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:16 crc kubenswrapper[4756]: E1205 16:16:16.049357 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.049013 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:16 crc kubenswrapper[4756]: E1205 16:16:16.049589 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.083076 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.083139 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.083156 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.083179 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.083206 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.186162 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.186197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.186205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.186219 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.186229 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.289002 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.289074 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.289085 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.289102 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.289112 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.387844 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/0.log" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.391015 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.391127 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.391191 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.391263 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.391321 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.393217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.493389 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.493509 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.493533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.493564 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.493590 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.597377 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.597828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.598029 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.598228 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.598484 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.702159 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.702226 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.702247 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.702277 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.702300 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.805379 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.805439 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.805452 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.805468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.805478 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.907877 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.907911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.907921 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.907936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:16 crc kubenswrapper[4756]: I1205 16:16:16.907945 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:16Z","lastTransitionTime":"2025-12-05T16:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.010905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.010948 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.010959 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.010976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.010989 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.113637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.113673 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.113685 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.113704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.113715 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.182248 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv"] Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.182746 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.185727 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.186170 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.196673 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.212779 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.216938 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.216973 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.216984 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.217000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.217010 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.227741 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.245559 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.257166 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.267719 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.279680 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.290256 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.305087 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.314331 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.314382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.314509 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h8k5\" (UniqueName: \"kubernetes.io/projected/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-kube-api-access-7h8k5\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.314578 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.316577 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.319316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.319363 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.319377 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.319398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.319412 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.329410 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.352270 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.375669 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.396526 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.409290 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545505 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545962 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 16:16:14.545982 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 16:16:14.546018 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:14.546024 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 16:16:14.546035 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 16:16:14.546053 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 16:16:14.546060 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:14.546069 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 16:16:14.546070 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 16:16:14.546082 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 16:16:14.546106 6044 factory.go:656] Stopping watch factory\\\\nI1205 16:16:14.546119 6044 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:14.546121 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 
16:16:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.415984 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.416063 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.416085 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.416355 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h8k5\" (UniqueName: \"kubernetes.io/projected/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-kube-api-access-7h8k5\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.416793 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.417309 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.423092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.423128 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.423140 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.423156 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.423169 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.433753 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.434306 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.439615 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h8k5\" (UniqueName: \"kubernetes.io/projected/cb44bc23-0bf3-4911-8f7d-e7bbe31b5816-kube-api-access-7h8k5\") pod \"ovnkube-control-plane-749d76644c-9crpv\" (UID: \"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.457225 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.473917 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.494151 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5
bb1f3215ddb1c9a3155bf024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545505 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545962 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 16:16:14.545982 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 16:16:14.546018 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:14.546024 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 16:16:14.546035 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 16:16:14.546053 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 16:16:14.546060 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:14.546069 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 16:16:14.546070 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 16:16:14.546082 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 16:16:14.546106 6044 factory.go:656] Stopping watch factory\\\\nI1205 16:16:14.546119 6044 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:14.546121 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 
16:16:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.497300 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" Dec 05 16:16:17 crc kubenswrapper[4756]: W1205 16:16:17.511154 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb44bc23_0bf3_4911_8f7d_e7bbe31b5816.slice/crio-b35e60b91e4faf04b9861ac0c91d1e8bc57743d04d6e56e346c4ab37279aa451 WatchSource:0}: Error finding container b35e60b91e4faf04b9861ac0c91d1e8bc57743d04d6e56e346c4ab37279aa451: Status 404 returned error can't find the container with id b35e60b91e4faf04b9861ac0c91d1e8bc57743d04d6e56e346c4ab37279aa451 Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.514551 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.525373 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.525403 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.525411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.525435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.525443 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.532229 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.546202 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.557156 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.576007 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.588106 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.600399 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.612996 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.625400 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.627562 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.627621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.627633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.627652 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.627663 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.635973 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.650011 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.663052 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.674677 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.694660 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:17Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.730496 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.730536 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.730547 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.730561 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.730572 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.832841 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.832870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.832878 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.832890 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.832898 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.934830 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.934888 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.934907 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.934931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:17 crc kubenswrapper[4756]: I1205 16:16:17.934950 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:17Z","lastTransitionTime":"2025-12-05T16:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.037957 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.037991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.038000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.038013 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.038023 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.049573 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.049666 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.049754 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:18 crc kubenswrapper[4756]: E1205 16:16:18.049753 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:18 crc kubenswrapper[4756]: E1205 16:16:18.049891 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:18 crc kubenswrapper[4756]: E1205 16:16:18.049998 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.144723 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.144809 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.144831 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.144860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.144883 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.247519 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.247565 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.247577 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.247592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.247604 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.351139 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.351196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.351212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.351234 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.351252 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.401702 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" event={"ID":"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816","Type":"ContainerStarted","Data":"b35e60b91e4faf04b9861ac0c91d1e8bc57743d04d6e56e346c4ab37279aa451"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.453520 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.453603 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.453629 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.453646 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.453656 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.556186 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.556231 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.556248 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.556271 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.556287 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.643495 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8lsn9"] Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.644009 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:18 crc kubenswrapper[4756]: E1205 16:16:18.644067 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.659245 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.659301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.659316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.659337 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.659351 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.659536 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.672576 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.686292 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.700221 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.713020 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.725020 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.730047 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmt7g\" (UniqueName: \"kubernetes.io/projected/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-kube-api-access-hmt7g\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.730114 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.746562 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bb
f3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d
9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.761787 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d
34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.762763 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.762818 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.762838 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.762865 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.762881 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.777905 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.792733 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.814754 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.830980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmt7g\" (UniqueName: \"kubernetes.io/projected/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-kube-api-access-hmt7g\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.831019 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:18 crc kubenswrapper[4756]: E1205 16:16:18.831119 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:18 crc kubenswrapper[4756]: E1205 16:16:18.831170 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs podName:7a3ee468-140c-408e-80a1-aa8fe3fc0acc nodeName:}" failed. 
No retries permitted until 2025-12-05 16:16:19.331155884 +0000 UTC m=+39.569829396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs") pod "network-metrics-daemon-8lsn9" (UID: "7a3ee468-140c-408e-80a1-aa8fe3fc0acc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.840857 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\
\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545505 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545962 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 16:16:14.545982 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 16:16:14.546018 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:14.546024 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 16:16:14.546035 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 16:16:14.546053 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 16:16:14.546060 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:14.546069 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 16:16:14.546070 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 16:16:14.546082 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 16:16:14.546106 6044 factory.go:656] Stopping watch factory\\\\nI1205 16:16:14.546119 6044 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:14.546121 6044 handler.go:208] Removed *v1.EgressIP event 
handler 8\\\\nI1205 16:16:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContain
erStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.846337 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmt7g\" (UniqueName: \"kubernetes.io/projected/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-kube-api-access-hmt7g\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.855340 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.865117 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.865151 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.865162 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.865176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.865188 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.870322 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.880867 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.895507 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.910265 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:18Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.967383 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.967450 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.967462 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.967479 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:18 crc kubenswrapper[4756]: I1205 16:16:18.967490 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:18Z","lastTransitionTime":"2025-12-05T16:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.069951 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.070009 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.070027 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.070050 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.070068 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.173821 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.173873 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.173889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.173917 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.173938 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.276744 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.276792 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.276806 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.276824 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.276835 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.335813 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:19 crc kubenswrapper[4756]: E1205 16:16:19.336128 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:19 crc kubenswrapper[4756]: E1205 16:16:19.336261 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs podName:7a3ee468-140c-408e-80a1-aa8fe3fc0acc nodeName:}" failed. No retries permitted until 2025-12-05 16:16:20.336223669 +0000 UTC m=+40.574897221 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs") pod "network-metrics-daemon-8lsn9" (UID: "7a3ee468-140c-408e-80a1-aa8fe3fc0acc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.379820 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.379884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.379901 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.379924 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.379941 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.409708 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" event={"ID":"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816","Type":"ContainerStarted","Data":"76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.409761 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" event={"ID":"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816","Type":"ContainerStarted","Data":"3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.411636 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/1.log" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.412140 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/0.log" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.417607 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024" exitCode=1 Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.417669 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.417716 4756 scope.go:117] "RemoveContainer" containerID="2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.418822 4756 scope.go:117] "RemoveContainer" 
containerID="fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024" Dec 05 16:16:19 crc kubenswrapper[4756]: E1205 16:16:19.419057 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.435981 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.457044 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.468726 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.481967 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.482269 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.482304 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.482319 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.482340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.482355 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.506645 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c
45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.519306 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.534543 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 
2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.549194 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.559347 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.584688 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.584724 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.584732 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.584748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.584758 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.586261 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545505 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545962 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 16:16:14.545982 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 16:16:14.546018 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:14.546024 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 16:16:14.546035 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 16:16:14.546053 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 16:16:14.546060 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:14.546069 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 16:16:14.546070 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 16:16:14.546082 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 16:16:14.546106 6044 factory.go:656] Stopping watch factory\\\\nI1205 16:16:14.546119 6044 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:14.546121 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 
16:16:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.602148 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.619692 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.638606 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.658822 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.673921 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc 
kubenswrapper[4756]: I1205 16:16:19.684964 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.686591 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.686659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.686678 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.686704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.686725 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.699480 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.711631 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.728502 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.741572 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.755586 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.768811 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.779237 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.789560 4756 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.789624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.789635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.789653 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.789668 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.792527 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.808436 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.824194 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.841042 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.856835 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 
16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.882286 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.891997 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.892053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.892069 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.892093 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.892110 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.900641 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.921393 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.944848 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.958609 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.979731 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5
bb1f3215ddb1c9a3155bf024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545505 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545962 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 16:16:14.545982 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 16:16:14.546018 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:14.546024 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 16:16:14.546035 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 16:16:14.546053 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 16:16:14.546060 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:14.546069 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 16:16:14.546070 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 16:16:14.546082 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 16:16:14.546106 6044 factory.go:656] Stopping watch factory\\\\nI1205 16:16:14.546119 6044 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:14.546121 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 16:16:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"uter]} options:{GoMap:map[requested-tnl-key:2 router-port:rtots-crc]} port_security:{GoSet:[]} tag_request:{GoSet:[]} type:router] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3cb9854d-2900-4fd0-baba-4bfcad667b19}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:18.550710 6213 obj_retry.go:551] Creating *v1.Node crc took: 869.24879ms\\\\nI1205 16:16:18.550773 6213 factory.go:1336] Added *v1.Node event handler 2\\\\nI1205 16:16:18.550806 6213 services_controller.go:189] Starting controller ovn-lb-controller for network=default\\\\nI1205 16:16:18.550882 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:18.551008 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 
16:16:18.551105 6213 factory.go:656] Stopping watch factory\\\\nI1205 16:16:18.551138 6213 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:18.551186 6213 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 16:16:18.551301 6213 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:19Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.994635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.994672 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.994682 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.994696 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:19 crc kubenswrapper[4756]: I1205 16:16:19.994708 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:19Z","lastTransitionTime":"2025-12-05T16:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.049353 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:20 crc kubenswrapper[4756]: E1205 16:16:20.049503 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.049627 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.049719 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:20 crc kubenswrapper[4756]: E1205 16:16:20.049877 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.049958 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:20 crc kubenswrapper[4756]: E1205 16:16:20.050061 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:20 crc kubenswrapper[4756]: E1205 16:16:20.050163 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.076170 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917
bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.087293 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z"
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.097217 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.097247 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.097256 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.097269 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.097278 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.101314 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.131948 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a
5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.148105 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.163682 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.178655 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.192544 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.199747 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.199815 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.199835 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.199855 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.199869 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.203611 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.225875 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5
bb1f3215ddb1c9a3155bf024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545505 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545962 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 16:16:14.545982 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 16:16:14.546018 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:14.546024 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 16:16:14.546035 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 16:16:14.546053 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 16:16:14.546060 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:14.546069 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 16:16:14.546070 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 16:16:14.546082 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 16:16:14.546106 6044 factory.go:656] Stopping watch factory\\\\nI1205 16:16:14.546119 6044 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:14.546121 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 16:16:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"uter]} options:{GoMap:map[requested-tnl-key:2 router-port:rtots-crc]} port_security:{GoSet:[]} tag_request:{GoSet:[]} type:router] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3cb9854d-2900-4fd0-baba-4bfcad667b19}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:18.550710 6213 obj_retry.go:551] Creating *v1.Node crc took: 869.24879ms\\\\nI1205 16:16:18.550773 6213 factory.go:1336] Added *v1.Node event handler 2\\\\nI1205 16:16:18.550806 6213 services_controller.go:189] Starting controller ovn-lb-controller for network=default\\\\nI1205 16:16:18.550882 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:18.551008 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 
16:16:18.551105 6213 factory.go:656] Stopping watch factory\\\\nI1205 16:16:18.551138 6213 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:18.551186 6213 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 16:16:18.551301 6213 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.239028 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.249587 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc 
kubenswrapper[4756]: I1205 16:16:20.264188 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.276614 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.287454 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.301825 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.302134 4756 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.302154 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.302165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.302179 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.302189 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.317295 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:20Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.348890 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:20 crc kubenswrapper[4756]: E1205 16:16:20.349005 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:20 crc kubenswrapper[4756]: E1205 16:16:20.349046 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs podName:7a3ee468-140c-408e-80a1-aa8fe3fc0acc nodeName:}" failed. No retries permitted until 2025-12-05 16:16:22.349032766 +0000 UTC m=+42.587706268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs") pod "network-metrics-daemon-8lsn9" (UID: "7a3ee468-140c-408e-80a1-aa8fe3fc0acc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.405725 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.406036 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.406195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.406450 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.406679 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.422254 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/1.log"
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.509546 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.509841 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.509983 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.510128 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.510293 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.614008 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.614344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.614378 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.614402 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
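The "No retries permitted until ... (durationBeforeRetry 2s)" error a few entries above shows how the kubelet's volume manager paces retries: a failed MountVolume.SetUp is blocked until a deadline, and the wait grows on repeated failures. A generic sketch of that doubling-backoff pattern, with illustrative constants rather than the kubelet's actual ones:

```go
// Generic sketch of the retry backoff visible in the nestedpendingoperations
// entry above ("No retries permitted until ... (durationBeforeRetry 2s)"):
// after each failure the operation is blocked until a deadline, and the wait
// roughly doubles up to a cap. Constants here are illustrative.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 2 * time.Second         // the durationBeforeRetry reported in the log
	const maxDelay = 2 * time.Minute // illustrative upper bound
	deadline := time.Now()
	for attempt := 1; attempt <= 5; attempt++ {
		deadline = deadline.Add(delay)
		// Until the deadline passes, further attempts are rejected outright,
		// which is what "No retries permitted until <t>" records.
		fmt.Printf("attempt %d failed; no retries permitted until %s (durationBeforeRetry %s)\n",
			attempt, deadline.Format("15:04:05"), delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```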
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.614457 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.716603 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.716785 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.716872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.716960 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.717055 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.819771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.819799 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.819806 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.819819 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.819827 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.922249 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.922339 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.922364 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.922396 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:20 crc kubenswrapper[4756]: I1205 16:16:20.922457 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:20Z","lastTransitionTime":"2025-12-05T16:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.024810 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.024900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.024926 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.024958 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.024982 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.127175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.127505 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.127533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.127567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.127591 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.230799 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.230877 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.230898 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.230927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.230949 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.333712 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.333783 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.333799 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.333824 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.333841 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.402682 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.402728 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.402740 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.402769 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
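The next entry is the node-level counterpart of the pod status failures above: the same webhook rejects the kubelet's node status patch because its serving certificate expired on 2025-08-24T17:21:41Z. A small diagnostic sketch of the validity-window check that fails here, assuming the certificate lives under the /etc/webhook-cert/ mount path listed earlier for the webhook container (tls.crt is an assumed file name):

```go
// Diagnostic sketch for the recurring "x509: certificate has expired or is
// not yet valid" failures: it reproduces the NotBefore/NotAfter window check
// that TLS verification applies when the kubelet posts to the webhook at
// https://127.0.0.1:9743. The certificate path is assumed, not confirmed.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("/etc/webhook-cert/tls.crt") // assumed location
	if err != nil {
		fmt.Println("read certificate:", err)
		os.Exit(1)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		fmt.Println("no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Println("parse certificate:", err)
		os.Exit(1)
	}
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		// The condition the log reports: "current time ... is after 2025-08-24T17:21:41Z".
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	default:
		fmt.Printf("valid until %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	}
}
```

Until that certificate is rotated, every pod and node status patch in this section keeps failing the same way, which is why the errors below repeat with only the timestamps advancing.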
Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.402783 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:21 crc kubenswrapper[4756]: E1205 16:16:21.421935 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:21Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.425618 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.425734 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.425755 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.425821 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.425843 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: E1205 16:16:21.447715 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:21Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.452936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.453170 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.453501 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.453670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.453802 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: E1205 16:16:21.471757 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:21Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.475448 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.475512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
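Each of the patch failures above and below has the same root cause: the admission webhook "node.network-node-identity.openshift.io" at https://127.0.0.1:9743 is serving an x509 certificate whose notAfter of 2025-08-24T17:21:41Z is months behind the node clock of 2025-12-05T16:16:21Z, so the API server's TLS handshake with the webhook fails and every node-status patch is rejected. A minimal sketch for confirming the served certificate's validity window from the node, assuming the endpoint is reachable without a client certificate and that the third-party cryptography package is installed (Python):

    import ssl
    from cryptography import x509

    # Host and port are taken from the webhook URL in the log entry above.
    pem = ssl.get_server_certificate(("127.0.0.1", 9743))
    cert = x509.load_pem_x509_certificate(pem.encode())

    # If the same certificate is still being served, notAfter should print
    # 2025-08-24 17:21:41, matching the kubelet's x509 error.
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)

Until the certificate is rotated or the clock skew is resolved, the kubelet keeps retrying and logging the identical error, which is why the entries below repeat with the same payload.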
event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.475526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.475541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.475551 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: E1205 16:16:21.493127 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:21Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.496808 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.496840 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
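Independent of the webhook failure, the Ready condition repeated above and below stays False because the container runtime reports no CNI network configuration on disk. A small sketch of the check that message implies, assuming it runs on the node itself and following the CNI convention that configurations are .conf, .conflist, or .json files in the directory named by the kubelet (Python):

    from pathlib import Path

    # Directory the kubelet reports as empty; the cluster network plugin
    # (OVN-Kubernetes on OpenShift) writes its config here once it is up.
    cni_dir = Path("/etc/kubernetes/cni/net.d")
    configs = sorted(p for p in cni_dir.glob("*")
                     if p.suffix in {".conf", ".conflist", ".json"})

    print(f"{len(configs)} CNI config file(s) in {cni_dir}")
    for p in configs:
        print(" -", p.name)
    # Zero files matches the recurring NetworkPluginNotReady message.

As long as that directory stays empty, the kubelet republishes the same NodeNotReady condition on every sync, which accounts for most of the repetition in this log.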
event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.496847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.496860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.496869 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: E1205 16:16:21.513316 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:21Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:21 crc kubenswrapper[4756]: E1205 16:16:21.513482 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.515063 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
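The "exceeds retry count" entry above marks the kubelet giving up on this status sync: it attempts the patch a fixed number of times per sync loop (the nodeStatusUpdateRetry constant in the kubelet source, historically 5 upstream) and then waits for the next sync period, so the webhook error arrives in bursts rather than continuously. An illustrative sketch of that shape, where the function name and retry value are assumptions for illustration rather than the kubelet's exact code (Python):

    NODE_STATUS_UPDATE_RETRY = 5  # assumed value of kubelet's nodeStatusUpdateRetry

    def sync_node_status(try_patch) -> bool:
        # try_patch() stands in for one PATCH of the node's status subresource.
        for _ in range(NODE_STATUS_UPDATE_RETRY):
            if try_patch():
                return True   # success ends the burst of retries
        # Corresponds to: "Unable to update node status" err="update node
        # status exceeds retry count" — give up until the next sync period.
        return False

Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.515063 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc"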
event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.515104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.515115 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.515130 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.515141 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.618074 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.618192 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.618217 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.618247 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.618272 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.721750 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.721797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.721809 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.721834 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.721847 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.825555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.825629 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.825652 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.825679 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.825700 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.928239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.928471 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.928481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.928498 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:21 crc kubenswrapper[4756]: I1205 16:16:21.928507 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:21Z","lastTransitionTime":"2025-12-05T16:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.032098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.032165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.032184 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.032209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.032226 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.049530 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.049610 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.049626 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:22 crc kubenswrapper[4756]: E1205 16:16:22.049698 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:22 crc kubenswrapper[4756]: E1205 16:16:22.049822 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.049935 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:22 crc kubenswrapper[4756]: E1205 16:16:22.050076 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:22 crc kubenswrapper[4756]: E1205 16:16:22.050213 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.136145 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.136218 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.136252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.136281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.136301 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.239936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.240013 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.240031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.240060 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.240084 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.342364 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.342484 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.342510 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.342538 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.342558 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.374089 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:22 crc kubenswrapper[4756]: E1205 16:16:22.374261 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:22 crc kubenswrapper[4756]: E1205 16:16:22.374469 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs podName:7a3ee468-140c-408e-80a1-aa8fe3fc0acc nodeName:}" failed. No retries permitted until 2025-12-05 16:16:26.374389242 +0000 UTC m=+46.613062794 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs") pod "network-metrics-daemon-8lsn9" (UID: "7a3ee468-140c-408e-80a1-aa8fe3fc0acc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.445017 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.445053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.445061 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.445077 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.445086 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.547473 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.547796 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.547917 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.548031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.548135 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.650898 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.650940 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.650954 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.650972 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.650984 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.753064 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.753103 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.753113 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.753129 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.753139 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.856587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.856643 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.856660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.856683 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.856700 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.959487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.959553 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.959573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.959614 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:22 crc kubenswrapper[4756]: I1205 16:16:22.959648 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:22Z","lastTransitionTime":"2025-12-05T16:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.062111 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.062166 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.062178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.062195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.062207 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.165537 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.165600 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.165617 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.165640 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.165657 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.268636 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.268882 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.268952 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.269015 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.269073 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.371578 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.371621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.371633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.371648 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.371660 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.475151 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.476023 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.476189 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.476396 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.476639 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.580389 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.580474 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.580486 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.580511 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.580524 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.683282 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.683337 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.683354 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.683379 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.683399 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.786758 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.786828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.786847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.786871 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.786888 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.890022 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.890106 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.890129 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.890154 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.890171 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.993075 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.993117 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.993127 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.993140 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:23 crc kubenswrapper[4756]: I1205 16:16:23.993150 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:23Z","lastTransitionTime":"2025-12-05T16:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.049073 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.049182 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:24 crc kubenswrapper[4756]: E1205 16:16:24.049253 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.049277 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:24 crc kubenswrapper[4756]: E1205 16:16:24.049379 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:24 crc kubenswrapper[4756]: E1205 16:16:24.049530 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.049713 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:24 crc kubenswrapper[4756]: E1205 16:16:24.050262 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.095866 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.096212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.096275 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.096344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.096440 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.199252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.199620 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.199752 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.199873 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.199985 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.302444 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.302521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.302544 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.302575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.302598 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.405538 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.405915 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.406047 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.406183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.406306 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.508338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.508376 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.508386 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.508401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.508413 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.611796 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.612214 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.612358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.613081 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.613260 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.716081 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.716137 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.716156 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.716191 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.716226 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.818651 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.818719 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.818744 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.818774 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.818797 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.921223 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.921251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.921296 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.921313 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:24 crc kubenswrapper[4756]: I1205 16:16:24.921323 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:24Z","lastTransitionTime":"2025-12-05T16:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.024006 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.024073 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.024135 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.024166 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.024188 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.127246 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.127303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.127321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.127347 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.127368 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.230081 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.230141 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.230158 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.230181 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.230198 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.333666 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.333902 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.333973 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.334046 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.334143 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.437064 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.437340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.437446 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.437530 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.437611 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.540545 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.540603 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.540622 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.540651 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.540672 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.643462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.643499 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.643508 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.643555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.643564 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.745886 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.745930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.745942 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.745959 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.745970 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.848360 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.848447 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.848457 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.848471 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.848480 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.951466 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.951510 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.951518 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.951533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:25 crc kubenswrapper[4756]: I1205 16:16:25.951542 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:25Z","lastTransitionTime":"2025-12-05T16:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.048973 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.049006 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:26 crc kubenswrapper[4756]: E1205 16:16:26.049169 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.049225 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:26 crc kubenswrapper[4756]: E1205 16:16:26.049350 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.049583 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:26 crc kubenswrapper[4756]: E1205 16:16:26.049625 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:26 crc kubenswrapper[4756]: E1205 16:16:26.049662 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.052898 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.052931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.052942 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.052958 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.052969 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.155087 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.155398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.155507 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.155597 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.155786 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.258501 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.258559 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.258573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.258594 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.258610 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.361862 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.361929 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.361942 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.361961 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.361974 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.418773 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:26 crc kubenswrapper[4756]: E1205 16:16:26.418987 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:26 crc kubenswrapper[4756]: E1205 16:16:26.419625 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs podName:7a3ee468-140c-408e-80a1-aa8fe3fc0acc nodeName:}" failed. No retries permitted until 2025-12-05 16:16:34.419594914 +0000 UTC m=+54.658268446 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs") pod "network-metrics-daemon-8lsn9" (UID: "7a3ee468-140c-408e-80a1-aa8fe3fc0acc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.464885 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.464970 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.464988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.465017 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.465036 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.567790 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.567857 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.567879 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.567908 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.567926 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
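The nestedpendingoperations.go:348 entry shows the volume manager's exponential backoff at work: this failure forbids retries for 8s (no new mount attempt before 16:16:34), and each successive failure doubles the wait up to a cap. A generic sketch of that doubling-backoff pattern, not the kubelet's actual implementation:

// Generic sketch of the doubling backoff visible in the
// nestedpendingoperations entry above ("No retries permitted until ...
// (durationBeforeRetry 8s)"): every failed attempt doubles the wait, up to
// a cap. Illustration of the pattern only, not the kubelet's code; the
// initial and maximum durations here are assumptions for the demo.
package main

import (
	"errors"
	"fmt"
	"time"
)

const (
	initialBackoff = 500 * time.Millisecond
	maxBackoff     = 2 * time.Minute
)

// retryMount keeps calling mount until it succeeds, doubling the delay
// between attempts after every failure.
func retryMount(mount func() error) {
	backoff := initialBackoff
	for {
		err := mount()
		if err == nil {
			return
		}
		fmt.Printf("mount failed: %v; no retries permitted for %s\n", err, backoff)
		time.Sleep(backoff)
		if backoff *= 2; backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}

func main() {
	attempts := 0
	retryMount(func() error {
		attempts++
		if attempts < 5 {
			// Stand-in for the real failure mode seen in the log: the
			// secret backing the volume is not yet registered.
			return errors.New(`object "openshift-multus"/"metrics-daemon-secret" not registered`)
		}
		return nil
	})
	fmt.Println("mounted after", attempts, "attempts")
}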
Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.670625 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.670671 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.670682 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.670698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.670710 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.773846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.774348 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.774637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.774849 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.775024 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.878135 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.878671 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.878835 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.879009 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.879166 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.982905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.982980 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.982995 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.983014 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:26 crc kubenswrapper[4756]: I1205 16:16:26.983026 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:26Z","lastTransitionTime":"2025-12-05T16:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.085940 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.086008 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.086025 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.086047 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.086062 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.188476 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.188508 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.188518 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.188532 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.188541 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.291574 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.291978 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.292074 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.292160 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.292231 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.394861 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.394916 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.394935 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.394958 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.394977 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.497469 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.497541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.497577 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.497678 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.497709 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.599846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.599906 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.599917 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.599935 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.599947 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.702217 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.702289 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.702311 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.702337 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.702355 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.805135 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.805204 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.805228 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.805261 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.805283 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.908938 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.909020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.909036 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.909056 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:27 crc kubenswrapper[4756]: I1205 16:16:27.909066 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:27Z","lastTransitionTime":"2025-12-05T16:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.012390 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.012646 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.012660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.012682 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.012703 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.049273 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.049346 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.049345 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.049529 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:28 crc kubenswrapper[4756]: E1205 16:16:28.049530 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:28 crc kubenswrapper[4756]: E1205 16:16:28.049766 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:28 crc kubenswrapper[4756]: E1205 16:16:28.049910 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:28 crc kubenswrapper[4756]: E1205 16:16:28.050030 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.115272 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.115369 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.115388 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.115484 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.115504 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.218025 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.218071 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.218084 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.218103 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.218115 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.320885 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.320925 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.320934 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.320948 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.320958 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.423355 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.423431 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.423446 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.423462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.423473 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.525168 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.525217 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.525229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.525247 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.525260 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.627851 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.627941 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.627969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.627998 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.628017 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.736586 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.736650 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.736660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.736680 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.736691 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.838762 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.838827 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.838845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.838870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.838886 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.942538 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.942582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.942594 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.942610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:28 crc kubenswrapper[4756]: I1205 16:16:28.942622 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:28Z","lastTransitionTime":"2025-12-05T16:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.045215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.045459 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.045479 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.045501 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.045520 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.148502 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.148832 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.148909 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.148991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.149059 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.252231 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.252303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.252316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.252335 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.252869 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.355725 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.355757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.355764 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.355777 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.355785 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.458643 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.458706 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.458717 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.458752 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.458770 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.561629 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.561690 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.561700 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.561722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.561735 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.664046 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.664089 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.664114 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.664140 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.664157 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.767000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.767038 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.767053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.767072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.767088 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.855585 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.855683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.855712 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:17:01.855685073 +0000 UTC m=+82.094358585 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.855758 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.855792 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.855828 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.855910 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.855921 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.855927 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.855999 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.856014 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.856025 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.855943 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 
16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.856077 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.855951 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:17:01.855941349 +0000 UTC m=+82.094614861 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.856177 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:17:01.856151645 +0000 UTC m=+82.094825197 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.856217 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 16:17:01.856201906 +0000 UTC m=+82.094875448 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:16:29 crc kubenswrapper[4756]: E1205 16:16:29.856249 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:17:01.856238317 +0000 UTC m=+82.094911869 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.870344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.870405 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.870447 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.870472 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.870493 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.973625 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.973705 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.973723 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.973749 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:29 crc kubenswrapper[4756]: I1205 16:16:29.973766 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:29Z","lastTransitionTime":"2025-12-05T16:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.049902 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.049943 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:30 crc kubenswrapper[4756]: E1205 16:16:30.050301 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.051040 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:30 crc kubenswrapper[4756]: E1205 16:16:30.051146 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.051193 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:30 crc kubenswrapper[4756]: E1205 16:16:30.051265 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:30 crc kubenswrapper[4756]: E1205 16:16:30.051353 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.074030 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.077008 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.077062 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.077085 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.077116 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.077138 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.092791 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.110514 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.131186 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.150451 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc 
kubenswrapper[4756]: I1205 16:16:30.165815 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.179776 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.180306 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.180340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.180349 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.180368 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: 
I1205 16:16:30.180380 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.191854 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.217205 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\
"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b7139
86f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.236065 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.254180 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.272582 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.285037 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.285130 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.285216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.286255 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.286280 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.290877 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c
45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.303143 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.315985 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.331486 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.352171 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5
bb1f3215ddb1c9a3155bf024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c3385459f3faf65e2bf7570657355eccf9e24f0b092ce91c00322d8b890434e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"message\\\":\\\" 6044 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545505 6044 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 16:16:14.545962 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 16:16:14.545982 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 16:16:14.546018 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:14.546024 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 16:16:14.546035 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 16:16:14.546053 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 16:16:14.546060 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:14.546069 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 16:16:14.546070 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 16:16:14.546082 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 16:16:14.546106 6044 factory.go:656] Stopping watch factory\\\\nI1205 16:16:14.546119 6044 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:14.546121 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 16:16:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"uter]} options:{GoMap:map[requested-tnl-key:2 router-port:rtots-crc]} port_security:{GoSet:[]} tag_request:{GoSet:[]} type:router] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3cb9854d-2900-4fd0-baba-4bfcad667b19}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:18.550710 6213 obj_retry.go:551] Creating *v1.Node crc took: 869.24879ms\\\\nI1205 16:16:18.550773 6213 factory.go:1336] Added *v1.Node event handler 2\\\\nI1205 16:16:18.550806 6213 services_controller.go:189] Starting controller ovn-lb-controller for network=default\\\\nI1205 16:16:18.550882 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:18.551008 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 
16:16:18.551105 6213 factory.go:656] Stopping watch factory\\\\nI1205 16:16:18.551138 6213 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:18.551186 6213 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 16:16:18.551301 6213 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:30Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.388841 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.388891 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.388907 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.388931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.388950 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.491544 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.491608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.491624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.491650 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.491668 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.595306 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.595359 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.595376 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.595401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.595455 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.698904 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.698971 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.698993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.699021 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.699042 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.802316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.802480 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.802564 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.802589 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.802608 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.906487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.906541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.906551 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.906572 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:30 crc kubenswrapper[4756]: I1205 16:16:30.906588 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:30Z","lastTransitionTime":"2025-12-05T16:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.009814 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.009875 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.009887 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.009905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.009919 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.112775 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.112845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.112865 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.112892 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.112910 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.215125 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.215176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.215188 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.215205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.215218 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.318264 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.318321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.318332 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.318352 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.318364 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.420690 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.420728 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.420740 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.420757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.420769 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.523145 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.523195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.523206 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.523219 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.523227 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.625541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.625649 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.625658 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.625674 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.625683 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.728905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.728981 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.728994 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.729063 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.729079 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.832384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.832464 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.832506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.832569 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.832587 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.860239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.860294 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.860308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.860326 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.860338 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: E1205 16:16:31.874829 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:31Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.879777 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.879928 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.879944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.879962 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.879974 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: E1205 16:16:31.899534 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:31Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.903649 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.903712 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.903738 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.903767 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.903788 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: E1205 16:16:31.920034 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:31Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.924438 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.924475 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.924488 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.924505 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.924517 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: E1205 16:16:31.940508 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:31Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.944190 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.944253 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.944271 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.944295 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.944311 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:31 crc kubenswrapper[4756]: E1205 16:16:31.961138 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:31Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:31 crc kubenswrapper[4756]: E1205 16:16:31.961367 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.962957 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
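Every rejected patch in this stretch of the log fails at the same point: the network-node-identity admission webhook on 127.0.0.1:9743 is serving a certificate whose validity window closed at 2025-08-24T17:21:41Z, more than three months before the node clock (2025-12-05T16:16:31Z). A minimal sketch of how this can be confirmed from the node, assuming only that the endpoint named in the log is reachable locally (the program itself is illustrative, not part of any OpenShift tooling):

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Dial the webhook endpoint named in the log. Verification is skipped
	// deliberately so the expired certificate can still be inspected.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Println("expired:  ", now.After(cert.NotAfter))
}

Run against the endpoint above, this would be expected to print notAfter 2025-08-24T17:21:41Z and expired true. Since x509 validity is a pure clock comparison, retrying the patch with an identical payload, as the kubelet does here, can only produce the same rejection until the certificate is rotated.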
event="NodeHasSufficientMemory" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.963009 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.963028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.963052 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:31 crc kubenswrapper[4756]: I1205 16:16:31.963071 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:31Z","lastTransitionTime":"2025-12-05T16:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.049779 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.049803 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.050033 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.050077 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:32 crc kubenswrapper[4756]: E1205 16:16:32.050339 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:32 crc kubenswrapper[4756]: E1205 16:16:32.050459 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:32 crc kubenswrapper[4756]: E1205 16:16:32.050567 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.050607 4756 scope.go:117] "RemoveContainer" containerID="fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024" Dec 05 16:16:32 crc kubenswrapper[4756]: E1205 16:16:32.050717 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.065677 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.065739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.065758 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.065785 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.065802 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.071388 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.096044 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.114300 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.134075 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 
16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.169481 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.171065 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.171112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.171129 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.171151 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.171167 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.194478 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.217237 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.235853 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.263100 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.274461 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.274504 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:32 crc 
kubenswrapper[4756]: I1205 16:16:32.274521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.274555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.274568 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.282616 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.297510 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.316662 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5
bb1f3215ddb1c9a3155bf024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"uter]} options:{GoMap:map[requested-tnl-key:2 router-port:rtots-crc]} port_security:{GoSet:[]} tag_request:{GoSet:[]} type:router] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3cb9854d-2900-4fd0-baba-4bfcad667b19}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:18.550710 6213 obj_retry.go:551] Creating *v1.Node crc took: 869.24879ms\\\\nI1205 16:16:18.550773 6213 factory.go:1336] Added *v1.Node event handler 2\\\\nI1205 16:16:18.550806 6213 services_controller.go:189] Starting controller ovn-lb-controller for network=default\\\\nI1205 16:16:18.550882 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:18.551008 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:18.551105 6213 factory.go:656] Stopping watch factory\\\\nI1205 16:16:18.551138 6213 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:18.551186 6213 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 16:16:18.551301 6213 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
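
Every "Failed to update status for pod" entry above fails for the same reason, repeated verbatim at the end of each entry: the API server's call to the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 fails TLS verification because the webhook's serving certificate expired at 2025-08-24T17:21:41Z, just over 102 days before the node clock's 2025-12-05T16:16:32Z. The failures are therefore not pod-specific; node-ca, node-resolver, ovnkube-control-plane, etcd-crc, kube-controller-manager-crc, kube-apiserver-crc, the two network-check pods, iptables-alerter, multus-additional-cni-plugins and ovnkube-node all hit the same wall. A minimal sketch to confirm the expiry from the node itself, assuming Python 3 with the third-party cryptography package (version 42 or later) is available there; an equivalent check is openssl s_client -connect 127.0.0.1:9743 </dev/null | openssl x509 -noout -enddate.

    import ssl
    from datetime import datetime, timezone

    from cryptography import x509  # third-party; assumed installed on the node

    # 127.0.0.1:9743 is the webhook endpoint quoted in the log entries above.
    # get_server_certificate() does not verify the peer, which is the point:
    # we want to read notAfter from a certificate that is already invalid.
    pem = ssl.get_server_certificate(("127.0.0.1", 9743))
    cert = x509.load_pem_x509_certificate(pem.encode())

    not_after = cert.not_valid_after_utc   # expect 2025-08-24 17:21:41+00:00
    print("notAfter:", not_after)
    print("expired for:", datetime.now(timezone.utc) - not_after)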
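
Because that webhook/TLS error text is byte-for-byte identical in every entry, the blast radius can be tallied mechanically instead of being read out of the escaped JSON payloads. A sketch over a saved excerpt like this one; the message format is copied from the klog header of the entries above.

    import re
    from collections import Counter

    # Matches the header of each failure entry, e.g.
    #   "Failed to update status for pod" pod="openshift-etcd/etcd-crc"
    FAIL = re.compile(r'"Failed to update status for pod" pod="([^"]+)"')

    def failing_pods(journal_text: str) -> Counter:
        """Count status-patch failures per namespace/pod in a journal dump."""
        return Counter(FAIL.findall(journal_text))

On this excerpt the counter returns every workload on the node exactly once per status_manager retry cycle, which is what a single shared root cause looks like.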
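
The payloads themselves are recoverable too. Each entry embeds the JSON strategic-merge patch with its quotes escaped twice: once when the patch was quoted into the Go error string, and once more when klog quoted the whole message into the journal's err="..." field (hence the \\\" runs). A hedged sketch, assuming it is fed one complete journal entry at a time:

    import json
    import re

    def extract_status_patch(raw_entry: str) -> dict:
        # The payload sits between ...status \"{ and }\" for pod... in the
        # raw journal text shown above.
        m = re.search(r'failed to patch status \\"(\{.*\})\\" for pod', raw_entry)
        if m is None:
            raise ValueError("no status patch in this entry")
        body = m.group(1)
        # Undo both escaping layers; the NUL placeholder keeps \\ and \"
        # from interfering with each other within one pass.
        for _ in range(2):
            body = (body.replace('\\\\', '\x00')
                        .replace('\\"', '"')
                        .replace('\x00', '\\'))
        return json.loads(body)

Applied to the node-ca-bz9nh entry, for example, this yields the patch object whose conditions and containerStatuses blocks appear in escaped form above.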
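
The "Node became not ready" conditions at 16:16:32.171167 and 16:16:32.274568 are the second symptom: the kubelet finds no CNI configuration in /etc/kubernetes/cni/net.d/, while ovnkube-controller, the container that mounts the host CNI directories (host-cni-netd at /etc/cni/net.d in the volumeMounts above) and is responsible for writing the OVN-Kubernetes network config, exited with code 1 at 16:16:18 and is now in CrashLoopBackOff. A trivial check, assuming it runs on the node:

    import glob
    import os

    # The kubelet names /etc/kubernetes/cni/net.d/ as the directory it polls
    # for a network config; while it is empty, NetworkReady stays false and
    # the node reports NotReady even though its static pods are Running.
    cni_dir = "/etc/kubernetes/cni/net.d"
    confs = sorted(glob.glob(os.path.join(cni_dir, "*.conf")) +
                   glob.glob(os.path.join(cni_dir, "*.conflist")))
    print(confs if confs else f"no CNI config in {cni_dir}; node stays NotReady")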
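
Finally, the waiting reason "back-off 10s restarting failed container=ovnkube-controller" will escalate on each further crash. The usual kubelet schedule is an implementation detail, so treat these numbers as an assumption for this build: a 10s initial delay, doubling per restart, capped at five minutes, and reset only after the container runs cleanly for ten minutes. While ovnkube-controller keeps failing, its restarts ride this curve:

    # Sketch of the assumed CrashLoopBackOff schedule: 10s initial delay,
    # doubling per restart, capped at 300s.
    delays, d = [], 10
    for _ in range(8):
        delays.append(min(d, 300))
        d *= 2
    print(delays)   # [10, 20, 40, 80, 160, 300, 300, 300]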
pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.329266 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.349954 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.363946 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.377746 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.377821 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.377847 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.377881 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.377905 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.378828 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.404079 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.466866 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/1.log" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.470548 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.470964 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.479709 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.479748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.479757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.479771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.479779 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.495258 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.513130 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.533706 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497
e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"uter]} options:{GoMap:map[requested-tnl-key:2 router-port:rtots-crc]} port_security:{GoSet:[]} tag_request:{GoSet:[]} type:router] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3cb9854d-2900-4fd0-baba-4bfcad667b19}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:18.550710 6213 obj_retry.go:551] Creating *v1.Node crc took: 869.24879ms\\\\nI1205 16:16:18.550773 6213 factory.go:1336] Added *v1.Node event handler 2\\\\nI1205 16:16:18.550806 6213 services_controller.go:189] Starting controller ovn-lb-controller for network=default\\\\nI1205 16:16:18.550882 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:18.551008 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:18.551105 6213 factory.go:656] Stopping watch factory\\\\nI1205 16:16:18.551138 6213 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:18.551186 6213 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 16:16:18.551301 6213 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.547177 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.558268 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.568642 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.578767 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.582404 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.582443 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.582452 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.582465 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.582475 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.589258 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.600511 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.609290 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.627534 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2
ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.642722 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.664781 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.682735 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.684342 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.684389 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.684404 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.684447 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.684468 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.699849 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.710553 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.719723 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:32Z is after 2025-08-24T17:21:41Z" Dec 05 
16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.787233 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.787288 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.787299 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.787316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.787329 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.891065 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.891125 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.891140 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.891166 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.891178 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.993344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.993374 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.993382 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.993395 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:32 crc kubenswrapper[4756]: I1205 16:16:32.993403 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:32Z","lastTransitionTime":"2025-12-05T16:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.096310 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.096356 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.096372 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.096394 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.096411 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.199657 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.199703 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.199714 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.199731 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.199742 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.301742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.301787 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.301800 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.301817 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.301828 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.404721 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.404778 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.404794 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.404814 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.404830 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.476347 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/2.log"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.476911 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/1.log"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.480634 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609" exitCode=1
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.480682 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609"}
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.480744 4756 scope.go:117] "RemoveContainer" containerID="fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.481662 4756 scope.go:117] "RemoveContainer" containerID="024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609"
Dec 05 16:16:33 crc kubenswrapper[4756]: E1205 16:16:33.482004 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.499089 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.512173 4756 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.512233 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.512254 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.512842 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.512884 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.514846 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.533667 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.553031 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.572058 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.595082 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.612377 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.615717 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.615811 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.615828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.615852 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.615869 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.629504 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.657947 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.673011 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.694078 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497
e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc2dc870e3d21c284b50f933c9036cb2e2a789c5bb1f3215ddb1c9a3155bf024\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"uter]} options:{GoMap:map[requested-tnl-key:2 router-port:rtots-crc]} port_security:{GoSet:[]} tag_request:{GoSet:[]} type:router] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:8b3ef3d2-aa1a-4ff5-b390-b2bd8f0241f3}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3cb9854d-2900-4fd0-baba-4bfcad667b19}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:18.550710 6213 obj_retry.go:551] Creating *v1.Node crc took: 869.24879ms\\\\nI1205 16:16:18.550773 6213 factory.go:1336] Added *v1.Node event handler 2\\\\nI1205 16:16:18.550806 6213 services_controller.go:189] Starting controller ovn-lb-controller for network=default\\\\nI1205 16:16:18.550882 6213 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 16:16:18.551008 6213 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 16:16:18.551105 6213 factory.go:656] Stopping watch factory\\\\nI1205 16:16:18.551138 6213 ovnkube.go:599] Stopped ovnkube\\\\nI1205 16:16:18.551186 6213 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 16:16:18.551301 6213 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.709782 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.719212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.719272 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.719289 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.719311 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.719329 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.725407 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.738233 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.749669 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.764528 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.777116 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:33Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:33 crc 
kubenswrapper[4756]: I1205 16:16:33.822141 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.822194 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.822206 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.822224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.822236 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.924969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.924999 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.925007 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.925020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:33 crc kubenswrapper[4756]: I1205 16:16:33.925030 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:33Z","lastTransitionTime":"2025-12-05T16:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.027088 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.027143 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.027155 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.027171 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.027180 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.048945 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.048975 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.049018 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:34 crc kubenswrapper[4756]: E1205 16:16:34.049581 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.049664 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:16:34 crc kubenswrapper[4756]: E1205 16:16:34.049819 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:16:34 crc kubenswrapper[4756]: E1205 16:16:34.049934 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:16:34 crc kubenswrapper[4756]: E1205 16:16:34.050035 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.129957 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.129995 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.130005 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.130019 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.130030 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.233628 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.233689 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.233709 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.233736 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.233758 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.336347 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.336385 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.336394 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.336408 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.336440 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.438673 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.438701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.438710 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.438723 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.438732 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.487089 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/2.log" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.490603 4756 scope.go:117] "RemoveContainer" containerID="024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609" Dec 05 16:16:34 crc kubenswrapper[4756]: E1205 16:16:34.490780 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.507169 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:34 crc kubenswrapper[4756]: E1205 16:16:34.508227 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:34 crc kubenswrapper[4756]: E1205 16:16:34.508377 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs podName:7a3ee468-140c-408e-80a1-aa8fe3fc0acc nodeName:}" failed. No retries permitted until 2025-12-05 16:16:50.508335206 +0000 UTC m=+70.747008868 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs") pod "network-metrics-daemon-8lsn9" (UID: "7a3ee468-140c-408e-80a1-aa8fe3fc0acc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.525862 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.541849 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.541911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.541922 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.541946 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.541962 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.546229 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.563839 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.581924 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.599044 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.614333 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.627987 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc 
kubenswrapper[4756]: I1205 16:16:34.645185 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.645241 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.645255 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.645279 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.645296 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.645327 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 
16:16:34.658467 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.674029 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.694940 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.713297 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.735611 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.748387 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.748472 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc 
kubenswrapper[4756]: I1205 16:16:34.748485 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.748512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.748530 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.754372 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.771888 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.797951 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\
"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.816314 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:34Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.851397 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.851480 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.851500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.851520 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.851536 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.954465 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.954522 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.954537 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.954561 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:34 crc kubenswrapper[4756]: I1205 16:16:34.954576 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:34Z","lastTransitionTime":"2025-12-05T16:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.057748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.057836 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.057860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.057891 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.057913 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.160234 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.160296 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.160313 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.160335 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.160352 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.263642 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.263717 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.263730 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.263751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.263765 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.302177 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.321700 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.324685 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.341968 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.356872 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.366491 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.366567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.366595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.366630 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.366657 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.377797 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c
45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.390548 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.407137 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 
2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.432071 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.448958 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.469774 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.469847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.469873 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.469905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.469928 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.472137 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.488290 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.507834 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.526022 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.543509 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.557866 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.570380 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc 
kubenswrapper[4756]: I1205 16:16:35.572067 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.572097 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.572105 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.572120 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.572130 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.584457 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.596496 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:35Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.674457 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.674488 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.674497 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.674515 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.674526 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.776882 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.776945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.776959 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.776987 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.777013 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.880910 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.880988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.881010 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.881040 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.881059 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.984106 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.984197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.984230 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.984259 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:35 crc kubenswrapper[4756]: I1205 16:16:35.984278 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:35Z","lastTransitionTime":"2025-12-05T16:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.049687 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.049755 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:36 crc kubenswrapper[4756]: E1205 16:16:36.049842 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.050094 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:36 crc kubenswrapper[4756]: E1205 16:16:36.050251 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.050289 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:36 crc kubenswrapper[4756]: E1205 16:16:36.051053 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:36 crc kubenswrapper[4756]: E1205 16:16:36.051115 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.086643 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.086681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.086698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.086717 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.086734 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.189381 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.189487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.189511 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.189539 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.189560 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.291942 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.292020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.292059 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.292089 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.292111 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.395107 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.395170 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.395220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.395243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.395260 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.497701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.497757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.497770 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.497788 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.497800 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.600755 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.600797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.600806 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.600822 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.600834 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.703759 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.703815 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.703831 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.703852 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.703868 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.806394 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.806444 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.806476 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.806493 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.806504 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.909008 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.909098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.909112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.909695 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:36 crc kubenswrapper[4756]: I1205 16:16:36.909728 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:36Z","lastTransitionTime":"2025-12-05T16:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.012804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.012872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.012887 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.012912 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.012927 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.117407 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.117515 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.117540 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.117570 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.117661 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.221004 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.221104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.221126 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.221159 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.221183 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.324704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.324758 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.324771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.324792 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.324807 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.427673 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.427729 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.427745 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.427768 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.427786 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.530823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.530884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.530900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.530925 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.530942 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.634318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.634368 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.634388 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.634410 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.634475 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.737047 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.737118 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.737142 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.737171 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.737191 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.839998 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.840066 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.840088 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.840116 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.840137 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.942895 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.942928 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.942939 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.942955 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:37 crc kubenswrapper[4756]: I1205 16:16:37.942966 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:37Z","lastTransitionTime":"2025-12-05T16:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.045902 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.045973 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.045991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.046014 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.046031 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.049342 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.049379 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.049542 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:38 crc kubenswrapper[4756]: E1205 16:16:38.049736 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.049786 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:38 crc kubenswrapper[4756]: E1205 16:16:38.049906 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:38 crc kubenswrapper[4756]: E1205 16:16:38.050184 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:38 crc kubenswrapper[4756]: E1205 16:16:38.050233 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.148511 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.148584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.148609 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.148637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.148660 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.251828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.251893 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.251910 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.251938 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.251956 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.354958 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.355026 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.355040 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.355058 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.355069 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.458141 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.458182 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.458194 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.458209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.458220 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.561175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.561240 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.561252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.561269 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.561281 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.664257 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.664298 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.664309 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.664330 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.664342 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.766844 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.766893 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.766903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.766921 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.766932 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.870048 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.870110 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.870125 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.870147 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.870161 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.973886 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.973945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.973967 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.973991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:38 crc kubenswrapper[4756]: I1205 16:16:38.974009 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:38Z","lastTransitionTime":"2025-12-05T16:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.076839 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.076878 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.076886 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.076900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.076909 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.180984 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.181049 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.181104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.181130 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.181148 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.285544 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.285903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.285985 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.286070 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.286117 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.389512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.389575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.389592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.389617 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.389634 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.495268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.495309 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.495318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.495334 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.495345 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.598869 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.598987 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.599010 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.599060 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.599082 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.702469 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.702541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.702562 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.702590 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.702612 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.805998 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.806079 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.806104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.806186 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.806279 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.910538 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.910606 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.910619 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.910636 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:39 crc kubenswrapper[4756]: I1205 16:16:39.910650 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:39Z","lastTransitionTime":"2025-12-05T16:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.014083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.015204 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.015297 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.015390 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.015518 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.048708 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.048766 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:40 crc kubenswrapper[4756]: E1205 16:16:40.048892 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.049214 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:40 crc kubenswrapper[4756]: E1205 16:16:40.049265 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:40 crc kubenswrapper[4756]: E1205 16:16:40.049627 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.049783 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:40 crc kubenswrapper[4756]: E1205 16:16:40.050131 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.072645 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.092115 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.117735 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497
e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.118917 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.118960 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.118974 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.118993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.119008 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.130560 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.151883 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.166370 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc 
kubenswrapper[4756]: I1205 16:16:40.181326 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.194188 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.210196 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.224094 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.224176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.224199 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.224229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.224251 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.226575 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.240684 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.257993 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.276571 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.287925 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.298576 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 
16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.319705 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.327384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.327641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.327703 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.327762 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.327827 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.333516 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.346154 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:40Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.430139 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.430455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.430530 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.430643 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.430731 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.535533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.535595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.535612 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.535637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.535653 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.638345 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.638698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.638776 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.638868 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.638969 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.741939 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.742229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.742366 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.742488 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.742569 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.845910 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.846012 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.846091 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.846127 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.846183 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.950847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.950936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.950952 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.950973 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:40 crc kubenswrapper[4756]: I1205 16:16:40.950992 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:40Z","lastTransitionTime":"2025-12-05T16:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.054279 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.054331 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.054340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.054355 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.054367 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.156100 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.156149 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.156162 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.156179 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.156189 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.259371 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.259903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.260028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.260146 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.260236 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.364081 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.364652 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.364810 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.364967 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.365139 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.468381 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.468477 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.468495 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.468517 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.468535 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.572272 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.572345 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.572364 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.572392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.572413 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.676278 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.676360 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.676380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.676411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.676484 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.779597 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.779656 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.779674 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.779697 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.779717 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.882475 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.882528 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.882545 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.882567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.882583 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.984903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.985085 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.985168 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.985269 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:41 crc kubenswrapper[4756]: I1205 16:16:41.985364 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:41Z","lastTransitionTime":"2025-12-05T16:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.048859 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.049121 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.048942 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.049355 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.048860 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.049627 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.048970 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.049887 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.088158 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.088196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.088206 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.088220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.088231 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.146109 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.146180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.146203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.146230 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.146251 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.162305 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:42Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.166516 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.166576 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.166599 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.166623 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.166643 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.181133 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:42Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.184356 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.184388 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.184396 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.184409 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.184435 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.196538 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:42Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.200553 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.200608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
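Every retry above fails identically: the kubelet's status PATCH is intercepted by the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, months before the node's current clock of 2025-12-05T16:16:42Z. A minimal Go sketch for confirming the reported validity window from the node itself; only the address and port are taken from the log, everything else is illustrative and not part of any cluster tooling:

// probewebhookcert.go: dial the webhook endpoint named in the log and print
// the serving certificate's validity window. InsecureSkipVerify lets us
// inspect an expired certificate instead of failing the handshake the way
// the kubelet's client does.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook: %v", err)
	}
	defer conn.Close()

	// The first peer certificate is the webhook's serving certificate.
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate expired, matching the x509 error in the log")
	}
}

A common cause in CRC is starting a VM whose embedded certificates aged out while it was powered off, so the rotation that would normally renew them never ran.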
event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.200632 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.200655 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.200675 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.213573 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:42Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.217391 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.217452 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.217466 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.217484 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.217496 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.230593 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:42Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:42 crc kubenswrapper[4756]: E1205 16:16:42.231069 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.233252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
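The "update node status exceeds retry count" line is the kubelet abandoning this round after its fixed retry budget (nodeStatusUpdateRetry, five attempts in the upstream kubelet, which matches the five failed patches above). Independently of the webhook failure, the Ready=False condition comes from the CNI check named in the repeated NetworkReady message. A small Go sketch of what such a check amounts to; the directory path is copied from the log, the file-extension filter mirrors what CNI-based runtimes conventionally accept, and the rest is illustrative:

// checkcniconfig.go: look for any network configuration under the directory
// named in the "no CNI configuration file" message. An empty result is the
// state the kubelet is reporting: the network plugin has not yet written
// its config, so the runtime network stays NotReady.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	const confDir = "/etc/kubernetes/cni/net.d" // directory from the kubelet log
	entries, err := os.ReadDir(confDir)
	if err != nil {
		log.Fatalf("read %s: %v", confDir, err)
	}
	var found []string
	for _, e := range entries {
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json": // extensions CNI config loaders accept
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Println("no CNI configuration files found; network plugin not started")
		return
	}
	fmt.Println("CNI configs:", found)
}

Here the two symptoms chain together: the network operator cannot come up while the control plane rejects writes through the expired webhook, so no CNI config is ever written and the node stays NotReady.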
event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.233376 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.233501 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.233587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.233671 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.336924 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.336995 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.337017 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.337045 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.337068 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.440148 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.440181 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.440189 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.440203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.440212 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.542622 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.542665 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.542677 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.542694 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.542707 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.645902 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.646162 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.646251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.646343 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.646452 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.749369 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.749460 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.749478 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.749503 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.749521 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.852539 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.852596 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.852607 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.852632 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.852647 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.955780 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.955832 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.955847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.955866 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:42 crc kubenswrapper[4756]: I1205 16:16:42.955880 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:42Z","lastTransitionTime":"2025-12-05T16:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.058862 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.058943 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.058956 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.058974 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.058986 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.161946 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.162005 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.162014 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.162033 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.162048 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.266667 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.266737 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.266754 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.266818 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.266837 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.369537 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.369600 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.369617 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.369642 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.369660 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.471383 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.471446 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.471458 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.471477 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.471489 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.574389 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.574467 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.574483 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.574508 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.574523 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.677970 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.678033 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.678046 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.678069 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.678085 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.780976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.781053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.781069 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.781103 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.781121 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.883479 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.883526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.883538 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.883554 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.883565 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.985587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.985630 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.985639 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.985653 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:43 crc kubenswrapper[4756]: I1205 16:16:43.985663 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:43Z","lastTransitionTime":"2025-12-05T16:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.049525 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.049592 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.049636 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.049568 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:44 crc kubenswrapper[4756]: E1205 16:16:44.049758 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:44 crc kubenswrapper[4756]: E1205 16:16:44.049857 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:44 crc kubenswrapper[4756]: E1205 16:16:44.050020 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:44 crc kubenswrapper[4756]: E1205 16:16:44.050102 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.088196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.088240 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.088254 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.088268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.088277 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.190881 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.190923 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.190931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.190945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.190953 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.293857 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.293900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.293912 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.293931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.293944 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.396510 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.396586 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.396605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.396630 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.396647 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.500112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.500165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.500179 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.500204 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.500216 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.603093 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.603134 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.603149 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.603172 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.603184 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.707200 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.707268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.707286 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.707312 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.707328 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.810026 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.810108 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.810128 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.810160 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.810184 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.913176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.913240 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.913251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.913268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:44 crc kubenswrapper[4756]: I1205 16:16:44.913277 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:44Z","lastTransitionTime":"2025-12-05T16:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.016350 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.016459 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.016499 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.016530 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.016551 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.118930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.118993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.119003 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.119028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.119041 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.222134 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.222180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.222189 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.222205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.222215 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.324497 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.324563 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.324573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.324593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.324608 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.427001 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.427048 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.427060 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.427078 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.427091 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.529279 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.529321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.529333 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.529353 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.529364 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.631615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.631672 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.631681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.631705 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.631716 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.734522 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.734559 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.734567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.734582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.734591 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.836513 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.836545 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.836555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.836569 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.836578 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.939354 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.939399 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.939432 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.939455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:45 crc kubenswrapper[4756]: I1205 16:16:45.939468 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:45Z","lastTransitionTime":"2025-12-05T16:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.042579 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.042647 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.042660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.042683 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.042702 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.048987 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.049000 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.049046 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.049078 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:46 crc kubenswrapper[4756]: E1205 16:16:46.049116 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:46 crc kubenswrapper[4756]: E1205 16:16:46.049248 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:46 crc kubenswrapper[4756]: E1205 16:16:46.049342 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:46 crc kubenswrapper[4756]: E1205 16:16:46.049432 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.146222 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.146262 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.146270 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.146285 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.146298 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.249321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.249384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.249401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.249450 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.249465 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.351504 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.351554 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.351568 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.351588 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.351599 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.455117 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.455182 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.455204 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.455241 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.455277 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.557854 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.557899 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.557913 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.557931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.557944 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.660836 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.660911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.660926 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.660950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.660965 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.763656 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.763700 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.763709 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.763725 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.763744 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.866371 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.866439 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.866455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.866471 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.866482 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.973119 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.973197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.973216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.973239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:46 crc kubenswrapper[4756]: I1205 16:16:46.973261 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:46Z","lastTransitionTime":"2025-12-05T16:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.076614 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.076654 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.076668 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.076688 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.076703 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.180180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.180229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.180246 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.180268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.180285 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.283951 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.283987 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.283997 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.284016 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.284030 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.386549 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.386601 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.386614 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.386631 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.386645 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.489903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.489935 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.489943 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.489956 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.489964 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.591839 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.591914 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.591926 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.591987 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.592004 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.694480 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.694787 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.694883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.694988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.695065 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.797534 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.797819 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.797900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.797986 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.798071 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.901254 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.901311 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.901325 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.901346 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:47 crc kubenswrapper[4756]: I1205 16:16:47.901361 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:47Z","lastTransitionTime":"2025-12-05T16:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.004503 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.004549 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.004562 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.004579 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.004593 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.049771 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.049830 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.049834 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.049805 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:16:48 crc kubenswrapper[4756]: E1205 16:16:48.049973 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:16:48 crc kubenswrapper[4756]: E1205 16:16:48.050114 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:16:48 crc kubenswrapper[4756]: E1205 16:16:48.050238 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:16:48 crc kubenswrapper[4756]: E1205 16:16:48.050288 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc"
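
The four "Error syncing pod" entries above all fail for the same reason the node reports NotReady: the runtime finds nothing to load in /etc/kubernetes/cni/net.d/, so every sandbox create is skipped until the network plugin writes its config there. A minimal triage sketch, assuming it is run directly on the node and that the runtime accepts the usual .conf/.conflist/.json extensions (file names here are illustrative, not taken from this log):

// cnicheck.go - triage sketch: list CNI config candidates in the
// directory named in the kubelet errors above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // path taken verbatim from the log
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Fprintf(os.Stderr, "cannot read %s: %v\n", confDir, err)
		os.Exit(1)
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni-style loaders accept (assumed)
			fmt.Println("CNI config present:", filepath.Join(confDir, e.Name()))
			found++
		}
	}
	if found == 0 {
		// Exactly the state the kubelet keeps reporting:
		// NetworkReady=false until the network plugin drops its config.
		fmt.Println("no CNI configuration file found; NetworkPluginNotReady persists")
	}
}

On a healthy OVN-Kubernetes node this directory would typically hold a single config dropped in by the ovnkube-node pod; its absence here is consistent with the ovnkube-controller crash loop recorded further down.
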
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.108208 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.108557 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.108646 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.108727 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.108815 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.211676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.211944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.212019 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.212079 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.212455 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.314962 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.314993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.315005 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.315023 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.315033 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.418102 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.418165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.418175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.418197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.418210 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.520912 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.520958 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.520967 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.520987 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.520996 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.623742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.623808 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.623821 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.623894 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.623913 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.726586 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.726647 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.726674 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.726698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.726713 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.829465 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.829522 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.829532 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.829549 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.829559 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.933113 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.933186 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.933197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.933219 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:48 crc kubenswrapper[4756]: I1205 16:16:48.933240 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:48Z","lastTransitionTime":"2025-12-05T16:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.036944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.037012 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.037035 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.037064 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.037087 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.049839 4756 scope.go:117] "RemoveContainer" containerID="024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609"
Dec 05 16:16:49 crc kubenswrapper[4756]: E1205 16:16:49.050113 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001"
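
The "back-off 20s" above is the kubelet's crash-loop delay for ovnkube-controller, not a new failure: each restart of a crashing container roughly doubles the wait from a 10s base up to a 5m cap (the commonly cited kubelet defaults; assumed here, not taken from this log). A rough sketch of that schedule, not the kubelet's actual implementation:

// backoff.go - illustrative doubling schedule behind
// "back-off 20s restarting failed container".
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 10 * time.Second       // assumed initial CrashLoopBackOff delay
	const maxDelay = 5 * time.Minute // assumed maximum delay
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: wait %v\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Under those assumptions a second restart waits 20s, matching the entry above; until the container stays up, the CNI config it would write never appears and the NotReady flood continues.
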
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.140337 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.140404 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.140447 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.140471 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.140488 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.242502 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.242541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.242552 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.242581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.242596 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.345449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.345492 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.345504 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.345518 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.345528 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.452126 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.452185 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.452208 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.452230 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.452243 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.554401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.554465 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.554476 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.554490 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.554498 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.656822 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.656885 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.656909 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.656941 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.656963 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.759606 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.759673 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.759689 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.760234 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.760300 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.863554 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.863605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.863618 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.863637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.863650 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.966214 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.966283 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.966299 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.966323 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:49 crc kubenswrapper[4756]: I1205 16:16:49.966342 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:49Z","lastTransitionTime":"2025-12-05T16:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.049611 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.049635 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.049739 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.049820 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:50 crc kubenswrapper[4756]: E1205 16:16:50.050305 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:50 crc kubenswrapper[4756]: E1205 16:16:50.050474 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:50 crc kubenswrapper[4756]: E1205 16:16:50.050563 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:50 crc kubenswrapper[4756]: E1205 16:16:50.050784 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.061404 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.068952 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.068997 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.069010 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.069028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.069039 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.082756 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath
\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.091991 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.102087 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 
16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.122198 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.135258 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.148973 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.161929 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.179174 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.196125 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.196189 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.196202 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.196226 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.196239 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.226334 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.240544 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.254759 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.265384 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.280331 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.297555 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.298828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.298953 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.299026 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.299105 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.299168 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.311138 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.324076 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.334963 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:50Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.400928 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.403906 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.404039 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.404134 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.404244 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.507229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.507281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.507294 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.507311 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.507324 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.587183 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:50 crc kubenswrapper[4756]: E1205 16:16:50.587357 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:50 crc kubenswrapper[4756]: E1205 16:16:50.587479 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs podName:7a3ee468-140c-408e-80a1-aa8fe3fc0acc nodeName:}" failed. No retries permitted until 2025-12-05 16:17:22.587402456 +0000 UTC m=+102.826076008 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs") pod "network-metrics-daemon-8lsn9" (UID: "7a3ee468-140c-408e-80a1-aa8fe3fc0acc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.609520 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.609575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.609591 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.609614 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.609632 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.712056 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.712097 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.712109 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.712126 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.712139 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.815284 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.815321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.815330 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.815348 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.815359 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.918508 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.918555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.918567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.918585 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:50 crc kubenswrapper[4756]: I1205 16:16:50.918598 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:50Z","lastTransitionTime":"2025-12-05T16:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.021166 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.021209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.021220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.021238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.021250 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.124928 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.124975 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.124987 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.125004 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.125014 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.227104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.227149 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.227161 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.227178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.227189 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.330249 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.330287 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.330296 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.330310 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.330320 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.433117 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.433155 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.433165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.433181 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.433194 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.536146 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.536196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.536209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.536228 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.536241 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.542915 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/0.log" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.542966 4756 generic.go:334] "Generic (PLEG): container finished" podID="fb26e42b-df49-418d-8be1-8a0704cd5dfd" containerID="91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f" exitCode=1 Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.542996 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcqf4" event={"ID":"fb26e42b-df49-418d-8be1-8a0704cd5dfd","Type":"ContainerDied","Data":"91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.543355 4756 scope.go:117] "RemoveContainer" containerID="91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.557901 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.574566 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.588017 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.602058 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"2025-12-05T16:16:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899\\\\n2025-12-05T16:16:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899 to /host/opt/cni/bin/\\\\n2025-12-05T16:16:06Z [verbose] multus-daemon started\\\\n2025-12-05T16:16:06Z [verbose] Readiness Indicator file check\\\\n2025-12-05T16:16:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.614048 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.628124 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.638163 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.638203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.638216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.638234 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.638247 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.641862 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.652663 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.665485 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.679606 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.692119 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.708610 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.720358 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.732109 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 
16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.740635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.740669 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.740678 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.740696 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.740708 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.749580 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.761173 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.780506 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497
e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.797433 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:51Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.842991 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.843060 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.843071 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.843087 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.843097 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.951752 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.951794 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.951805 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.951823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:51 crc kubenswrapper[4756]: I1205 16:16:51.951833 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:51Z","lastTransitionTime":"2025-12-05T16:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.048808 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.048916 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.048853 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.048853 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.049090 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.049207 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.049374 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.049529 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.053812 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.053856 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.053869 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.053891 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.053903 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.157072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.157126 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.157138 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.157156 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.157168 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.259322 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.259369 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.259381 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.259396 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.259407 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.362384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.362500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.362517 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.362783 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.362815 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.465519 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.465557 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.465569 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.465585 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.465596 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.544165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.544195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.544203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.544216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.544225 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.547071 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/0.log" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.547119 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcqf4" event={"ID":"fb26e42b-df49-418d-8be1-8a0704cd5dfd","Type":"ContainerStarted","Data":"51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.560186 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"
imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.562061 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.566296 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.566338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.566350 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.566616 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.566636 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.574862 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.578809 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.583932 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.583962 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.583973 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.583989 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.583998 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.588248 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.596050 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.599190 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.599229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.599243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.599266 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.599281 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.603200 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.615082 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.614969 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.618644 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.618679 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.618690 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.618705 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.618717 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.625009 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.630872 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: E1205 16:16:52.631018 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.632726 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.632750 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.632761 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.632777 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.632788 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.645454 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.658498 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.679463 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497
e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.693392 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.708287 4756 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.723444 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.736676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.736721 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.736733 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.736751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.736762 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.740369 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.753792 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"2025-12-05T16:16:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899\\\\n2025-12-05T16:16:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899 to /host/opt/cni/bin/\\\\n2025-12-05T16:16:06Z [verbose] multus-daemon started\\\\n2025-12-05T16:16:06Z [verbose] Readiness Indicator file check\\\\n2025-12-05T16:16:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.769055 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.782438 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.792216 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.804789 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:16:52Z is after 2025-08-24T17:21:41Z" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.839122 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.839165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.839177 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.839195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.839207 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.941546 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.941594 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.941607 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.941627 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:52 crc kubenswrapper[4756]: I1205 16:16:52.941639 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:52Z","lastTransitionTime":"2025-12-05T16:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.043542 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.043590 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.043601 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.043619 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.043632 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.145710 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.145786 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.145797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.145832 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.145842 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.247856 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.247889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.247897 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.247909 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.247939 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.351164 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.351208 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.351220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.351237 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.351250 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.453543 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.453586 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.453624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.453641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.453673 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.556124 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.556180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.556195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.556216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.556230 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.658797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.658866 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.658884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.658908 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.658927 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.761198 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.761244 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.761260 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.761280 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.761296 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.864251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.864308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.864324 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.864344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.864357 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.966852 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.966911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.966927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.966953 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:53 crc kubenswrapper[4756]: I1205 16:16:53.966977 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:53Z","lastTransitionTime":"2025-12-05T16:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.049648 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.049748 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.049771 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:54 crc kubenswrapper[4756]: E1205 16:16:54.049969 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.050071 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:54 crc kubenswrapper[4756]: E1205 16:16:54.050272 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:54 crc kubenswrapper[4756]: E1205 16:16:54.050332 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:54 crc kubenswrapper[4756]: E1205 16:16:54.050464 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.068930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.068973 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.068984 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.069000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.069012 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.171165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.171225 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.171244 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.171267 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.171283 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.274514 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.274584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.274608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.274638 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.274659 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.377340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.377444 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.377460 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.377478 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.377489 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.480171 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.480215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.480225 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.480243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.480254 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.582300 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.582334 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.582341 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.582355 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.582363 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.684776 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.684813 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.684822 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.684835 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.684845 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.787457 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.787495 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.787523 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.787537 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.787548 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.890054 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.890084 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.890094 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.890109 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.890120 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.993205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.993252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.993264 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.993281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:54 crc kubenswrapper[4756]: I1205 16:16:54.993293 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:54Z","lastTransitionTime":"2025-12-05T16:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.095330 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.095380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.095392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.095437 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.095449 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.197659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.197696 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.197708 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.197723 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.197735 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.300804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.300840 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.300850 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.300867 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.300879 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.403027 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.403072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.403083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.403099 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.403111 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.504925 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.504968 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.504979 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.504993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.505004 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.607962 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.608014 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.608026 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.608044 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.608057 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.710645 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.710679 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.710690 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.710704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.710714 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.812991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.813023 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.813032 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.813045 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.813055 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.915526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.915564 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.915573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.915589 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:55 crc kubenswrapper[4756]: I1205 16:16:55.915599 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:55Z","lastTransitionTime":"2025-12-05T16:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.018392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.018464 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.018476 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.018492 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.018504 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.049478 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:56 crc kubenswrapper[4756]: E1205 16:16:56.049616 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.049688 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.049764 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.049689 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:56 crc kubenswrapper[4756]: E1205 16:16:56.049914 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:56 crc kubenswrapper[4756]: E1205 16:16:56.049831 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:56 crc kubenswrapper[4756]: E1205 16:16:56.050082 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.120733 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.120801 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.120820 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.120842 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.120856 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.222802 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.222843 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.222854 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.222870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.222882 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.326798 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.326856 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.326872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.326895 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.326911 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.429923 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.430004 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.430016 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.430031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.430042 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.532910 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.532955 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.532966 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.532989 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.533002 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.635429 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.635474 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.635484 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.635502 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.635514 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.737531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.737580 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.737592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.737610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.737622 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.840213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.840249 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.840258 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.840272 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.840282 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.943098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.943145 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.943156 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.943173 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:56 crc kubenswrapper[4756]: I1205 16:16:56.943187 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:56Z","lastTransitionTime":"2025-12-05T16:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.045561 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.045586 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.045593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.045605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.045615 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.149533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.149772 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.149783 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.149801 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.149813 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.253512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.253568 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.253584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.253610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.253630 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.355845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.355891 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.355902 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.355919 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.355928 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.459028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.459094 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.459112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.459136 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.459153 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.561251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.561286 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.561298 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.561332 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.561345 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.663933 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.663972 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.663991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.664007 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.664018 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.766803 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.766867 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.766894 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.766939 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.766962 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.870029 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.870101 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.870123 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.870151 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.870173 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.973627 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.973667 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.973676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.973692 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:57 crc kubenswrapper[4756]: I1205 16:16:57.973701 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:57Z","lastTransitionTime":"2025-12-05T16:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.053710 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.053729 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:16:58 crc kubenswrapper[4756]: E1205 16:16:58.053854 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.053923 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.053951 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:16:58 crc kubenswrapper[4756]: E1205 16:16:58.054043 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:16:58 crc kubenswrapper[4756]: E1205 16:16:58.054094 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:16:58 crc kubenswrapper[4756]: E1205 16:16:58.054164 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.075795 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.075879 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.075890 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.075905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.075917 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
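
The "Error syncing pod, skipping" entries are downstream of that same condition: with no sandbox on disk ("No sandbox for pod can be found. Need to start a new one") the kubelet would have to create one, and it refuses while the runtime reports the network plugin not ready. The readiness test being relayed boils down to "is there at least one CNI network config in the configured directory"; a stdlib-only Go approximation (the real check lives in libcni/CRI-O, and the extension list here is an assumption):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig approximates the check behind "no CNI configuration file
// in /etc/kubernetes/cni/net.d/": the runtime stays NetworkReady=false
// until at least one network config shows up in the directory.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // assumed set of accepted extensions
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	fmt.Printf("network config present: %v (err=%v)\n", ok, err)
}
```
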
Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.179001 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.179079 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.179104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.179161 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.179205 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.282901 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.282957 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.282974 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.282996 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.283013 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.386978 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.387034 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.387051 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.387074 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.387090 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.490493 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.490539 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.490554 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.490573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.490586 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.593234 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.593360 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.593391 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.593502 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.593530 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.696039 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.696143 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.696158 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.696177 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.696190 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.799216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.799307 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.799345 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.799377 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.799399 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.901687 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.901722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.901730 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.901742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:58 crc kubenswrapper[4756]: I1205 16:16:58.901752 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:58Z","lastTransitionTime":"2025-12-05T16:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.004599 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.004895 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.004953 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.005018 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.005081 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.107300 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.107335 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.107357 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.107372 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.107382 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.209532 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.209575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.209587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.209608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.209619 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.312588 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.312678 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.312696 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.312727 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.312746 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.416007 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.416332 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.416432 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.416517 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.416608 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.518944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.519021 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.519039 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.519071 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.519089 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.623199 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.623256 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.623278 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.623376 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.623444 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.726360 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.727020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.727130 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.727210 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.727288 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.830403 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.830461 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.830472 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.830512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.830523 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.933896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.933950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.933965 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.933986 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:16:59 crc kubenswrapper[4756]: I1205 16:16:59.934003 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:16:59Z","lastTransitionTime":"2025-12-05T16:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.036494 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.036557 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.036575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.036599 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.036616 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.049611 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.049646 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.049691 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:00 crc kubenswrapper[4756]: E1205 16:17:00.050125 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:00 crc kubenswrapper[4756]: E1205 16:17:00.049958 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.049752 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:00 crc kubenswrapper[4756]: E1205 16:17:00.050293 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:00 crc kubenswrapper[4756]: E1205 16:17:00.050464 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.069508 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.087534 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"2025-12-05T16:16:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899\\\\n2025-12-05T16:16:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899 to /host/opt/cni/bin/\\\\n2025-12-05T16:16:06Z [verbose] multus-daemon started\\\\n2025-12-05T16:16:06Z [verbose] Readiness Indicator file check\\\\n2025-12-05T16:16:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.102462 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with 
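
The kube-multus termination message embedded in the patch above explains its restartCount of 1: the daemon copies its CNI binaries, starts, then waits for a readiness-indicator file, the default network's config at /host/run/multus/cni/net.d/10-ovn-kubernetes.conf, and exits when the poll times out ("pollimmediate error: timed out waiting for the condition"). A plain-stdlib sketch of that wait; multus itself uses the apimachinery poll helpers, and the 45 s budget below is only inferred from the 16:16:06 to 16:16:51 gap in the excerpt:

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

// waitForFile polls until path exists or the timeout elapses, standing in
// for the readiness-indicator wait shown in the multus excerpt above.
func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		if time.Now().After(deadline) {
			return errors.New("timed out waiting for the condition")
		}
		time.Sleep(interval)
	}
}

func main() {
	err := waitForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
		time.Second, 45*time.Second)
	fmt.Println(err) // nil once ovn-kubernetes writes its config
}
```
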
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.122677 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.140224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.140523 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.140562 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.140589 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.140613 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.144821 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.161626 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.182365 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.200201 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.218566 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.236368 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.242553 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.242585 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc 
kubenswrapper[4756]: I1205 16:17:00.242598 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.242613 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.242624 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.248528 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.262463 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.294600 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\
"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.310827 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.327049 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.345123 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.345176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.345192 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.345216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.345042 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.345235 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.361686 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.389549 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:00Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.449053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.449092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.449104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.449120 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.449132 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.552135 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.552190 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.552209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.552243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.552260 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.654129 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.654157 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.654167 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.654181 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.654190 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.756972 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.757015 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.757029 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.757045 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.757055 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.859901 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.859937 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.859948 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.859965 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.859977 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.963303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.963370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.963388 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.963411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:00 crc kubenswrapper[4756]: I1205 16:17:00.963472 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:00Z","lastTransitionTime":"2025-12-05T16:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.049987 4756 scope.go:117] "RemoveContainer" containerID="024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.067495 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.067561 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.067579 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.067602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.067619 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.171154 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.171261 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.171274 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.171298 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.171317 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.274257 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.274315 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.274333 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.274359 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.274377 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.376386 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.376444 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.376454 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.376467 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.376475 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.480401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.480481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.480500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.480524 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.480543 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.577293 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/2.log" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.580764 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.581304 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.582485 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.582547 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.582561 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.582588 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.582604 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.593751 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.605737 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 
16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.627737 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.641847 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.657319 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.670002 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.685827 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.685919 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.685945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.685979 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.686005 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.696600 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c
45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.718640 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.734307 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.758247 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://678012ecd6a1f851dd87783e9c2fc348817b4155
7fcb46a8cc4a8acc313bf09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.770640 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.784581 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.789173 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.789202 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.789212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.789227 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.789238 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.801256 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.812989 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.824826 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"2025-12-05T16:16:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899\\\\n2025-12-05T16:16:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899 to /host/opt/cni/bin/\\\\n2025-12-05T16:16:06Z [verbose] multus-daemon started\\\\n2025-12-05T16:16:06Z [verbose] Readiness Indicator file check\\\\n2025-12-05T16:16:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.836211 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.849374 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.862457 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:01Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.892299 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.892363 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.892374 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.892393 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.892409 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.908034 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.908192 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908272 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:05.90824354 +0000 UTC m=+146.146917052 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908333 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.908342 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.908390 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908355 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908475 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908500 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908530 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.908438 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908533 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:18:05.908515818 +0000 UTC m=+146.147189330 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908409 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.908666 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 16:18:05.908657382 +0000 UTC m=+146.147330894 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.909023 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.909051 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.909164 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:18:05.909011621 +0000 UTC m=+146.147685143 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:17:01 crc kubenswrapper[4756]: E1205 16:17:01.909195 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 16:18:05.909181996 +0000 UTC m=+146.147855508 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.994891 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.994922 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.994933 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.994950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:01 crc kubenswrapper[4756]: I1205 16:17:01.994962 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:01Z","lastTransitionTime":"2025-12-05T16:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.049331 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.049390 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:02 crc kubenswrapper[4756]: E1205 16:17:02.049595 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.049640 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:02 crc kubenswrapper[4756]: E1205 16:17:02.049827 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.049856 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:02 crc kubenswrapper[4756]: E1205 16:17:02.049905 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:02 crc kubenswrapper[4756]: E1205 16:17:02.049974 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.097911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.097951 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.097961 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.097980 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.097991 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.200915 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.200966 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.200982 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.201005 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.201022 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.304030 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.304070 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.304079 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.304094 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.304104 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.406924 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.406979 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.406995 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.407017 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.407035 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.510281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.510321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.510330 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.510344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.510355 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.586451 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/3.log" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.587059 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/2.log" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.589857 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c" exitCode=1 Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.589898 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.589931 4756 scope.go:117] "RemoveContainer" containerID="024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.590553 4756 scope.go:117] "RemoveContainer" containerID="678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c" Dec 05 16:17:02 crc kubenswrapper[4756]: E1205 16:17:02.590698 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.602631 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.612886 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.612932 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.612943 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.612959 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.612971 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.619056 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.633471 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.646646 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.660070 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"2025-12-05T16:16:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899\\\\n2025-12-05T16:16:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899 to /host/opt/cni/bin/\\\\n2025-12-05T16:16:06Z [verbose] multus-daemon started\\\\n2025-12-05T16:16:06Z [verbose] Readiness Indicator file check\\\\n2025-12-05T16:16:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.672016 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.685003 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.700128 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.713205 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.715644 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.715679 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 
16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.715691 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.715707 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.715718 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.724815 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
25-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.757206 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.773853 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.788178 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.801136 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.818529 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.818573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.818584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.818608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.818620 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.820825 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c
45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.839035 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.853621 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.871784 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://678012ecd6a1f851dd87783e9c2fc348817b4155
7fcb46a8cc4a8acc313bf09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ed62a186df5496a865626329e59b509c25497e35b7bb0616b736ec29ed609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:33Z\\\",\\\"message\\\":\\\"utations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 16:16:32.939726 6430 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1205 16:16:32.939733 6430 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:16:32.939738 6430 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.487028ms\\\\nI1205 16:16:32.939720 6430 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-scheduler/scheduler]} name:Service_openshift-kube-scheduler/scheduler_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1205 16:16:32.939842 6430 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\" load balancers\\\\nI1205 16:17:01.979264 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:17:01.978781 6827 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-rr8pw after 0 failed attempt(s)\\\\nI1205 16:17:01.979315 6827 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-rr8pw\\\\nI1205 16:17:01.979025 6827 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1205 16:17:01.978711 6827 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1205 16:17:01.979336 6827 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nF1205 16:17:01.979337 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.921243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.921288 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.921300 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.921318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.921331 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.952565 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.952613 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.952625 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.952641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.952650 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: E1205 16:17:02.969302 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.974550 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.974593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.974608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.974628 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.974642 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:02 crc kubenswrapper[4756]: E1205 16:17:02.991374 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:02Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.996368 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.996605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.996749 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.996964 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:02 crc kubenswrapper[4756]: I1205 16:17:02.997119 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:02Z","lastTransitionTime":"2025-12-05T16:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: E1205 16:17:03.013658 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.017558 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.017686 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.017784 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.017900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.017993 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: E1205 16:17:03.031349 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.036797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.036831 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.036840 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.036855 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.036864 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: E1205 16:17:03.054358 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: E1205 16:17:03.054495 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
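Note: the "Error updating node status, will retry" entries above, closed by "update node status exceeds retry count", show the kubelet exhausting its fixed retry budget (nodeStatusUpdateRetry, five attempts in the upstream kubelet) because every node-status PATCH is rejected by the API server: the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05T16:17:03Z. A minimal probe of that endpoint's certificate window follows; it is a sketch, assuming it runs on the node itself (the endpoint is loopback-only) and that the third-party cryptography package (version 42 or newer for the *_utc properties) is installed. Host and port are copied from the log line; nothing else here is stated by the log.

    import socket
    import ssl
    from datetime import datetime, timezone

    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the error above

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False      # we want to read the certificate, not trust it
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # DER bytes, returned even with CERT_NONE

    cert = x509.load_der_x509_certificate(der)
    now = datetime.now(timezone.utc)
    print("notBefore:", cert.not_valid_before_utc)
    print("notAfter: ", cert.not_valid_after_utc)
    print("EXPIRED" if cert.not_valid_after_utc < now else "still valid")

On this log's evidence the probe would report notAfter 2025-08-24T17:21:41Z and EXPIRED, matching the x509 error; the usual remediation is cluster-side certificate rotation rather than any kubelet-side change.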
event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.056148 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.056158 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.056172 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.056182 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.159340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.159377 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.159385 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.159401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.159425 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.261373 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.261454 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.261468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.261487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.261504 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.364325 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.364375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.364387 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.364409 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.364442 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.466701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.466760 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.466771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.466788 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.466799 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.569345 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.569437 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.569446 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.569462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.569472 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.594614 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/3.log" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.597763 4756 scope.go:117] "RemoveContainer" containerID="678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c" Dec 05 16:17:03 crc kubenswrapper[4756]: E1205 16:17:03.597936 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.611527 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.625238 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.638118 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.652794 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.666849 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.671797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.671844 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc 
kubenswrapper[4756]: I1205 16:17:03.671856 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.671877 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.671890 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.678126 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.690296 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.719588 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\
"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.735193 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.746014 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.758689 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.770671 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.774321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.774356 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.774367 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.774383 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.774393 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.788791 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\" load balancers\\\\nI1205 16:17:01.979264 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:17:01.978781 6827 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-rr8pw after 0 failed attempt(s)\\\\nI1205 16:17:01.979315 6827 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-rr8pw\\\\nI1205 16:17:01.979025 6827 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1205 16:17:01.978711 6827 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1205 16:17:01.979336 6827 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nF1205 16:17:01.979337 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:17:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.803105 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.818132 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"2025-12-05T16:16:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899\\\\n2025-12-05T16:16:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899 to /host/opt/cni/bin/\\\\n2025-12-05T16:16:06Z [verbose] multus-daemon started\\\\n2025-12-05T16:16:06Z [verbose] Readiness Indicator file check\\\\n2025-12-05T16:16:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.830974 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.845184 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.857911 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:03Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.876603 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.876880 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.876963 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.877062 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.877124 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.979989 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.980270 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.980562 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.980790 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:03 crc kubenswrapper[4756]: I1205 16:17:03.980981 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:03Z","lastTransitionTime":"2025-12-05T16:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.049479 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.049574 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:04 crc kubenswrapper[4756]: E1205 16:17:04.049610 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:04 crc kubenswrapper[4756]: E1205 16:17:04.049747 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.049832 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.050031 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:04 crc kubenswrapper[4756]: E1205 16:17:04.050083 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:04 crc kubenswrapper[4756]: E1205 16:17:04.050282 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.084931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.085277 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.085639 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.085904 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.086110 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
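Every KubeletNotReady condition in this stream carries the same message: no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal sketch to verify that directly on the node; the directory path is taken verbatim from the log, while the .conf/.conflist/.json extension list follows the usual CNI convention and is an assumption here:

from pathlib import Path

# Path taken verbatim from the kubelet message above.
cni_dir = Path("/etc/kubernetes/cni/net.d")

configs = []
if cni_dir.is_dir():
    # Conventional CNI config extensions; assumed, not stated in this log.
    configs = sorted(p for p in cni_dir.iterdir()
                     if p.suffix in {".conf", ".conflist", ".json"})

if configs:
    for p in configs:
        print("found CNI config:", p)
else:
    print(f"no CNI configuration files under {cni_dir}; "
          "consistent with the NetworkPluginNotReady errors above")

The directory is normally populated by the cluster's network plugin once it comes up, which is why the same condition keeps repeating below until that config is written.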
Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.190611 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.190675 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.190687 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.190705 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.190717 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.293472 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.293524 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.293539 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.293635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.293653 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.396849 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.396892 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.396908 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.396928 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.396944 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.500172 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.500489 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.500615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.500702 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.500984 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.603532 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.604229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.604295 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.604360 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.604433 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.706642 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.706895 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.706967 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.707106 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.707318 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.809988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.810511 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.810586 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.810662 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.810730 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.913071 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.913109 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.913120 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.913138 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:04 crc kubenswrapper[4756]: I1205 16:17:04.913151 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:04Z","lastTransitionTime":"2025-12-05T16:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.016160 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.016193 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.016203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.016216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.016225 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.119366 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.119397 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.119408 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.119443 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.119452 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.221783 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.221820 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.221829 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.221845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.221856 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.323688 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.323998 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.324201 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.324386 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.324613 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.426981 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.427357 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.427670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.427984 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.428261 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.531605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.531680 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.531745 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.531777 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.531800 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.635039 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.635077 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.635091 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.635110 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.635123 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.737556 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.737603 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.737621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.737648 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.737667 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.840239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.840344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.840364 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.840386 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.840402 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.943043 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.943074 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.943083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.943097 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:05 crc kubenswrapper[4756]: I1205 16:17:05.943106 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:05Z","lastTransitionTime":"2025-12-05T16:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.045700 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.045733 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.045742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.045756 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.045765 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.049611 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.049655 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:06 crc kubenswrapper[4756]: E1205 16:17:06.049730 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.049614 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.049796 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:06 crc kubenswrapper[4756]: E1205 16:17:06.049900 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:06 crc kubenswrapper[4756]: E1205 16:17:06.049983 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:06 crc kubenswrapper[4756]: E1205 16:17:06.050052 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.060427 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.148115 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.148157 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.148174 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.148196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.148211 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.251567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.251610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.251622 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.251641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.251653 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.354120 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.354147 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.354155 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.354167 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.354176 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.456901 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.456981 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.457004 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.457034 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.457056 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.560922 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.560963 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.560975 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.560996 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.561008 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.663322 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.663362 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.663370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.663384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.663391 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.771160 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.771210 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.771224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.771242 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.771255 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.872948 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.872989 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.873000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.873015 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.873026 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.975437 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.975486 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.975517 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.975535 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:06 crc kubenswrapper[4756]: I1205 16:17:06.975548 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:06Z","lastTransitionTime":"2025-12-05T16:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.078253 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.078294 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.078305 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.078322 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.078332 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.181020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.181069 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.181079 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.181096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.181109 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.283855 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.283899 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.283909 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.283924 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.283939 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.386465 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.386534 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.386551 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.386575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.386593 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.488859 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.488914 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.488924 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.488944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.488957 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.591661 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.591716 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.591734 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.591762 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.591781 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.694982 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.695020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.695031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.695048 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.695060 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.798633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.798702 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.798720 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.798742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.798759 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.901853 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.901956 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.901979 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.902013 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:07 crc kubenswrapper[4756]: I1205 16:17:07.902038 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:07Z","lastTransitionTime":"2025-12-05T16:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.005250 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.005310 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.005321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.005339 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.005348 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.049458 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.049479 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.049613 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.049641 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:08 crc kubenswrapper[4756]: E1205 16:17:08.049784 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:08 crc kubenswrapper[4756]: E1205 16:17:08.049956 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:08 crc kubenswrapper[4756]: E1205 16:17:08.050025 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:08 crc kubenswrapper[4756]: E1205 16:17:08.050159 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.108800 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.108853 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.108868 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.108883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.108893 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.212188 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.212222 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.212233 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.212249 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.212261 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.315026 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.315066 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.315076 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.315089 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.315098 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.418321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.418373 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.418389 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.418445 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.418462 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.520727 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.520768 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.520777 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.520795 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.520807 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.623292 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.623357 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.623374 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.623398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.623443 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.725557 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.725599 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.725608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.725623 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.725634 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.827308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.827364 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.827375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.827393 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.827405 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.929754 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.929813 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.929822 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.929844 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:08 crc kubenswrapper[4756]: I1205 16:17:08.929857 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:08Z","lastTransitionTime":"2025-12-05T16:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.032313 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.032374 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.032383 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.032398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.032408 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.135151 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.135189 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.135199 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.135215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.135225 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.238305 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.238340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.238348 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.238362 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.238371 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.340765 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.340828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.340837 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.340853 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.340880 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.448141 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.448216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.448244 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.448272 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.448289 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.551252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.551342 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.551365 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.551398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.551456 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.653717 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.653782 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.653797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.653819 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.653835 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.756092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.756167 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.756183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.756201 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.756215 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.859308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.859355 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.859376 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.859399 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.859446 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.962154 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.962224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.962244 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.962271 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:09 crc kubenswrapper[4756]: I1205 16:17:09.962289 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:09Z","lastTransitionTime":"2025-12-05T16:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.048897 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.048929 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.048991 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:10 crc kubenswrapper[4756]: E1205 16:17:10.049085 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.049104 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:10 crc kubenswrapper[4756]: E1205 16:17:10.049228 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:10 crc kubenswrapper[4756]: E1205 16:17:10.049447 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:10 crc kubenswrapper[4756]: E1205 16:17:10.049531 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.064278 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.064328 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.064340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.064358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.064370 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.072743 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6a0ce5-2436-4b0a-859e-b6a6bb896271\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33f86f8bfd78434c12f4e929d3df2330af2674409bf571d24b567650db52b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47960150dafe3687778fad23667972453f779c6e6a2f9f9a6993030fe66eae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0917bcbcaf15193c5a6dedf6782c084317c6adf6102015a6523661fa05ebf076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66db03ad6dffeb96f6467f3f16e1de82de7cfb75c5d466eca146340d370e9b6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b0372e67b038baa66df470c0f853f5a02c45410e5f54b6c508df432b2671c05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8935aeb859fab5d506afdb2925799611b2029bf8369d50dfaa08f0a13738058\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f12efb3ea4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f12efb3e
a4dac5bed762c9fa08db6226f9e2ad6ea4ab5b6a1d7f811161acf51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlbvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdgc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.089260 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rr8pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0cb6a2e-b002-460c-8633-46ab5fbe9668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b614872f608eae4f3a2f1a4b6664f2a9778b21e11545b071995546fb60de6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6nwc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rr8pw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.103715 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb44bc23-0bf3-4911-8f7d-e7bbe31b5816\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5b7589ddbcceabedf2f6b5a386d8df5c9b52af1f7fb6b63a7fd4d30d4aa2f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cac9f03390b95dedb4fd926de2d392effe80b3aff4532af921fb9c586b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7h8k5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9crpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.128734 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a4eca95-4da7-4fbd-b93f-a812de814925\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354508055ae1b5d0b696b0264b96fd2fa15cf9cd8bc0bcd2df4be8d51a876d22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28449d54c99a82a722abaa0db28d32bd39d6773c43dfb8ec30468e69dd61d55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8327b3ec78f33a89a08aecddf47a5ac16dbc41794bb4f3b2ff8f1232bbffacd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa
3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d52b5c51bbf3e6286a3c41ed79640a979f2ce2ad5108a74e089aeffc1a2aedf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76047d5e1e8dd1588f6af989c8c5d21dca16ab43b7304fd838f830b8af09b86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da81b2e56af8684ceace9a39c02521afcfc70f11fae6a5097d372a516a2db3d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68b713986f7c7d33bb9ffdac93df53be7ee3f8de681bad9022e2084e476db128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d4b67a709fa1946ddbc5af1897f0f168f4cca7d9e10b1b616a78c2552bee3a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.150278 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76b785c7-075e-414a-b679-e10916131084\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4293c2496b274319902ba8afc59b13a723c2aa36138c1de4e101706d0bb5ec1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac394a641422533c286aa6268b7b3a7345c80114018d0620ea8337e87cbe66e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28477c2886211dc59939fd294d1444d4219ab7b4123da62678a19454c35417d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.167558 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.167656 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.167716 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.167741 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
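Every "Failed to update status for pod" record in this stretch fails identically: the pod.network-node-identity webhook at https://127.0.0.1:9743 presents a serving certificate whose NotAfter (2025-08-24T17:21:41Z) is months behind the node's clock reading (2025-12-05T16:17:10Z), so each TLS handshake is rejected before any patch is applied. A minimal Go sketch of the same validity-window comparison, assuming an illustrative PEM path rather than the webhook's real serving-cert location:

    // cert_window.go - a sketch of the NotBefore/NotAfter check that fails
    // in the webhook calls above.
    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        pemBytes, err := os.ReadFile("/tmp/webhook-serving.pem") // hypothetical path
        if err != nil {
            fmt.Println("read:", err)
            return
        }
        block, _ := pem.Decode(pemBytes)
        if block == nil {
            fmt.Println("no PEM block found")
            return
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            fmt.Println("parse:", err)
            return
        }
        now := time.Now()
        switch {
        case now.After(cert.NotAfter):
            // The comparison the log reports as "current time ... is after ..."
            fmt.Printf("expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
        case now.Before(cert.NotBefore):
            fmt.Printf("not yet valid: %s is before %s\n",
                now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
        default:
            fmt.Printf("valid until %s\n", cert.NotAfter.Format(time.RFC3339))
        }
    }

These patch failures are therefore a certificate-expiry problem, distinct from the CNI NotReady condition driving the heartbeat records around them.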
Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.168796 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.188507 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc60cd73b616ce08c9303adeb51f46d8914c6438eb23b0533ef5681213e5d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.211841 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd37c06-6aa3-4dbc-a783-e2c3d31bb680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 16:15:52.462204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 16:15:52.463805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1711271359/tls.crt::/tmp/serving-cert-1711271359/tls.key\\\\\\\"\\\\nI1205 16:15:57.763141 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 16:15:57.770357 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 16:15:57.770446 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 16:15:57.770485 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 16:15:57.770498 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 16:15:57.777713 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 16:15:57.777746 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 16:15:57.777765 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 16:15:57.777772 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 16:15:57.777782 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 16:15:57.777789 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 16:15:57.780559 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 16:15:57.788322 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.233494 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.268004 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75aef14-f06d-44ec-8a2d-28d9661b1001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://678012ecd6a1f851dd87783e9c2fc348817b4155
7fcb46a8cc4a8acc313bf09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:17:02Z\\\",\\\"message\\\":\\\" load balancers\\\\nI1205 16:17:01.979264 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 16:17:01.978781 6827 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-rr8pw after 0 failed attempt(s)\\\\nI1205 16:17:01.979315 6827 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-rr8pw\\\\nI1205 16:17:01.979025 6827 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1205 16:17:01.978711 6827 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1205 16:17:01.979336 6827 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nF1205 16:17:01.979337 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:17:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:16:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6l5kc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz5rs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.271966 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.272020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.272039 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.272066 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.272084 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.287856 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcqf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb26e42b-df49-418d-8be1-8a0704cd5dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T16:16:51Z\\\",\\\"message\\\":\\\"2025-12-05T16:16:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899\\\\n2025-12-05T16:16:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0870f74e-77e4-4ca8-8c3a-d27c95ae7899 to /host/opt/cni/bin/\\\\n2025-12-05T16:16:06Z [verbose] multus-daemon started\\\\n2025-12-05T16:16:06Z [verbose] Readiness Indicator file check\\\\n2025-12-05T16:16:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqdz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcqf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.308624 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hmt7g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lsn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.328168 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99db355-6e00-475b-b67e-2ca42a0ab24d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a967f6919586e3fb43b7e4b1236d537baffff065430838b580204b39243ef564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b76cab0eed4b440c96aefe25fa6077a101ba03b3b6df2d337a94152f55d4d0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b76cab0eed4b440c96aefe25fa6077a101ba03b3b6df2d337a94152f55d4d0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.353148 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49d2eca96e3523ab0d93374adbaaad19ff036654c2802290fd5e325ec8842c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.374764 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.374823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.374838 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.374860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.374876 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.377205 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd9dee3834c6752a1ae20f912e653db9425319f76c85f9bf7846ce938605e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd99ad21faa8e271d08f0cbe368ce54fd19f92fef9488f924e64e0e630790fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.394595 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.410221 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"636d6ca9-ad3c-4ba3-9b8e-dd0aa505308a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5739236ff87695260ff75649be85ea733991ee8d0dc84198d6690ad408b90eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568eb65db0cf87e541088899b4074151be30d606c0046d2b304d238078797ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b747e8661dcf2cdad0c507bbdc8ae546903c4aca090e6dab97bf8d6c612bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:15:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373cd5c8b65a5cd92e18658b5926509f83c3055e3fcf7a82e8674c35f9f130b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T16:15:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T16:15:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:15:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.429147 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2067f3da-c088-40d8-bc08-1eb971992c56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78804bf28098db71c4d1d0b7f72082dd67e80b53e8fee642016761cd6151da3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzp58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zrvcn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.445079 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bz9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13fd360e-8db2-431e-bc4a-cdd7938b0da6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T16:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f24ff305432a2f2d5b359d35b9bc07667ebdb4fca77def6a2559bf624956567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T16:16:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzf9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T16:16:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bz9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:10Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.477064 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.477128 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.477146 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.477223 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.477243 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.581950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.581995 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.582006 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.582028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.582040 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.685281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.685781 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.685796 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.685817 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.685832 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.789286 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.789329 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.789342 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.789358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.789368 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.892566 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.892944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.893065 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.893170 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.893267 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.996599 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.996690 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.996707 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.996737 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:10 crc kubenswrapper[4756]: I1205 16:17:10.996758 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:10Z","lastTransitionTime":"2025-12-05T16:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.099982 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.100055 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.100078 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.100111 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.100131 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.203453 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.203500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.203511 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.203531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.203543 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.306412 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.306491 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.306504 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.306521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.306534 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.409541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.409590 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.409602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.409626 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.409640 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.512601 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.512662 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.512679 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.512702 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.512715 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.615225 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.615263 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.615273 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.615288 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.615299 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.717828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.717876 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.717888 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.717904 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.717915 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.820743 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.820784 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.820798 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.820814 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.820825 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.923401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.923466 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.923478 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.923493 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:11 crc kubenswrapper[4756]: I1205 16:17:11.923506 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:11Z","lastTransitionTime":"2025-12-05T16:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.026557 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.026615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.026630 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.026651 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.026668 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.049627 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.049731 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.049803 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.049985 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:12 crc kubenswrapper[4756]: E1205 16:17:12.049966 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:12 crc kubenswrapper[4756]: E1205 16:17:12.050086 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:12 crc kubenswrapper[4756]: E1205 16:17:12.050169 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:12 crc kubenswrapper[4756]: E1205 16:17:12.050247 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.129516 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.129581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.129591 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.129605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.129614 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.233228 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.233302 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.233320 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.233351 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.233372 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.336531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.336630 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.336667 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.336701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.336725 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.439571 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.439613 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.439629 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.439647 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.439663 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.542387 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.542468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.542488 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.542508 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.542522 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.645256 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.645330 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.645475 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.645509 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.645546 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.749195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.749258 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.749280 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.749306 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.749324 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.852874 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.852918 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.852934 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.852960 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.852977 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.955830 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.955905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.955930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.955962 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:12 crc kubenswrapper[4756]: I1205 16:17:12.955981 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:12Z","lastTransitionTime":"2025-12-05T16:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.063540 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.063626 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.063659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.063685 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.063704 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.135112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.135395 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.135407 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.135445 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.135459 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: E1205 16:17:13.155212 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.160043 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.160092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.160109 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.160131 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.160147 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: E1205 16:17:13.178886 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.183879 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.183940 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.183964 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.183993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.184015 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: E1205 16:17:13.205157 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.210375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.210534 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.210574 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.210605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.210642 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: E1205 16:17:13.228005 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.232937 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.232988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.233005 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.233028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.233044 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: E1205 16:17:13.251763 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T16:17:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b87c017-f071-4b1f-b157-701524c44075\\\",\\\"systemUUID\\\":\\\"6be86e52-5e8b-4c16-8ed7-a526f092135c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T16:17:13Z is after 2025-08-24T17:21:41Z" Dec 05 16:17:13 crc kubenswrapper[4756]: E1205 16:17:13.251890 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.253361 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
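Two distinct failures are interleaved in the burst above, and this is the first: every node-status PATCH is rejected because the API server must consult the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05. The five identical failures match the kubelet's per-sync retry budget (nodeStatusUpdateRetry is 5 in upstream kubelet sources), after which it logs "update node status exceeds retry count" and waits for the next sync; nothing will succeed until the webhook certificate is rotated. A minimal diagnostic sketch to confirm the certificate window, assuming Python 3 with the third-party cryptography package (>= 42) available on the node; host and port are taken from the failing Post URL above:

```python
# Diagnostic sketch. Assumptions: Python 3.8+, 'cryptography' >= 42 installed,
# run on the node itself since the webhook listens on loopback.
import socket
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # endpoint from the failing webhook Post above

# Verification is disabled deliberately: the point is to read the expired
# certificate, which a verifying handshake would reject outright.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        # Raw DER bytes are available even though the cert was not verified.
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
print("subject:   ", cert.subject.rfc4514_string())
print("not before:", cert.not_valid_before_utc)
print("not after: ", cert.not_valid_after_utc)  # expect 2025-08-24 17:21:41 UTC
```

Expected output here is a notAfter of 2025-08-24 17:21:41 UTC, i.e. exactly the expiry the TLS error reports.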
event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.253430 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.253439 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.253453 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.253462 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.356257 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.356319 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.356337 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.356362 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.356382 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.459515 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.459588 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.459606 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.459627 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.459643 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.562262 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.562298 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.562308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.562325 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.562338 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.665541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.665587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.665600 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.665617 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.665629 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.769076 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.769181 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.769205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.769242 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.769270 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.872111 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.872152 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.872163 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.872178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.872189 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.974754 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.974807 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.974822 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.974845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:13 crc kubenswrapper[4756]: I1205 16:17:13.974860 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:13Z","lastTransitionTime":"2025-12-05T16:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.048810 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.048884 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.048919 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:14 crc kubenswrapper[4756]: E1205 16:17:14.049001 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.049060 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:14 crc kubenswrapper[4756]: E1205 16:17:14.049120 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:14 crc kubenswrapper[4756]: E1205 16:17:14.049251 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:14 crc kubenswrapper[4756]: E1205 16:17:14.049375 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.077049 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.077099 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.077118 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.077142 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.077159 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.179719 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.179759 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.179771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.179790 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.179801 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.282290 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.282326 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.282338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.282351 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.282362 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.384907 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.384955 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.384973 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.384993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.385006 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.487385 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.487435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.487443 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.487456 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.487465 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.590324 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.590370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.590380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.590398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.590409 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.694152 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.694224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.694238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.694263 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.694281 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.798756 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.798816 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.798832 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.798855 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.798869 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.901806 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.901856 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.901868 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.901888 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:14 crc kubenswrapper[4756]: I1205 16:17:14.901901 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:14Z","lastTransitionTime":"2025-12-05T16:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.006881 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.006916 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.006924 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.006938 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.006961 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.109912 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.109981 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.110004 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.110039 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.110069 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.212660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.212705 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.212716 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.212734 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.212745 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.315219 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.315272 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.315284 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.315303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.315315 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.418365 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.418440 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.418460 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.418481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.418497 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.521491 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.521537 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.521553 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.521576 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.521592 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.623923 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.623964 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.623978 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.623995 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.624007 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.725890 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.726132 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.726144 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.726164 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.726176 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.829982 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.830021 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.830033 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.830049 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.830062 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.932462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.932510 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.932525 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.932549 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:15 crc kubenswrapper[4756]: I1205 16:17:15.932567 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:15Z","lastTransitionTime":"2025-12-05T16:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.034748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.034837 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.034862 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.034894 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.034917 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.049160 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.049518 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.049571 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:16 crc kubenswrapper[4756]: E1205 16:17:16.049589 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:16 crc kubenswrapper[4756]: E1205 16:17:16.049662 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.049679 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:16 crc kubenswrapper[4756]: E1205 16:17:16.049818 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:16 crc kubenswrapper[4756]: E1205 16:17:16.050346 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.050603 4756 scope.go:117] "RemoveContainer" containerID="678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c" Dec 05 16:17:16 crc kubenswrapper[4756]: E1205 16:17:16.050750 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.138651 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.138708 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.138728 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.138752 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.138770 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.242147 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.242183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.242218 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.242438 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.242450 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.345078 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.345164 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.345203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.345268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.345294 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.447633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.447668 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.447677 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.447691 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.447703 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.550395 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.550473 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.550486 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.550502 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.550515 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.653135 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.653197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.653220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.653254 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.653275 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.756178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.756229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.756241 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.756259 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.756273 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.859394 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.859540 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.859564 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.859934 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.859954 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.963043 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.963134 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.963153 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.963182 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:16 crc kubenswrapper[4756]: I1205 16:17:16.963203 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:16Z","lastTransitionTime":"2025-12-05T16:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.066582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.066754 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.066776 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.066799 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.066986 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.170365 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.170467 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.170494 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.170522 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.170540 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.273932 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.273984 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.273995 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.274011 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.274022 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.376870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.376936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.376959 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.376988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.377008 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.480370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.480474 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.480500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.480531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.480555 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.583204 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.583270 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.583288 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.583311 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.583329 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.686566 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.686621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.686637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.686656 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.686670 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.789943 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.789992 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.790006 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.790027 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.790041 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.892715 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.892760 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.892775 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.892795 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.892806 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.995634 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.995675 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.995686 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.995701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:17 crc kubenswrapper[4756]: I1205 16:17:17.995710 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:17Z","lastTransitionTime":"2025-12-05T16:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.049368 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.049464 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:18 crc kubenswrapper[4756]: E1205 16:17:18.049507 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.049494 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.049395 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:18 crc kubenswrapper[4756]: E1205 16:17:18.049618 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:18 crc kubenswrapper[4756]: E1205 16:17:18.049767 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:18 crc kubenswrapper[4756]: E1205 16:17:18.049983 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.098847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.098919 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.098942 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.098971 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.098994 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.202031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.202123 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.202155 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.202191 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.202226 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.305104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.305187 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.305210 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.305238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.305262 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.408506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.408553 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.408565 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.408588 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.408600 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.511306 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.511383 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.511405 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.511466 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.511488 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.614875 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.614917 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.614928 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.614944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.614955 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.719556 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.719620 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.719637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.719661 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.719678 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.822118 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.822156 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.822175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.822239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.822259 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.925148 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.925603 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.925771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.925936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:18 crc kubenswrapper[4756]: I1205 16:17:18.926022 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:18Z","lastTransitionTime":"2025-12-05T16:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.028381 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.028602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.028635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.028660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.028676 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.131525 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.131598 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.131616 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.131633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.131644 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.234590 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.234660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.234681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.234706 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.234726 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.337724 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.337835 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.337859 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.337886 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.337907 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.440909 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.440971 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.440996 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.441025 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.441047 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.544719 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.544829 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.545176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.545505 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.545791 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.649058 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.649219 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.649240 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.649263 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.649280 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.751981 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.752086 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.752097 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.752115 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.752127 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.854942 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.854999 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.855015 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.855034 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.855052 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.958021 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.958068 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.958077 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.958092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:19 crc kubenswrapper[4756]: I1205 16:17:19.958101 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:19Z","lastTransitionTime":"2025-12-05T16:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.049327 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.049618 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:20 crc kubenswrapper[4756]: E1205 16:17:20.049787 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.049865 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:17:20 crc kubenswrapper[4756]: E1205 16:17:20.050300 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.050543 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:20 crc kubenswrapper[4756]: E1205 16:17:20.050744 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
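Every failure in this stretch traces back to one probe: the network plugin finds no CNI configuration file under /etc/kubernetes/cni/net.d/. A simplified, stdlib-only sketch of that directory check follows; the real probe lives in libcni and also parses and validates the file contents, so treat this as an approximation.

// Report whether a CNI network configuration is present, roughly the way
// the "no CNI configuration file in ..." message decides it is not.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		// libcni accepts .conf, .conflist and .json network configs.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		fmt.Println("network plugin not ready: no CNI configuration file; has your network provider started?")
		return
	}
	fmt.Println("CNI configuration present")
}

Once the network provider writes a configuration file into that directory, this check flips and the NodeNotReady records stop.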
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.060531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.060638 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.060674 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.060735 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.060744 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.157896 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fcqf4" podStartSLOduration=76.157873319 podStartE2EDuration="1m16.157873319s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.140178067 +0000 UTC m=+100.378851619" watchObservedRunningTime="2025-12-05 16:17:20.157873319 +0000 UTC m=+100.396546862" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.162837 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.162883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.162896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.162916 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.162929 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.176555 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.176536958 podStartE2EDuration="14.176536958s" podCreationTimestamp="2025-12-05 16:17:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.176487756 +0000 UTC m=+100.415161288" watchObservedRunningTime="2025-12-05 16:17:20.176536958 +0000 UTC m=+100.415210480" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.211016 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podStartSLOduration=76.210987597 podStartE2EDuration="1m16.210987597s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.197278481 +0000 UTC m=+100.435952003" watchObservedRunningTime="2025-12-05 16:17:20.210987597 +0000 UTC m=+100.449661159" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.211824 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bz9nh" podStartSLOduration=76.211811589 podStartE2EDuration="1m16.211811589s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.210604797 +0000 UTC m=+100.449278319" watchObservedRunningTime="2025-12-05 16:17:20.211811589 +0000 UTC m=+100.450485141" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.249194 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.249173007 podStartE2EDuration="45.249173007s" podCreationTimestamp="2025-12-05 16:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.230334064 +0000 UTC m=+100.469007586" watchObservedRunningTime="2025-12-05 16:17:20.249173007 +0000 UTC m=+100.487846529" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.265612 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.265658 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.265672 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.265692 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.265707 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.266410 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.266341655 podStartE2EDuration="1m15.266341655s" podCreationTimestamp="2025-12-05 16:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.250028079 +0000 UTC m=+100.488701621" watchObservedRunningTime="2025-12-05 16:17:20.266341655 +0000 UTC m=+100.505015207" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.341163 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rr8pw" podStartSLOduration=76.341147592 podStartE2EDuration="1m16.341147592s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.341033348 +0000 UTC m=+100.579706870" watchObservedRunningTime="2025-12-05 16:17:20.341147592 +0000 UTC m=+100.579821124" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.341665 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jdgc8" podStartSLOduration=76.341660905 podStartE2EDuration="1m16.341660905s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.32385689 +0000 UTC m=+100.562530412" watchObservedRunningTime="2025-12-05 16:17:20.341660905 +0000 UTC m=+100.580334417" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.368301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.368329 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.368337 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.368350 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.368357 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.376041 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9crpv" podStartSLOduration=76.376024692 podStartE2EDuration="1m16.376024692s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.352793752 +0000 UTC m=+100.591467264" watchObservedRunningTime="2025-12-05 16:17:20.376024692 +0000 UTC m=+100.614698224" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.390792 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.390774906 podStartE2EDuration="1m18.390774906s" podCreationTimestamp="2025-12-05 16:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.375728725 +0000 UTC m=+100.614402267" watchObservedRunningTime="2025-12-05 16:17:20.390774906 +0000 UTC m=+100.629448418" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.436378 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.436363393 podStartE2EDuration="1m22.436363393s" podCreationTimestamp="2025-12-05 16:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:20.435662364 +0000 UTC m=+100.674335886" watchObservedRunningTime="2025-12-05 16:17:20.436363393 +0000 UTC m=+100.675036905" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.470965 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.471041 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.471060 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.471081 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.471094 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.573621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.573650 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.573659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.573672 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.573682 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.676752 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.676817 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.676839 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.676870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.676929 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.779955 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.780052 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.780070 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.780094 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.780112 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.883293 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.883375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.883398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.883499 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.883519 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.986748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.986806 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.986823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.986846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:20 crc kubenswrapper[4756]: I1205 16:17:20.986863 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:20Z","lastTransitionTime":"2025-12-05T16:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.089738 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.089792 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.089809 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.089836 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.089853 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.192795 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.192852 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.192870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.192896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.192914 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.295672 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.295722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.295739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.295761 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.295778 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.399030 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.399096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.399112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.399136 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.399154 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.502146 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.502205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.502225 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.502253 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.502270 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.604996 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.605054 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.605073 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.605096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.605113 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.708058 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.708145 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.708168 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.708202 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.708225 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.811253 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.811313 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.811329 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.811351 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.811369 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.913742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.913818 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.913837 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.913888 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:21 crc kubenswrapper[4756]: I1205 16:17:21.913908 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:21Z","lastTransitionTime":"2025-12-05T16:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.017127 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.017192 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.017203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.017220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.017232 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.049155 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.049177 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.049277 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.049565 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:22 crc kubenswrapper[4756]: E1205 16:17:22.049547 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:17:22 crc kubenswrapper[4756]: E1205 16:17:22.049766 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:17:22 crc kubenswrapper[4756]: E1205 16:17:22.049885 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:17:22 crc kubenswrapper[4756]: E1205 16:17:22.050401 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
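The same four pods are rejected on every sync attempt because they need the cluster network, while the static control-plane pods (etcd-crc, kube-apiserver-crc, and the others whose startup durations were recorded above) run on the host network and are unaffected. A sketch of that gate, using k8s.io/api pod types; canSyncPod is a hypothetical name for the check the kubelet's pod workers perform before dispatching a sync.

// Pods that require the cluster network cannot be synced while the
// runtime reports NetworkReady=false; host-network pods proceed.
package main

import (
	"fmt"

	v1 "k8s.io/api/core/v1"
)

func canSyncPod(pod *v1.Pod, networkErr error) error {
	if networkErr != nil && !pod.Spec.HostNetwork {
		return fmt.Errorf("network is not ready: %v", networkErr)
	}
	return nil
}

func main() {
	netErr := fmt.Errorf("container runtime network not ready: NetworkReady=false")
	daemon := &v1.Pod{} // e.g. network-metrics-daemon-8lsn9, needs cluster networking
	static := &v1.Pod{} // e.g. kube-apiserver-crc
	static.Spec.HostNetwork = true
	fmt.Println(canSyncPod(daemon, netErr)) // network is not ready: ...
	fmt.Println(canSyncPod(static, netErr)) // <nil>: host-network pods still sync
}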
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.120316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.120382 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.120398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.120450 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.120468 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.223161 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.223486 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.223623 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.223718 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.223797 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.326280 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.326340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.326359 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.326386 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.326404 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.429387 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.429569 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.429604 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.429631 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.429648 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.532260 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.532306 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.532318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.532335 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.532348 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.635541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.635613 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.635633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.635658 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.635672 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.649212 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:22 crc kubenswrapper[4756]: E1205 16:17:22.649381 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 16:17:22 crc kubenswrapper[4756]: E1205 16:17:22.649454 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs podName:7a3ee468-140c-408e-80a1-aa8fe3fc0acc nodeName:}" failed. No retries permitted until 2025-12-05 16:18:26.649439364 +0000 UTC m=+166.888112866 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs") pod "network-metrics-daemon-8lsn9" (UID: "7a3ee468-140c-408e-80a1-aa8fe3fc0acc") : object "openshift-multus"/"metrics-daemon-secret" not registered
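
Two separate mechanisms show up in the metrics-certs failure just above. First, 'object "openshift-multus"/"metrics-daemon-secret" not registered' does not mean the Secret is missing from the API server: the kubelet serves pod Secrets from per-namespace watch caches, and the cache for this pod's namespace simply has not been registered yet (compare the "Caches populated" reflector entries for the cluster-version-operator at 16:17:23 below). Second, nestedpendingoperations backs the failed MountVolume operation off exponentially; durationBeforeRetry 1m4s fits a delay that doubles per consecutive failure from a small initial value (0.5s, 1s, 2s, ... 64s) under some cap. A sketch of that schedule, with the initial delay, factor, and cap as assumed illustrative constants rather than values read out of the kubelet source:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed schedule: the retry delay doubles on each consecutive
        // failure of the same volume operation, up to a cap.
        delay := 500 * time.Millisecond
        maxDelay := 2*time.Minute + 2*time.Second
        for failure := 1; failure <= 9; failure++ {
            fmt.Printf("failure %d -> durationBeforeRetry %v\n", failure, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

Under those constants the eighth consecutive failure prints durationBeforeRetry 1m4s, matching the entry above, which suggests this mount has already been failing repeatedly for around a minute.
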
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.738490 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.738550 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.738569 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.738594 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.738611 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.841996 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.842054 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.842091 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.842130 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.842156 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.945072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.945265 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.945292 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.945325 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:22 crc kubenswrapper[4756]: I1205 16:17:22.945348 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:22Z","lastTransitionTime":"2025-12-05T16:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.050940 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.050979 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.050991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.051008 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.051021 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:23Z","lastTransitionTime":"2025-12-05T16:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.153633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.153704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.153722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.153745 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.153763 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:23Z","lastTransitionTime":"2025-12-05T16:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.256303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.256375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.256392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.256460 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.256486 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:23Z","lastTransitionTime":"2025-12-05T16:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.257825 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.257866 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.257879 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.257896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.257910 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T16:17:23Z","lastTransitionTime":"2025-12-05T16:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.315002 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"]
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.315745 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.318499 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.318548 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.324990 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.325543 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.455990 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f1fa83a-340e-4c1d-b276-be5619243458-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.456097 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f1fa83a-340e-4c1d-b276-be5619243458-service-ca\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.456354 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8f1fa83a-340e-4c1d-b276-be5619243458-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.456447 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f1fa83a-340e-4c1d-b276-be5619243458-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.456532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1fa83a-340e-4c1d-b276-be5619243458-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.557719 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f1fa83a-340e-4c1d-b276-be5619243458-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.557815 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f1fa83a-340e-4c1d-b276-be5619243458-service-ca\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.557900 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f1fa83a-340e-4c1d-b276-be5619243458-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.557963 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8f1fa83a-340e-4c1d-b276-be5619243458-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.558010 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f1fa83a-340e-4c1d-b276-be5619243458-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.558078 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1fa83a-340e-4c1d-b276-be5619243458-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.559320 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f1fa83a-340e-4c1d-b276-be5619243458-service-ca\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.559340 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8f1fa83a-340e-4c1d-b276-be5619243458-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.566358 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1fa83a-340e-4c1d-b276-be5619243458-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.579937 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f1fa83a-340e-4c1d-b276-be5619243458-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-v2g95\" (UID: \"8f1fa83a-340e-4c1d-b276-be5619243458\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
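
The cluster-version-operator startup beginning at 16:17:23.315002 is the normal happy path, in order: SyncLoop ADD (the pod arrived from the API server), reflector caches populated for each ConfigMap and Secret the pod references, VerifyControllerAttachedVolume for all five volumes, then MountVolume started / MountVolume.SetUp succeeded pairs. It can run before the CNI plugin is up because it uses host networking. The per-namespace caches are ordinary client-go informers; a minimal standalone sketch of the same pattern (generic client-go usage, not the kubelet's internal code; the kubeconfig path is illustrative):

    package main

    import (
        "fmt"
        "time"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // illustrative
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)
        // One factory scoped to the pod's namespace, like the kubelet's
        // per-namespace reflectors for referenced ConfigMaps and Secrets.
        factory := informers.NewSharedInformerFactoryWithOptions(
            client, 10*time.Minute, informers.WithNamespace("openshift-cluster-version"))
        inf := factory.Core().V1().ConfigMaps().Informer()
        inf.AddEventHandler(cache.ResourceEventHandlerFuncs{
            AddFunc: func(obj interface{}) {
                cm := obj.(*corev1.ConfigMap)
                fmt.Printf("cache populated: %s/%s\n", cm.Namespace, cm.Name)
            },
        })
        stop := make(chan struct{})
        defer close(stop)
        factory.Start(stop)
        cache.WaitForCacheSync(stop, inf.HasSynced) // analogous to "Caches populated" above
        time.Sleep(time.Second)                     // let the Add handlers drain before exiting
    }

Until that sync completes for a namespace, volume plugins that read Secrets or ConfigMaps fail with the "not registered" error seen for metrics-certs earlier.
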
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.636963 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95"
Dec 05 16:17:23 crc kubenswrapper[4756]: I1205 16:17:23.662705 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95" event={"ID":"8f1fa83a-340e-4c1d-b276-be5619243458","Type":"ContainerStarted","Data":"09843776c5f144cbb893fa79df60b5d35c6922981b6fe30af30765925ff98413"}
Dec 05 16:17:24 crc kubenswrapper[4756]: I1205 16:17:24.048619 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:24 crc kubenswrapper[4756]: I1205 16:17:24.048636 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:17:24 crc kubenswrapper[4756]: I1205 16:17:24.048659 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:24 crc kubenswrapper[4756]: I1205 16:17:24.048691 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:24 crc kubenswrapper[4756]: E1205 16:17:24.048743 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc"
Dec 05 16:17:24 crc kubenswrapper[4756]: E1205 16:17:24.048856 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:17:24 crc kubenswrapper[4756]: E1205 16:17:24.048958 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:17:24 crc kubenswrapper[4756]: E1205 16:17:24.049029 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:17:24 crc kubenswrapper[4756]: I1205 16:17:24.667396 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95" event={"ID":"8f1fa83a-340e-4c1d-b276-be5619243458","Type":"ContainerStarted","Data":"ac6f94d97e47500bb78dd84227a27f9b8d13864fa042cb313c5eced1d41e6eae"}
Dec 05 16:17:24 crc kubenswrapper[4756]: I1205 16:17:24.689398 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-v2g95" podStartSLOduration=80.689373794 podStartE2EDuration="1m20.689373794s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:24.686897718 +0000 UTC m=+104.925571240" watchObservedRunningTime="2025-12-05 16:17:24.689373794 +0000 UTC m=+104.928047346"
Dec 05 16:17:26 crc kubenswrapper[4756]: I1205 16:17:26.049359 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:26 crc kubenswrapper[4756]: I1205 16:17:26.049469 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:26 crc kubenswrapper[4756]: I1205 16:17:26.049481 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:17:26 crc kubenswrapper[4756]: E1205 16:17:26.049567 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:17:26 crc kubenswrapper[4756]: E1205 16:17:26.049656 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:17:26 crc kubenswrapper[4756]: I1205 16:17:26.049737 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:26 crc kubenswrapper[4756]: E1205 16:17:26.049833 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc"
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:26 crc kubenswrapper[4756]: E1205 16:17:26.049926 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:28 crc kubenswrapper[4756]: I1205 16:17:28.049145 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:28 crc kubenswrapper[4756]: I1205 16:17:28.049815 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:28 crc kubenswrapper[4756]: I1205 16:17:28.049766 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:28 crc kubenswrapper[4756]: I1205 16:17:28.049192 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:28 crc kubenswrapper[4756]: E1205 16:17:28.050071 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:28 crc kubenswrapper[4756]: E1205 16:17:28.050259 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:28 crc kubenswrapper[4756]: I1205 16:17:28.050395 4756 scope.go:117] "RemoveContainer" containerID="678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c" Dec 05 16:17:28 crc kubenswrapper[4756]: E1205 16:17:28.050500 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:28 crc kubenswrapper[4756]: E1205 16:17:28.050679 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:28 crc kubenswrapper[4756]: E1205 16:17:28.050708 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" Dec 05 16:17:30 crc kubenswrapper[4756]: I1205 16:17:30.049109 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:30 crc kubenswrapper[4756]: E1205 16:17:30.050136 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:30 crc kubenswrapper[4756]: I1205 16:17:30.050186 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:30 crc kubenswrapper[4756]: I1205 16:17:30.050201 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:30 crc kubenswrapper[4756]: I1205 16:17:30.050159 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:30 crc kubenswrapper[4756]: E1205 16:17:30.050297 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:30 crc kubenswrapper[4756]: E1205 16:17:30.050354 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:30 crc kubenswrapper[4756]: E1205 16:17:30.050439 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:32 crc kubenswrapper[4756]: I1205 16:17:32.048668 4756 util.go:30] "No sandbox for pod can be found. 
Dec 05 16:17:32 crc kubenswrapper[4756]: I1205 16:17:32.048734 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:32 crc kubenswrapper[4756]: I1205 16:17:32.048752 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:32 crc kubenswrapper[4756]: E1205 16:17:32.048809 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:17:32 crc kubenswrapper[4756]: E1205 16:17:32.048990 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc"
Dec 05 16:17:32 crc kubenswrapper[4756]: E1205 16:17:32.049075 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:17:32 crc kubenswrapper[4756]: I1205 16:17:32.049125 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:32 crc kubenswrapper[4756]: E1205 16:17:32.049327 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:17:34 crc kubenswrapper[4756]: I1205 16:17:34.049383 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:34 crc kubenswrapper[4756]: I1205 16:17:34.049493 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:34 crc kubenswrapper[4756]: I1205 16:17:34.049531 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:34 crc kubenswrapper[4756]: I1205 16:17:34.049589 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:17:34 crc kubenswrapper[4756]: E1205 16:17:34.049589 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc"
Dec 05 16:17:34 crc kubenswrapper[4756]: E1205 16:17:34.049703 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:17:34 crc kubenswrapper[4756]: E1205 16:17:34.049837 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:17:34 crc kubenswrapper[4756]: E1205 16:17:34.049961 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:17:36 crc kubenswrapper[4756]: I1205 16:17:36.049223 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:17:36 crc kubenswrapper[4756]: I1205 16:17:36.049324 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:36 crc kubenswrapper[4756]: E1205 16:17:36.049378 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:17:36 crc kubenswrapper[4756]: E1205 16:17:36.049582 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc"
Dec 05 16:17:36 crc kubenswrapper[4756]: I1205 16:17:36.049659 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:36 crc kubenswrapper[4756]: E1205 16:17:36.049722 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:17:36 crc kubenswrapper[4756]: I1205 16:17:36.049865 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:36 crc kubenswrapper[4756]: E1205 16:17:36.049926 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:17:37 crc kubenswrapper[4756]: I1205 16:17:37.711321 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/1.log"
Dec 05 16:17:37 crc kubenswrapper[4756]: I1205 16:17:37.711812 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/0.log"
Dec 05 16:17:37 crc kubenswrapper[4756]: I1205 16:17:37.711845 4756 generic.go:334] "Generic (PLEG): container finished" podID="fb26e42b-df49-418d-8be1-8a0704cd5dfd" containerID="51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436" exitCode=1
Dec 05 16:17:37 crc kubenswrapper[4756]: I1205 16:17:37.711872 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcqf4" event={"ID":"fb26e42b-df49-418d-8be1-8a0704cd5dfd","Type":"ContainerDied","Data":"51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436"}
Dec 05 16:17:37 crc kubenswrapper[4756]: I1205 16:17:37.711907 4756 scope.go:117] "RemoveContainer" containerID="91c685bd8c1c68a8a147321ee118b685fa1348ee7275a380c75efbc1534bef1f"
Dec 05 16:17:37 crc kubenswrapper[4756]: I1205 16:17:37.712245 4756 scope.go:117] "RemoveContainer" containerID="51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436"
Dec 05 16:17:37 crc kubenswrapper[4756]: E1205 16:17:37.712389 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-fcqf4_openshift-multus(fb26e42b-df49-418d-8be1-8a0704cd5dfd)\"" pod="openshift-multus/multus-fcqf4" podUID="fb26e42b-df49-418d-8be1-8a0704cd5dfd"
Dec 05 16:17:38 crc kubenswrapper[4756]: I1205 16:17:38.049518 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:38 crc kubenswrapper[4756]: E1205 16:17:38.049728 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
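
The 16:17:37 burst is the PLEG (pod lifecycle event generator) noticing a crash: a relist finds kube-multus exited with exitCode=1, a ContainerDied event is emitted, the kubelet queues removal of the older instance 91c685bd... and the just-exited 51fbf3a6..., and the restart is refused with a fresh 10s CrashLoopBackOff. The files being parsed follow the kubelet's pod log layout, /var/log/pods/<namespace>_<pod-name>_<pod-uid>/<container-name>/<restart-count>.log, as both "Finished parsing log file" entries show. A small parser for that shape (layout inferred from these paths):

    package main

    import (
        "fmt"
        "path/filepath"
        "strings"
    )

    // parsePodLogPath splits /var/log/pods/<ns>_<pod>_<uid>/<container>/<restart>.log,
    // the layout seen in the "Finished parsing log file" entries above.
    func parsePodLogPath(p string) (ns, pod, uid, container, restart string, err error) {
        rel, err := filepath.Rel("/var/log/pods", p)
        if err != nil {
            return
        }
        parts := strings.Split(rel, string(filepath.Separator))
        if len(parts) != 3 {
            err = fmt.Errorf("unexpected layout: %q", p)
            return
        }
        meta := strings.SplitN(parts[0], "_", 3)
        if len(meta) != 3 {
            err = fmt.Errorf("unexpected pod directory: %q", parts[0])
            return
        }
        ns, pod, uid = meta[0], meta[1], meta[2]
        container = parts[1]
        restart = strings.TrimSuffix(parts[2], ".log")
        return
    }

    func main() {
        fmt.Println(parsePodLogPath("/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/1.log"))
    }

Here 1.log means this was already the second run of kube-multus in this pod; 0.log from the first run is parsed alongside it.
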
Dec 05 16:17:38 crc kubenswrapper[4756]: I1205 16:17:38.050070 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:17:38 crc kubenswrapper[4756]: E1205 16:17:38.050217 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:17:38 crc kubenswrapper[4756]: I1205 16:17:38.050550 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:38 crc kubenswrapper[4756]: E1205 16:17:38.050703 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc"
Dec 05 16:17:38 crc kubenswrapper[4756]: I1205 16:17:38.051092 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:38 crc kubenswrapper[4756]: E1205 16:17:38.051249 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:17:38 crc kubenswrapper[4756]: I1205 16:17:38.717599 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/1.log"
Dec 05 16:17:40 crc kubenswrapper[4756]: I1205 16:17:40.049131 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9"
Dec 05 16:17:40 crc kubenswrapper[4756]: I1205 16:17:40.049237 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 16:17:40 crc kubenswrapper[4756]: E1205 16:17:40.050269 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc"
Dec 05 16:17:40 crc kubenswrapper[4756]: I1205 16:17:40.050356 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 16:17:40 crc kubenswrapper[4756]: I1205 16:17:40.050369 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:40 crc kubenswrapper[4756]: E1205 16:17:40.050623 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 16:17:40 crc kubenswrapper[4756]: E1205 16:17:40.051049 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 16:17:40 crc kubenswrapper[4756]: E1205 16:17:40.051167 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 16:17:40 crc kubenswrapper[4756]: I1205 16:17:40.051286 4756 scope.go:117] "RemoveContainer" containerID="678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c"
Dec 05 16:17:40 crc kubenswrapper[4756]: E1205 16:17:40.051494 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz5rs_openshift-ovn-kubernetes(b75aef14-f06d-44ec-8a2d-28d9661b1001)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001"
Dec 05 16:17:40 crc kubenswrapper[4756]: E1205 16:17:40.073694 4756 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Dec 05 16:17:40 crc kubenswrapper[4756]: E1205 16:17:40.144295 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 16:17:42 crc kubenswrapper[4756]: I1205 16:17:42.099930 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 16:17:42 crc kubenswrapper[4756]: E1205 16:17:42.100135 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
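
At 16:17:40 the kubelet escalates: kubelet_node_status.go:497 logs "Node not becoming ready in time after startup" (roughly two minutes after kubelet start, judging by the m=+104.9 offset at 16:17:24 above), and kubelet.go:2916 restates the root cause. The resulting Ready=False condition with reason KubeletNotReady is exactly what any API client sees on the Node object; a minimal check with client-go (kubeconfig path illustrative):

    package main

    import (
        "context"
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // illustrative
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)
        node, err := client.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, c := range node.Status.Conditions {
            if c.Type == corev1.NodeReady {
                // Against the state logged above this prints:
                // False KubeletNotReady container runtime network not ready: ...
                fmt.Println(c.Status, c.Reason, c.Message)
            }
        }
    }
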
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:42 crc kubenswrapper[4756]: I1205 16:17:42.100316 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:42 crc kubenswrapper[4756]: I1205 16:17:42.100447 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:42 crc kubenswrapper[4756]: E1205 16:17:42.100527 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:42 crc kubenswrapper[4756]: E1205 16:17:42.100635 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:42 crc kubenswrapper[4756]: I1205 16:17:42.100725 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:42 crc kubenswrapper[4756]: E1205 16:17:42.100821 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:44 crc kubenswrapper[4756]: I1205 16:17:44.048883 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:44 crc kubenswrapper[4756]: I1205 16:17:44.048989 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:44 crc kubenswrapper[4756]: E1205 16:17:44.049114 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:44 crc kubenswrapper[4756]: I1205 16:17:44.049221 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:44 crc kubenswrapper[4756]: E1205 16:17:44.049216 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:44 crc kubenswrapper[4756]: E1205 16:17:44.049335 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:44 crc kubenswrapper[4756]: I1205 16:17:44.049462 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:44 crc kubenswrapper[4756]: E1205 16:17:44.049545 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:45 crc kubenswrapper[4756]: E1205 16:17:45.145907 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 16:17:46 crc kubenswrapper[4756]: I1205 16:17:46.049365 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:46 crc kubenswrapper[4756]: I1205 16:17:46.049425 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:46 crc kubenswrapper[4756]: E1205 16:17:46.049561 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:46 crc kubenswrapper[4756]: E1205 16:17:46.049801 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:46 crc kubenswrapper[4756]: I1205 16:17:46.049980 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:46 crc kubenswrapper[4756]: E1205 16:17:46.050078 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:46 crc kubenswrapper[4756]: I1205 16:17:46.050234 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:46 crc kubenswrapper[4756]: E1205 16:17:46.050375 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:48 crc kubenswrapper[4756]: I1205 16:17:48.049311 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:48 crc kubenswrapper[4756]: I1205 16:17:48.049343 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:48 crc kubenswrapper[4756]: I1205 16:17:48.049598 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:48 crc kubenswrapper[4756]: I1205 16:17:48.049603 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:48 crc kubenswrapper[4756]: E1205 16:17:48.049725 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:48 crc kubenswrapper[4756]: E1205 16:17:48.049990 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:48 crc kubenswrapper[4756]: E1205 16:17:48.050084 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:48 crc kubenswrapper[4756]: E1205 16:17:48.050230 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:50 crc kubenswrapper[4756]: I1205 16:17:50.049684 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:50 crc kubenswrapper[4756]: I1205 16:17:50.049829 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:50 crc kubenswrapper[4756]: E1205 16:17:50.051843 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:50 crc kubenswrapper[4756]: I1205 16:17:50.051875 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:50 crc kubenswrapper[4756]: I1205 16:17:50.051941 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:50 crc kubenswrapper[4756]: E1205 16:17:50.051991 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:50 crc kubenswrapper[4756]: E1205 16:17:50.052277 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:50 crc kubenswrapper[4756]: E1205 16:17:50.052398 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:50 crc kubenswrapper[4756]: E1205 16:17:50.146641 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 16:17:52 crc kubenswrapper[4756]: I1205 16:17:52.049057 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:52 crc kubenswrapper[4756]: I1205 16:17:52.049066 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:52 crc kubenswrapper[4756]: I1205 16:17:52.049207 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:52 crc kubenswrapper[4756]: I1205 16:17:52.049648 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:52 crc kubenswrapper[4756]: I1205 16:17:52.049658 4756 scope.go:117] "RemoveContainer" containerID="51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436" Dec 05 16:17:52 crc kubenswrapper[4756]: E1205 16:17:52.049650 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:52 crc kubenswrapper[4756]: E1205 16:17:52.049746 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:52 crc kubenswrapper[4756]: E1205 16:17:52.050062 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:52 crc kubenswrapper[4756]: E1205 16:17:52.050172 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:52 crc kubenswrapper[4756]: I1205 16:17:52.780903 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/1.log" Dec 05 16:17:52 crc kubenswrapper[4756]: I1205 16:17:52.780968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcqf4" event={"ID":"fb26e42b-df49-418d-8be1-8a0704cd5dfd","Type":"ContainerStarted","Data":"dac33531a016040a5246f6e69fadedf2c460a4d592313c2e4c01a9e8c3dca800"} Dec 05 16:17:54 crc kubenswrapper[4756]: I1205 16:17:54.049510 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:54 crc kubenswrapper[4756]: E1205 16:17:54.049648 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:54 crc kubenswrapper[4756]: I1205 16:17:54.049682 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:54 crc kubenswrapper[4756]: I1205 16:17:54.049722 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:54 crc kubenswrapper[4756]: E1205 16:17:54.049749 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:54 crc kubenswrapper[4756]: I1205 16:17:54.049510 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:54 crc kubenswrapper[4756]: E1205 16:17:54.049871 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:54 crc kubenswrapper[4756]: E1205 16:17:54.049929 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:55 crc kubenswrapper[4756]: I1205 16:17:55.050985 4756 scope.go:117] "RemoveContainer" containerID="678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c" Dec 05 16:17:55 crc kubenswrapper[4756]: E1205 16:17:55.147594 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 16:17:55 crc kubenswrapper[4756]: I1205 16:17:55.792354 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/3.log" Dec 05 16:17:55 crc kubenswrapper[4756]: I1205 16:17:55.795186 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerStarted","Data":"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e"} Dec 05 16:17:55 crc kubenswrapper[4756]: I1205 16:17:55.795828 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:17:55 crc kubenswrapper[4756]: I1205 16:17:55.823613 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podStartSLOduration=111.823590014 podStartE2EDuration="1m51.823590014s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:17:55.823526802 +0000 UTC m=+136.062200364" watchObservedRunningTime="2025-12-05 16:17:55.823590014 +0000 UTC m=+136.062263546" Dec 05 16:17:55 crc kubenswrapper[4756]: I1205 16:17:55.866835 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8lsn9"] Dec 05 16:17:55 crc kubenswrapper[4756]: I1205 16:17:55.866973 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:55 crc kubenswrapper[4756]: E1205 16:17:55.867070 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:56 crc kubenswrapper[4756]: I1205 16:17:56.048934 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:56 crc kubenswrapper[4756]: I1205 16:17:56.048985 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:56 crc kubenswrapper[4756]: I1205 16:17:56.049090 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:56 crc kubenswrapper[4756]: E1205 16:17:56.049077 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:56 crc kubenswrapper[4756]: E1205 16:17:56.049210 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:56 crc kubenswrapper[4756]: E1205 16:17:56.049281 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:57 crc kubenswrapper[4756]: I1205 16:17:57.049541 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:57 crc kubenswrapper[4756]: E1205 16:17:57.049770 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:17:58 crc kubenswrapper[4756]: I1205 16:17:58.048751 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:17:58 crc kubenswrapper[4756]: I1205 16:17:58.048783 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:17:58 crc kubenswrapper[4756]: I1205 16:17:58.049045 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:17:58 crc kubenswrapper[4756]: E1205 16:17:58.048990 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:17:58 crc kubenswrapper[4756]: E1205 16:17:58.049142 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:17:58 crc kubenswrapper[4756]: E1205 16:17:58.049259 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:17:59 crc kubenswrapper[4756]: I1205 16:17:59.049363 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:17:59 crc kubenswrapper[4756]: E1205 16:17:59.049769 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lsn9" podUID="7a3ee468-140c-408e-80a1-aa8fe3fc0acc" Dec 05 16:18:00 crc kubenswrapper[4756]: I1205 16:18:00.049401 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:18:00 crc kubenswrapper[4756]: I1205 16:18:00.049522 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:18:00 crc kubenswrapper[4756]: E1205 16:18:00.051443 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 16:18:00 crc kubenswrapper[4756]: I1205 16:18:00.051471 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:18:00 crc kubenswrapper[4756]: E1205 16:18:00.051567 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 16:18:00 crc kubenswrapper[4756]: E1205 16:18:00.051687 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 16:18:01 crc kubenswrapper[4756]: I1205 16:18:01.048987 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:18:01 crc kubenswrapper[4756]: I1205 16:18:01.053227 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 16:18:01 crc kubenswrapper[4756]: I1205 16:18:01.053283 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 16:18:02 crc kubenswrapper[4756]: I1205 16:18:02.049238 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:18:02 crc kubenswrapper[4756]: I1205 16:18:02.049319 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:18:02 crc kubenswrapper[4756]: I1205 16:18:02.049545 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:18:02 crc kubenswrapper[4756]: I1205 16:18:02.051510 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 16:18:02 crc kubenswrapper[4756]: I1205 16:18:02.051773 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 16:18:02 crc kubenswrapper[4756]: I1205 16:18:02.051818 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 16:18:02 crc kubenswrapper[4756]: I1205 16:18:02.052073 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 16:18:03 crc kubenswrapper[4756]: I1205 16:18:03.954988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 16:18:03 crc kubenswrapper[4756]: I1205 16:18:03.999180 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4jdj5"] Dec 05 16:18:03 crc kubenswrapper[4756]: I1205 16:18:03.999676 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.000290 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mlthg"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.001305 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hqh5j"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.001506 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.002249 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.002437 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.003072 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.004494 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.004797 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.005235 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.006318 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.006632 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.009108 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.009568 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.013101 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.013299 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.013312 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.013360 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.013366 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.013737 4756 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.013867 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.013979 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.014017 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.014458 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.014712 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.018025 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9nxrs"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.018395 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zlcmz"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.018722 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.019059 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.019956 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.021138 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.021392 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.022259 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.022613 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.022917 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.023198 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.023531 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.024408 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wld6x"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.025245 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.032048 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.032300 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.034135 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.034260 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.035008 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.035123 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.035257 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.035638 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.035870 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.036001 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.036110 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.036240 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.036337 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.036469 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.036659 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.036814 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.036829 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037001 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037117 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037217 4756 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037358 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037505 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037628 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037651 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.038966 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037702 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.037747 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.038302 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.053884 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.080226 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.081160 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.081220 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.081455 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.084873 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.085160 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.087334 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.087938 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.088453 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.089865 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.090314 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hxxk6"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.090647 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.090757 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.090906 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.091125 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.091330 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8cjkz"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.091679 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.091829 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.091903 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.091881 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.092068 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.096131 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.096561 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.096656 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.096566 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.098313 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.099762 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.100284 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.100636 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.100661 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.107828 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.109214 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.110075 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.110507 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dpqm9"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.111035 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.113189 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.113336 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.113674 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114014 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114169 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114321 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114096 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114130 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114535 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114584 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114648 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114660 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114587 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114760 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.114835 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.124320 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.125341 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8whp"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.125965 
4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w5k56"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.126301 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.126643 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.126991 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.130457 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.130749 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.131004 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.131153 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.132279 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.132799 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.132939 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.133145 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.133338 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.133490 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.133566 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.147508 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.148736 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-z9xhd"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.149542 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.152819 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.153237 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-z9xhd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.154504 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.170642 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.170810 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.171194 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.171315 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.173879 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.174035 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.174280 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.174868 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.174940 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.175120 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.175234 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.175799 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nfdlb"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.175896 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.176332 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.176591 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.176824 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.177393 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pt8fk"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.177529 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.177674 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182464 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-audit\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182507 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/317dead3-6446-45a7-a256-4ae88df539aa-machine-approver-tls\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59db5196-70c4-4bd6-8406-3589d4e44599-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182553 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59db5196-70c4-4bd6-8406-3589d4e44599-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182574 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-console-config\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182596 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgjg\" (UniqueName: \"kubernetes.io/projected/0839e735-d517-4f9d-b742-45f842a9d9cc-kube-api-access-gsgjg\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182616 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0eb62b4b-b2c4-46c8-af81-b234e5f3bf71-metrics-tls\") pod \"dns-operator-744455d44c-nfdlb\" (UID: \"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182638 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-trusted-ca-bundle\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182661 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-encryption-config\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182681 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0839e735-d517-4f9d-b742-45f842a9d9cc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182705 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba515e6-fcd1-4310-9630-70ca7c90c213-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182726 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-serving-cert\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182745 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a6b3333-726a-45d8-98e0-70a523f6ac6e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182766 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182782 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-encryption-config\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182796 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-ca\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182811 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6bcdf9c6-be18-4bd1-a335-a8479496f678-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182832 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-oauth-config\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182927 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8rhx\" (UniqueName: \"kubernetes.io/projected/cba515e6-fcd1-4310-9630-70ca7c90c213-kube-api-access-m8rhx\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182964 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bcdf9c6-be18-4bd1-a335-a8479496f678-trusted-ca\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.182983 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgkc\" (UniqueName: \"kubernetes.io/projected/317dead3-6446-45a7-a256-4ae88df539aa-kube-api-access-lmgkc\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183005 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9694101-a510-49c6-8c1a-4869c9619b6a-audit-dir\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183025 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-config\") pod 
\"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183043 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-client-ca\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183066 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5xgh\" (UniqueName: \"kubernetes.io/projected/6013c2d1-fe3a-4d8d-ba47-132ad91a6a34-kube-api-access-q5xgh\") pod \"downloads-7954f5f757-z9xhd\" (UID: \"6013c2d1-fe3a-4d8d-ba47-132ad91a6a34\") " pod="openshift-console/downloads-7954f5f757-z9xhd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183149 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqtls\" (UniqueName: \"kubernetes.io/projected/dac87fdb-fe64-40be-81f6-e525e378a0c7-kube-api-access-wqtls\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183186 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2jfd\" (UniqueName: \"kubernetes.io/projected/351d65a8-44a7-4f84-be48-1b5204422e8b-kube-api-access-n2jfd\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183203 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93f05eb8-490a-4d15-8c38-dd158bd4a91f-config\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183223 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6b3333-726a-45d8-98e0-70a523f6ac6e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183242 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93f05eb8-490a-4d15-8c38-dd158bd4a91f-trusted-ca\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183260 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd58t\" (UniqueName: \"kubernetes.io/projected/a9694101-a510-49c6-8c1a-4869c9619b6a-kube-api-access-xd58t\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: 
\"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183280 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-etcd-client\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183299 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183317 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-service-ca\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183336 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw87l\" (UniqueName: \"kubernetes.io/projected/0eb62b4b-b2c4-46c8-af81-b234e5f3bf71-kube-api-access-kw87l\") pod \"dns-operator-744455d44c-nfdlb\" (UID: \"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183357 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf4b45a0-f5a8-4d66-9d82-19e31c680447-audit-dir\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183375 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-serving-cert\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183395 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmz42\" (UniqueName: \"kubernetes.io/projected/c6e932c7-ad81-4830-b212-61bdde0cec5b-kube-api-access-mmz42\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183432 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-audit-policies\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc 
kubenswrapper[4756]: I1205 16:18:04.183453 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cf4b45a0-f5a8-4d66-9d82-19e31c680447-node-pullsecrets\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183473 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/317dead3-6446-45a7-a256-4ae88df539aa-auth-proxy-config\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183520 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-serving-cert\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183541 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2725c8-948a-4757-8e92-e365e49e1f3f-serving-cert\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183563 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkn27\" (UniqueName: \"kubernetes.io/projected/9c56993d-85b5-46f9-9897-4e371ee09769-kube-api-access-mkn27\") pod \"cluster-samples-operator-665b6dd947-zs5pc\" (UID: \"9c56993d-85b5-46f9-9897-4e371ee09769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183588 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6e932c7-ad81-4830-b212-61bdde0cec5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183609 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183652 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6e932c7-ad81-4830-b212-61bdde0cec5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183674 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6e932c7-ad81-4830-b212-61bdde0cec5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183692 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-etcd-client\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183713 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317dead3-6446-45a7-a256-4ae88df539aa-config\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183733 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkztf\" (UniqueName: \"kubernetes.io/projected/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-kube-api-access-hkztf\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183754 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvs2d\" (UniqueName: \"kubernetes.io/projected/93f05eb8-490a-4d15-8c38-dd158bd4a91f-kube-api-access-nvs2d\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183774 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5856492-e637-42d0-b018-6f205e03e897-serving-cert\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183794 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22bzp\" (UniqueName: \"kubernetes.io/projected/6bcdf9c6-be18-4bd1-a335-a8479496f678-kube-api-access-22bzp\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183811 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e438b5e2-50cf-4f45-a46f-5e4ee62633b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8cjkz\" (UID: 
\"e438b5e2-50cf-4f45-a46f-5e4ee62633b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183832 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bcdf9c6-be18-4bd1-a335-a8479496f678-metrics-tls\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183851 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93f05eb8-490a-4d15-8c38-dd158bd4a91f-serving-cert\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183871 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-image-import-ca\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183889 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-client\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183924 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdvn\" (UniqueName: \"kubernetes.io/projected/e438b5e2-50cf-4f45-a46f-5e4ee62633b4-kube-api-access-vjdvn\") pod \"multus-admission-controller-857f4d67dd-8cjkz\" (UID: \"e438b5e2-50cf-4f45-a46f-5e4ee62633b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183945 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-oauth-serving-cert\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183968 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.183986 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-etcd-serving-ca\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: 
I1205 16:18:04.184004 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-config\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184099 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdpfl\" (UniqueName: \"kubernetes.io/projected/b5856492-e637-42d0-b018-6f205e03e897-kube-api-access-jdpfl\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184126 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a6b3333-726a-45d8-98e0-70a523f6ac6e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184147 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c56993d-85b5-46f9-9897-4e371ee09769-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zs5pc\" (UID: \"9c56993d-85b5-46f9-9897-4e371ee09769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184168 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-service-ca\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184191 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cba515e6-fcd1-4310-9630-70ca7c90c213-images\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184212 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t55qc\" (UniqueName: \"kubernetes.io/projected/ed2725c8-948a-4757-8e92-e365e49e1f3f-kube-api-access-t55qc\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184270 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59db5196-70c4-4bd6-8406-3589d4e44599-config\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184295 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184319 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktqt\" (UniqueName: \"kubernetes.io/projected/cf4b45a0-f5a8-4d66-9d82-19e31c680447-kube-api-access-bktqt\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184363 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184386 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ff7989-b26b-4c66-991a-fecafc6f38b9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184436 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2ff7989-b26b-4c66-991a-fecafc6f38b9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184470 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba515e6-fcd1-4310-9630-70ca7c90c213-config\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0839e735-d517-4f9d-b742-45f842a9d9cc-serving-cert\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184552 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ff7989-b26b-4c66-991a-fecafc6f38b9-config\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 
16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184590 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dac87fdb-fe64-40be-81f6-e525e378a0c7-srv-cert\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184614 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dac87fdb-fe64-40be-81f6-e525e378a0c7-profile-collector-cert\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.184636 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-config\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.187746 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.189138 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-stdrz"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.189631 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.190110 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.190587 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.190814 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.190906 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.191440 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.197461 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.198104 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.198763 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.199398 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.199465 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.201168 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.204569 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.205000 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.205312 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.206600 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.206770 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.206802 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.207363 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.210869 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.214968 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.215304 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wld6x"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.216352 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4jdj5"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.217926 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.219612 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wckzw"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.221858 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.221969 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9nxrs"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.222044 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.222735 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.224842 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hqh5j"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.226226 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mlthg"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.228280 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.232515 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.235317 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dpqm9"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.239014 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.241321 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.244276 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hxxk6"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.244312 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.248253 4756 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z9xhd"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.251506 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.252776 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.254000 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.258772 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.258816 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.258826 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w5k56"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.261129 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.261225 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.264559 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.266318 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8cjkz"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.267300 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8jwvd"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.267973 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.268990 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6kmfx"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.269620 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.273130 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.273155 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.273243 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.273667 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nfdlb"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.274974 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zlcmz"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.276548 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.278146 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.279399 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8whp"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.280801 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.282244 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-stdrz"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.282528 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.284161 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285193 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvs2d\" (UniqueName: \"kubernetes.io/projected/93f05eb8-490a-4d15-8c38-dd158bd4a91f-kube-api-access-nvs2d\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285221 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5856492-e637-42d0-b018-6f205e03e897-serving-cert\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285356 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22bzp\" (UniqueName: \"kubernetes.io/projected/6bcdf9c6-be18-4bd1-a335-a8479496f678-kube-api-access-22bzp\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e438b5e2-50cf-4f45-a46f-5e4ee62633b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8cjkz\" (UID: \"e438b5e2-50cf-4f45-a46f-5e4ee62633b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285612 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93f05eb8-490a-4d15-8c38-dd158bd4a91f-serving-cert\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285636 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bcdf9c6-be18-4bd1-a335-a8479496f678-metrics-tls\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285703 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-image-import-ca\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285719 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-client\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285736 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdvn\" (UniqueName: \"kubernetes.io/projected/e438b5e2-50cf-4f45-a46f-5e4ee62633b4-kube-api-access-vjdvn\") pod \"multus-admission-controller-857f4d67dd-8cjkz\" (UID: \"e438b5e2-50cf-4f45-a46f-5e4ee62633b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285751 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-oauth-serving-cert\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285765 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285781 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-etcd-serving-ca\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285795 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-config\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285809 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdpfl\" (UniqueName: \"kubernetes.io/projected/b5856492-e637-42d0-b018-6f205e03e897-kube-api-access-jdpfl\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285826 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a6b3333-726a-45d8-98e0-70a523f6ac6e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285842 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c56993d-85b5-46f9-9897-4e371ee09769-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zs5pc\" (UID: \"9c56993d-85b5-46f9-9897-4e371ee09769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285859 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-service-ca\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cba515e6-fcd1-4310-9630-70ca7c90c213-images\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285910 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t55qc\" (UniqueName: \"kubernetes.io/projected/ed2725c8-948a-4757-8e92-e365e49e1f3f-kube-api-access-t55qc\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.285951 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59db5196-70c4-4bd6-8406-3589d4e44599-config\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286007 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286028 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286044 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktqt\" (UniqueName: \"kubernetes.io/projected/cf4b45a0-f5a8-4d66-9d82-19e31c680447-kube-api-access-bktqt\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286060 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ff7989-b26b-4c66-991a-fecafc6f38b9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286099 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2ff7989-b26b-4c66-991a-fecafc6f38b9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286116 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286152 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba515e6-fcd1-4310-9630-70ca7c90c213-config\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286169 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0839e735-d517-4f9d-b742-45f842a9d9cc-serving-cert\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286183 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ff7989-b26b-4c66-991a-fecafc6f38b9-config\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286202 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/dac87fdb-fe64-40be-81f6-e525e378a0c7-srv-cert\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286221 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dac87fdb-fe64-40be-81f6-e525e378a0c7-profile-collector-cert\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.286242 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-config\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.287139 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.287270 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-etcd-serving-ca\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.288115 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-image-import-ca\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.288236 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-oauth-serving-cert\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.288495 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-audit\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.288666 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/317dead3-6446-45a7-a256-4ae88df539aa-machine-approver-tls\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.288776 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59db5196-70c4-4bd6-8406-3589d4e44599-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.288896 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59db5196-70c4-4bd6-8406-3589d4e44599-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.288999 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbg62\" (UniqueName: \"kubernetes.io/projected/67cf5e8a-2d35-4db9-bcb6-68492f4be191-kube-api-access-kbg62\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289103 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-service-ca\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289106 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-console-config\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289145 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289170 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgjg\" (UniqueName: \"kubernetes.io/projected/0839e735-d517-4f9d-b742-45f842a9d9cc-kube-api-access-gsgjg\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289198 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0eb62b4b-b2c4-46c8-af81-b234e5f3bf71-metrics-tls\") pod \"dns-operator-744455d44c-nfdlb\" (UID: \"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289224 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-trusted-ca-bundle\") pod 
\"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289253 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-proxy-tls\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289280 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba515e6-fcd1-4310-9630-70ca7c90c213-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289304 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-encryption-config\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289327 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0839e735-d517-4f9d-b742-45f842a9d9cc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289353 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-serving-cert\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289378 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a6b3333-726a-45d8-98e0-70a523f6ac6e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289433 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289457 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-encryption-config\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 
16:18:04.289479 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-ca\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289503 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6bcdf9c6-be18-4bd1-a335-a8479496f678-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289530 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-oauth-config\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289553 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8rhx\" (UniqueName: \"kubernetes.io/projected/cba515e6-fcd1-4310-9630-70ca7c90c213-kube-api-access-m8rhx\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289576 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bcdf9c6-be18-4bd1-a335-a8479496f678-trusted-ca\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289599 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgkc\" (UniqueName: \"kubernetes.io/projected/317dead3-6446-45a7-a256-4ae88df539aa-kube-api-access-lmgkc\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289625 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9694101-a510-49c6-8c1a-4869c9619b6a-audit-dir\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289647 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-config\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289672 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-client-ca\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289695 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5xgh\" (UniqueName: \"kubernetes.io/projected/6013c2d1-fe3a-4d8d-ba47-132ad91a6a34-kube-api-access-q5xgh\") pod \"downloads-7954f5f757-z9xhd\" (UID: \"6013c2d1-fe3a-4d8d-ba47-132ad91a6a34\") " pod="openshift-console/downloads-7954f5f757-z9xhd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289721 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqtls\" (UniqueName: \"kubernetes.io/projected/dac87fdb-fe64-40be-81f6-e525e378a0c7-kube-api-access-wqtls\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289727 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-config\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289767 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2jfd\" (UniqueName: \"kubernetes.io/projected/351d65a8-44a7-4f84-be48-1b5204422e8b-kube-api-access-n2jfd\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289794 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93f05eb8-490a-4d15-8c38-dd158bd4a91f-config\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289810 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59db5196-70c4-4bd6-8406-3589d4e44599-config\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289819 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6b3333-726a-45d8-98e0-70a523f6ac6e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289849 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289881 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93f05eb8-490a-4d15-8c38-dd158bd4a91f-trusted-ca\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289905 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd58t\" (UniqueName: \"kubernetes.io/projected/a9694101-a510-49c6-8c1a-4869c9619b6a-kube-api-access-xd58t\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289928 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9rr\" (UniqueName: \"kubernetes.io/projected/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-kube-api-access-7m9rr\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289955 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-etcd-client\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.289978 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290004 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-service-ca\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290070 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw87l\" (UniqueName: \"kubernetes.io/projected/0eb62b4b-b2c4-46c8-af81-b234e5f3bf71-kube-api-access-kw87l\") pod \"dns-operator-744455d44c-nfdlb\" (UID: \"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290098 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290126 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/cf4b45a0-f5a8-4d66-9d82-19e31c680447-audit-dir\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290168 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-serving-cert\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290193 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmz42\" (UniqueName: \"kubernetes.io/projected/c6e932c7-ad81-4830-b212-61bdde0cec5b-kube-api-access-mmz42\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290215 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-audit-policies\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290239 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cf4b45a0-f5a8-4d66-9d82-19e31c680447-node-pullsecrets\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.288726 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cba515e6-fcd1-4310-9630-70ca7c90c213-images\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290563 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.290687 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.291403 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e438b5e2-50cf-4f45-a46f-5e4ee62633b4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8cjkz\" (UID: \"e438b5e2-50cf-4f45-a46f-5e4ee62633b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.291494 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-config\") pod \"apiserver-76f77b778f-hqh5j\" (UID: 
\"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.291865 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5856492-e637-42d0-b018-6f205e03e897-serving-cert\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.291930 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf4b45a0-f5a8-4d66-9d82-19e31c680447-audit-dir\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.291932 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0839e735-d517-4f9d-b742-45f842a9d9cc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.292013 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-ca\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.292024 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-service-ca\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.292282 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9694101-a510-49c6-8c1a-4869c9619b6a-audit-dir\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.292493 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-console-config\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.292652 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cf4b45a0-f5a8-4d66-9d82-19e31c680447-node-pullsecrets\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.292834 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a9694101-a510-49c6-8c1a-4869c9619b6a-audit-policies\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: 
\"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.293567 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed2725c8-948a-4757-8e92-e365e49e1f3f-etcd-client\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.293632 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93f05eb8-490a-4d15-8c38-dd158bd4a91f-config\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.293678 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/317dead3-6446-45a7-a256-4ae88df539aa-auth-proxy-config\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.293765 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0839e735-d517-4f9d-b742-45f842a9d9cc-serving-cert\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.293802 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-serving-cert\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.293829 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2725c8-948a-4757-8e92-e365e49e1f3f-serving-cert\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.293856 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkn27\" (UniqueName: \"kubernetes.io/projected/9c56993d-85b5-46f9-9897-4e371ee09769-kube-api-access-mkn27\") pod \"cluster-samples-operator-665b6dd947-zs5pc\" (UID: \"9c56993d-85b5-46f9-9897-4e371ee09769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.293976 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6e932c7-ad81-4830-b212-61bdde0cec5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294010 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294011 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-encryption-config\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294157 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kmfx"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294185 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294612 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-client-ca\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294637 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wckzw"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294707 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6e932c7-ad81-4830-b212-61bdde0cec5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294733 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6e932c7-ad81-4830-b212-61bdde0cec5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294763 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-etcd-client\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294793 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317dead3-6446-45a7-a256-4ae88df539aa-config\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294812 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ff7989-b26b-4c66-991a-fecafc6f38b9-config\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294821 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkztf\" (UniqueName: \"kubernetes.io/projected/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-kube-api-access-hkztf\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.294971 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6b3333-726a-45d8-98e0-70a523f6ac6e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.295106 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-etcd-client\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.295356 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-serving-cert\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.295596 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-audit\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.296324 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-trusted-ca-bundle\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.296536 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-oauth-config\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.296663 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x"] 
Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.296955 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2725c8-948a-4757-8e92-e365e49e1f3f-config\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.297141 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dac87fdb-fe64-40be-81f6-e525e378a0c7-profile-collector-cert\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.297492 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93f05eb8-490a-4d15-8c38-dd158bd4a91f-trusted-ca\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.297953 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6e932c7-ad81-4830-b212-61bdde0cec5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.298123 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf4b45a0-f5a8-4d66-9d82-19e31c680447-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.298244 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a6b3333-726a-45d8-98e0-70a523f6ac6e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.298556 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-22v2l"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.298998 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-serving-cert\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.299227 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9694101-a510-49c6-8c1a-4869c9619b6a-serving-cert\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.299754 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/93f05eb8-490a-4d15-8c38-dd158bd4a91f-serving-cert\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.299779 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59db5196-70c4-4bd6-8406-3589d4e44599-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.299958 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-etcd-client\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.300163 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-22v2l"] Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.300218 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-22v2l" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.300637 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2725c8-948a-4757-8e92-e365e49e1f3f-serving-cert\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.301057 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cf4b45a0-f5a8-4d66-9d82-19e31c680447-encryption-config\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.301274 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ff7989-b26b-4c66-991a-fecafc6f38b9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.301620 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.301698 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6e932c7-ad81-4830-b212-61bdde0cec5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.301774 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba515e6-fcd1-4310-9630-70ca7c90c213-config\") pod 
\"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.301797 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.303189 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dac87fdb-fe64-40be-81f6-e525e378a0c7-srv-cert\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.304834 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba515e6-fcd1-4310-9630-70ca7c90c213-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.322402 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.355503 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.361691 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.382272 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.395608 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.395703 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbg62\" (UniqueName: \"kubernetes.io/projected/67cf5e8a-2d35-4db9-bcb6-68492f4be191-kube-api-access-kbg62\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.395745 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-proxy-tls\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 
16:18:04.395837 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.395870 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9rr\" (UniqueName: \"kubernetes.io/projected/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-kube-api-access-7m9rr\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.395896 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.396779 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.397561 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.400751 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.422921 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.442682 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.462599 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.481685 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.502281 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.521869 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.529495 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-proxy-tls\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.543201 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.562259 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.582159 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.601900 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.612644 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9c56993d-85b5-46f9-9897-4e371ee09769-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zs5pc\" (UID: \"9c56993d-85b5-46f9-9897-4e371ee09769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.621776 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.641838 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.662193 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.676045 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0eb62b4b-b2c4-46c8-af81-b234e5f3bf71-metrics-tls\") pod \"dns-operator-744455d44c-nfdlb\" (UID: \"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.682318 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.701784 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.722302 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.750285 4756 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.751853 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bcdf9c6-be18-4bd1-a335-a8479496f678-trusted-ca\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.762193 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.782683 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.790626 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bcdf9c6-be18-4bd1-a335-a8479496f678-metrics-tls\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.802087 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.822130 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.842391 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.862462 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.873095 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/317dead3-6446-45a7-a256-4ae88df539aa-machine-approver-tls\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.882575 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.887810 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/317dead3-6446-45a7-a256-4ae88df539aa-auth-proxy-config\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.901919 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.907813 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317dead3-6446-45a7-a256-4ae88df539aa-config\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.922669 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.961478 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 16:18:04 crc kubenswrapper[4756]: I1205 16:18:04.981855 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.002172 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.022612 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.042307 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.062517 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.082244 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.103537 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.123257 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.142470 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.163321 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.183227 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.200141 4756 request.go:700] Waited for 1.008678666s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.202772 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.222139 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.242272 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.262473 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 
Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.282216 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.302133 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.321722 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.342281 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.361501 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.382813 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.402189 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.421845 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.443040 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.462792 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.482913 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.501975 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.522336 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.543540 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.562845 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.581954 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.603052 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.622455 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.643519 4756
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.662042 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.682335 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.702108 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.723853 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.741602 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.763014 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.782006 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.802484 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.821591 4756 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.842826 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.864290 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.882578 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.902572 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.917070 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.917220 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:18:05 crc kubenswrapper[4756]: E1205 16:18:05.917276 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:20:07.917245645 +0000 UTC m=+268.155919167 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.917384 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.917473 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.917582 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.918496 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.922444 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.922494 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.922500 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.923929 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.942223 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.961703 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.969884 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 16:18:05 crc kubenswrapper[4756]: I1205 16:18:05.990328 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.005883 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvs2d\" (UniqueName: \"kubernetes.io/projected/93f05eb8-490a-4d15-8c38-dd158bd4a91f-kube-api-access-nvs2d\") pod \"console-operator-58897d9998-9nxrs\" (UID: \"93f05eb8-490a-4d15-8c38-dd158bd4a91f\") " pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.006373 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.028808 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22bzp\" (UniqueName: \"kubernetes.io/projected/6bcdf9c6-be18-4bd1-a335-a8479496f678-kube-api-access-22bzp\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.049160 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdpfl\" (UniqueName: \"kubernetes.io/projected/b5856492-e637-42d0-b018-6f205e03e897-kube-api-access-jdpfl\") pod \"controller-manager-879f6c89f-4jdj5\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.075510 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdvn\" (UniqueName: \"kubernetes.io/projected/e438b5e2-50cf-4f45-a46f-5e4ee62633b4-kube-api-access-vjdvn\") pod \"multus-admission-controller-857f4d67dd-8cjkz\" (UID: \"e438b5e2-50cf-4f45-a46f-5e4ee62633b4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.084922 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a6b3333-726a-45d8-98e0-70a523f6ac6e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q9vxd\" (UID: \"7a6b3333-726a-45d8-98e0-70a523f6ac6e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.106676 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t55qc\" (UniqueName: \"kubernetes.io/projected/ed2725c8-948a-4757-8e92-e365e49e1f3f-kube-api-access-t55qc\") pod \"etcd-operator-b45778765-zlcmz\" (UID: \"ed2725c8-948a-4757-8e92-e365e49e1f3f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.119924 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.125485 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgkc\" (UniqueName: \"kubernetes.io/projected/317dead3-6446-45a7-a256-4ae88df539aa-kube-api-access-lmgkc\") pod \"machine-approver-56656f9798-zn2v6\" (UID: \"317dead3-6446-45a7-a256-4ae88df539aa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.142741 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgjg\" (UniqueName: \"kubernetes.io/projected/0839e735-d517-4f9d-b742-45f842a9d9cc-kube-api-access-gsgjg\") pod \"openshift-config-operator-7777fb866f-5cxc5\" (UID: \"0839e735-d517-4f9d-b742-45f842a9d9cc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.160293 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8rhx\" (UniqueName: \"kubernetes.io/projected/cba515e6-fcd1-4310-9630-70ca7c90c213-kube-api-access-m8rhx\") pod \"machine-api-operator-5694c8668f-mlthg\" (UID: \"cba515e6-fcd1-4310-9630-70ca7c90c213\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.175315 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6bcdf9c6-be18-4bd1-a335-a8479496f678-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6twhh\" (UID: \"6bcdf9c6-be18-4bd1-a335-a8479496f678\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.187515 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.196287 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.197101 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktqt\" (UniqueName: \"kubernetes.io/projected/cf4b45a0-f5a8-4d66-9d82-19e31c680447-kube-api-access-bktqt\") pod \"apiserver-76f77b778f-hqh5j\" (UID: \"cf4b45a0-f5a8-4d66-9d82-19e31c680447\") " pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.200560 4756 request.go:700] Waited for 1.908410107s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/default/token Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.200714 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.223461 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.225603 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5xgh\" (UniqueName: \"kubernetes.io/projected/6013c2d1-fe3a-4d8d-ba47-132ad91a6a34-kube-api-access-q5xgh\") pod \"downloads-7954f5f757-z9xhd\" (UID: \"6013c2d1-fe3a-4d8d-ba47-132ad91a6a34\") " pod="openshift-console/downloads-7954f5f757-z9xhd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.236393 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqtls\" (UniqueName: \"kubernetes.io/projected/dac87fdb-fe64-40be-81f6-e525e378a0c7-kube-api-access-wqtls\") pod \"catalog-operator-68c6474976-k7kwr\" (UID: \"dac87fdb-fe64-40be-81f6-e525e378a0c7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.261808 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw87l\" (UniqueName: \"kubernetes.io/projected/0eb62b4b-b2c4-46c8-af81-b234e5f3bf71-kube-api-access-kw87l\") pod \"dns-operator-744455d44c-nfdlb\" (UID: \"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.276070 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.279684 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2jfd\" (UniqueName: \"kubernetes.io/projected/351d65a8-44a7-4f84-be48-1b5204422e8b-kube-api-access-n2jfd\") pod \"console-f9d7485db-wld6x\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.299835 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmz42\" (UniqueName: \"kubernetes.io/projected/c6e932c7-ad81-4830-b212-61bdde0cec5b-kube-api-access-mmz42\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.312022 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.320809 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkztf\" (UniqueName: \"kubernetes.io/projected/a229dbb4-017d-4ef2-bcd0-6c5585e3c497-kube-api-access-hkztf\") pod \"openshift-controller-manager-operator-756b6f6bc6-9xwp8\" (UID: \"a229dbb4-017d-4ef2-bcd0-6c5585e3c497\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.322619 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.339892 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd58t\" (UniqueName: \"kubernetes.io/projected/a9694101-a510-49c6-8c1a-4869c9619b6a-kube-api-access-xd58t\") pod \"apiserver-7bbb656c7d-4zr8n\" (UID: \"a9694101-a510-49c6-8c1a-4869c9619b6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.362915 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2ff7989-b26b-4c66-991a-fecafc6f38b9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7n2rb\" (UID: \"d2ff7989-b26b-4c66-991a-fecafc6f38b9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.363237 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.383555 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkn27\" (UniqueName: \"kubernetes.io/projected/9c56993d-85b5-46f9-9897-4e371ee09769-kube-api-access-mkn27\") pod \"cluster-samples-operator-665b6dd947-zs5pc\" (UID: \"9c56993d-85b5-46f9-9897-4e371ee09769\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.405477 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6e932c7-ad81-4830-b212-61bdde0cec5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gn5x4\" (UID: \"c6e932c7-ad81-4830-b212-61bdde0cec5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.421493 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59db5196-70c4-4bd6-8406-3589d4e44599-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6zc4l\" (UID: \"59db5196-70c4-4bd6-8406-3589d4e44599\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.423598 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.435648 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.447386 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-z9xhd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.447628 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.463998 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.466680 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.479883 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.480148 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.483796 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.487866 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:06 crc kubenswrapper[4756]: W1205 16:18:06.496862 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-f53894001958e4643985fd1643cdd95ddbe02eaf52d2e33f3da74952c701f1f9 WatchSource:0}: Error finding container f53894001958e4643985fd1643cdd95ddbe02eaf52d2e33f3da74952c701f1f9: Status 404 returned error can't find the container with id f53894001958e4643985fd1643cdd95ddbe02eaf52d2e33f3da74952c701f1f9 Dec 05 16:18:06 crc kubenswrapper[4756]: W1205 16:18:06.510488 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-bdbe8716b116526192744a9e083fa80b17e01d1a33e60e1b19db8ef9a02373bb WatchSource:0}: Error finding container bdbe8716b116526192744a9e083fa80b17e01d1a33e60e1b19db8ef9a02373bb: Status 404 returned error can't find the container with id bdbe8716b116526192744a9e083fa80b17e01d1a33e60e1b19db8ef9a02373bb Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.522585 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbg62\" (UniqueName: \"kubernetes.io/projected/67cf5e8a-2d35-4db9-bcb6-68492f4be191-kube-api-access-kbg62\") pod \"marketplace-operator-79b997595-dpqm9\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.534796 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.551440 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9nxrs"] Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.551979 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m9rr\" (UniqueName: \"kubernetes.io/projected/8b44c87a-bbab-469a-b5d7-2c8a0a9f05da-kube-api-access-7m9rr\") pod \"machine-config-controller-84d6567774-t6pc7\" (UID: \"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:06 crc kubenswrapper[4756]: W1205 16:18:06.571711 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-006478453d140104da5957428867a3af4e97f5f4c8a5c579b830cf3317b15aaf WatchSource:0}: Error finding container 006478453d140104da5957428867a3af4e97f5f4c8a5c579b830cf3317b15aaf: Status 404 returned error can't find the container with id 006478453d140104da5957428867a3af4e97f5f4c8a5c579b830cf3317b15aaf Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.600963 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.606271 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.626676 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627590 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-default-certificate\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627649 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627673 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627691 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627709 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627727 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-images\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627748 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-metrics-certs\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627776 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-config\") pod \"authentication-operator-69f744f599-w5k56\" (UID: 
\"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627793 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-serving-cert\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627817 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627849 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-certificates\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627874 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627897 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627924 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627943 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-service-ca-bundle\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627965 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.627985 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628007 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628028 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-stats-auth\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628049 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628080 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628118 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628144 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-trusted-ca\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628167 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lvfqz\" (UniqueName: \"kubernetes.io/projected/597869bc-03e8-40fd-8f80-c156102a3dc7-kube-api-access-lvfqz\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628194 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh59k\" (UniqueName: \"kubernetes.io/projected/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-kube-api-access-hh59k\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628220 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9jtw\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-kube-api-access-v9jtw\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628241 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-srv-cert\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628264 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-tls\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628288 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-proxy-tls\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628311 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628332 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-service-ca-bundle\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628356 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-bound-sa-token\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628377 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-policies\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628397 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628429 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628450 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtdk\" (UniqueName: \"kubernetes.io/projected/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-kube-api-access-9qtdk\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628470 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v765\" (UniqueName: \"kubernetes.io/projected/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-kube-api-access-9v765\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628493 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmdv\" (UniqueName: \"kubernetes.io/projected/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-kube-api-access-4hmdv\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.628513 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-dir\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: E1205 16:18:06.629144 4756 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.129127721 +0000 UTC m=+147.367801233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.642168 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.657257 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh"] Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.674466 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.697013 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5"] Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.705157 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zlcmz"] Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.710254 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4jdj5"] Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.730168 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:06 crc kubenswrapper[4756]: E1205 16:18:06.730363 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.230328889 +0000 UTC m=+147.469002401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731069 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731168 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/598b5e4e-bbed-4011-ab04-f152e2905431-cert\") pod \"ingress-canary-22v2l\" (UID: \"598b5e4e-bbed-4011-ab04-f152e2905431\") " pod="openshift-ingress-canary/ingress-canary-22v2l" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731202 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpz9m\" (UniqueName: \"kubernetes.io/projected/875c68c0-137f-466e-97ff-3c2f4bf90401-kube-api-access-rpz9m\") pod \"migrator-59844c95c7-lcrs9\" (UID: \"875c68c0-137f-466e-97ff-3c2f4bf90401\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731266 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxq7w\" (UniqueName: \"kubernetes.io/projected/a7002a42-041f-4b43-a998-659fa4a3da84-kube-api-access-fxq7w\") pod \"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731371 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7002a42-041f-4b43-a998-659fa4a3da84-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731510 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbh2f\" (UniqueName: \"kubernetes.io/projected/2f1a7a84-f798-4242-b060-91a6c2f50a21-kube-api-access-fbh2f\") pod \"package-server-manager-789f6589d5-6zs9x\" (UID: \"2f1a7a84-f798-4242-b060-91a6c2f50a21\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x"
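Editor's note: comparing this TearDown failure (durationBeforeRetry 500ms) with the identical one at 16:18:05.917276 (durationBeforeRetry 2m2s) shows the volume manager's exponential backoff: retry delays for a failing operation start small and double per failure up to a cap, which is why the parked retry times differ. A minimal sketch of that retry shape, assuming k8s.io/apimachinery's wait package with illustrative parameters (the kubelet's own backoff lives in its nested pending-operations bookkeeping, not in this helper):

    package main

    import (
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
        // Delays start at 500ms and double on each failed attempt, echoing
        // the durationBeforeRetry progression in the log (500ms ... 2m2s cap).
        backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2, Steps: 5}
        attempt := 0
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            attempt++
            fmt.Printf("attempt %d at %s\n", attempt, time.Now().Format("15:04:05.000"))
            return false, nil // simulate an operation that keeps failing
        })
        fmt.Println("stopped:", err) // a timeout error once Steps are exhausted
    }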
pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731678 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731845 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f1a7a84-f798-4242-b060-91a6c2f50a21-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6zs9x\" (UID: \"2f1a7a84-f798-4242-b060-91a6c2f50a21\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731930 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-service-ca-bundle\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.731981 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79fba350-e2c8-44c0-8151-e1e07559bfd7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.732234 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733042 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-mountpoint-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733090 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733115 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733147 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733178 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-stats-auth\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733208 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fslf2\" (UniqueName: \"kubernetes.io/projected/598b5e4e-bbed-4011-ab04-f152e2905431-kube-api-access-fslf2\") pod \"ingress-canary-22v2l\" (UID: \"598b5e4e-bbed-4011-ab04-f152e2905431\") " pod="openshift-ingress-canary/ingress-canary-22v2l" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733226 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75q94\" (UniqueName: \"kubernetes.io/projected/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-kube-api-access-75q94\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733247 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733297 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733324 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733340 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65fe8661-28bd-474d-aec9-a63801393725-apiservice-cert\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733374 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7eef861-8bb0-429e-b359-372213f7769b-signing-key\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733407 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-trusted-ca\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733444 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvfqz\" (UniqueName: \"kubernetes.io/projected/597869bc-03e8-40fd-8f80-c156102a3dc7-kube-api-access-lvfqz\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733483 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh59k\" (UniqueName: \"kubernetes.io/projected/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-kube-api-access-hh59k\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733502 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-config-volume\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733738 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-service-ca-bundle\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733800 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9jtw\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-kube-api-access-v9jtw\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.733828 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-client-ca\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.734838 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.741468 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-srv-cert\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.741599 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5tdn\" (UniqueName: \"kubernetes.io/projected/65fe8661-28bd-474d-aec9-a63801393725-kube-api-access-q5tdn\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.741794 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e132f7-f4ea-426c-b094-b5eea4577d5f-secret-volume\") pod \"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.741870 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-tls\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.741932 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-proxy-tls\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742001 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfbnm\" (UniqueName: \"kubernetes.io/projected/3708be10-919d-4500-8325-90de029cbb66-kube-api-access-lfbnm\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742101 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742174 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742238 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-socket-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742286 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-bound-sa-token\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742345 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-policies\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742394 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742458 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742531 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtdk\" (UniqueName: \"kubernetes.io/projected/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-kube-api-access-9qtdk\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742563 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v765\" (UniqueName: \"kubernetes.io/projected/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-kube-api-access-9v765\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742619 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7eef861-8bb0-429e-b359-372213f7769b-signing-cabundle\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742688 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/996b39af-143d-4f4a-affc-b726debbfb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-67x7d\" (UID: \"996b39af-143d-4f4a-affc-b726debbfb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.742957 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmdv\" (UniqueName: \"kubernetes.io/projected/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-kube-api-access-4hmdv\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: E1205 16:18:06.742988 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.242759531 +0000 UTC m=+147.481433043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.746984 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-trusted-ca\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.751314 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-service-ca-bundle\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.755087 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.758249 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.743096 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsqk2\" (UniqueName: \"kubernetes.io/projected/79fba350-e2c8-44c0-8151-e1e07559bfd7-kube-api-access-hsqk2\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.770926 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e132f7-f4ea-426c-b094-b5eea4577d5f-config-volume\") pod \"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.770969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-dir\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.771082 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxb8d\" (UniqueName: \"kubernetes.io/projected/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-kube-api-access-dxb8d\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.772016 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.773466 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.774207 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.771131 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-default-certificate\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" 
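The repeating volume failures in the entries above and below all trace to a single condition: the CSI driver kubevirt.io.hostpath-provisioner is not yet registered with this kubelet, because the csi-hostpathplugin-wckzw pod that serves it (note its socket-dir, registration-dir, plugins-dir, and csi-data-dir host-path volumes being attached in these same entries) has not started. Until the driver announces itself over the kubelet's plugin-registration socket, every mount or unmount touching pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 fails with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers". One way to confirm registration from outside the node is to read the node's CSINode object, which the kubelet updates only after a driver registers. A minimal client-go sketch follows; the kubeconfig path is an assumption, while the node name "crc" is taken from this capture:

```go
// csidrivers.go: list the CSI drivers currently registered on a node by
// reading its CSINode object. Sketch only; the kubeconfig path below is
// an assumed location, and "crc" is the node name seen in this log.
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // assumed path
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// The kubelet maintains CSINode; a driver appears in .spec.drivers only
	// after it has registered over the plugin-registration socket.
	cn, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, d := range cn.Spec.Drivers {
		fmt.Println(d.Name) // expect kubevirt.io.hostpath-provisioner once registered
	}
}
```

An empty driver list while csi-hostpathplugin-wckzw is still pending is consistent with every retry in this window failing the same way.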
Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.776995 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.778876 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-policies\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784160 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-registration-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784134 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-dir\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784215 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-csi-data-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784270 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784294 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bfgq\" (UniqueName: \"kubernetes.io/projected/f2e132f7-f4ea-426c-b094-b5eea4577d5f-kube-api-access-5bfgq\") pod \"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784315 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784331 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784366 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784385 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-images\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784442 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-metrics-certs\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784461 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784478 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/65fe8661-28bd-474d-aec9-a63801393725-tmpfs\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784508 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-config\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784525 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-serving-cert\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784569 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-node-bootstrap-token\") pod 
\"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784586 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkrj7\" (UniqueName: \"kubernetes.io/projected/5ea105de-fb15-46c7-b937-f4738b7fd8fd-kube-api-access-zkrj7\") pod \"service-ca-operator-777779d784-cpmj6\" (UID: \"5ea105de-fb15-46c7-b937-f4738b7fd8fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784604 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784624 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-plugins-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784638 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7002a42-041f-4b43-a998-659fa4a3da84-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784656 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvw2w\" (UniqueName: \"kubernetes.io/projected/996b39af-143d-4f4a-affc-b726debbfb0c-kube-api-access-rvw2w\") pod \"control-plane-machine-set-operator-78cbb6b69f-67x7d\" (UID: \"996b39af-143d-4f4a-affc-b726debbfb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784715 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-certificates\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784760 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79fba350-e2c8-44c0-8151-e1e07559bfd7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784787 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/65fe8661-28bd-474d-aec9-a63801393725-webhook-cert\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784826 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-certs\") pod \"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784844 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ea105de-fb15-46c7-b937-f4738b7fd8fd-serving-cert\") pod \"service-ca-operator-777779d784-cpmj6\" (UID: \"5ea105de-fb15-46c7-b937-f4738b7fd8fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784870 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-metrics-tls\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784913 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-config\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784942 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.784972 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwk4c\" (UniqueName: \"kubernetes.io/projected/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-kube-api-access-kwk4c\") pod \"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.785048 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccd4g\" (UniqueName: \"kubernetes.io/projected/d7eef861-8bb0-429e-b359-372213f7769b-kube-api-access-ccd4g\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.786075 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-config\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.786355 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-certificates\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.786489 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-stats-auth\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.786692 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmdv\" (UniqueName: \"kubernetes.io/projected/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-kube-api-access-4hmdv\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.786856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-srv-cert\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.787379 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-images\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.788589 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.788750 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-tls\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.788827 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-proxy-tls\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.789021 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.789971 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.790172 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.791565 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.792136 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-metrics-certs\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.792275 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8cjkz"] Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.800237 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.813793 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.813806 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.814305 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvfqz\" (UniqueName: \"kubernetes.io/projected/597869bc-03e8-40fd-8f80-c156102a3dc7-kube-api-access-lvfqz\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.816273 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-default-certificate\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.816716 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bd595eb-93e9-4ecf-9aad-b57eb902bcf0-serving-cert\") pod \"authentication-operator-69f744f599-w5k56\" (UID: \"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.818702 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k8whp\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.818885 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh59k\" (UniqueName: \"kubernetes.io/projected/6ad5d1b1-4e12-430f-91f1-70e8fd975a09-kube-api-access-hh59k\") pod \"router-default-5444994796-pt8fk\" (UID: \"6ad5d1b1-4e12-430f-91f1-70e8fd975a09\") " pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:06 crc kubenswrapper[4756]: W1205 16:18:06.833398 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcdf9c6_be18_4bd1_a335_a8479496f678.slice/crio-f05120158aae8f153bdf711d34015fc3c2a64fdb31ba59097aadb7eb39bcccd7 WatchSource:0}: Error finding container f05120158aae8f153bdf711d34015fc3c2a64fdb31ba59097aadb7eb39bcccd7: Status 404 returned error can't find the container with id f05120158aae8f153bdf711d34015fc3c2a64fdb31ba59097aadb7eb39bcccd7 Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.840876 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9jtw\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-kube-api-access-v9jtw\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: W1205 16:18:06.863150 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0839e735_d517_4f9d_b742_45f842a9d9cc.slice/crio-21e2ed4c7a13bef76a6bb6dc9e8932c4ababd500528a6eb42f3eef662b3e213f WatchSource:0}: Error finding container 21e2ed4c7a13bef76a6bb6dc9e8932c4ababd500528a6eb42f3eef662b3e213f: Status 404 returned error can't find the container with id 21e2ed4c7a13bef76a6bb6dc9e8932c4ababd500528a6eb42f3eef662b3e213f Dec 05 16:18:06 crc kubenswrapper[4756]: W1205 
16:18:06.874390 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode438b5e2_50cf_4f45_a46f_5e4ee62633b4.slice/crio-2c64b2d9615868baf1c99fcce37e0e74a35456a9b9c960940160cad77fdfdd8a WatchSource:0}: Error finding container 2c64b2d9615868baf1c99fcce37e0e74a35456a9b9c960940160cad77fdfdd8a: Status 404 returned error can't find the container with id 2c64b2d9615868baf1c99fcce37e0e74a35456a9b9c960940160cad77fdfdd8a Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.885360 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtdk\" (UniqueName: \"kubernetes.io/projected/0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b-kube-api-access-9qtdk\") pod \"olm-operator-6b444d44fb-zmrlm\" (UID: \"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886494 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886711 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7002a42-041f-4b43-a998-659fa4a3da84-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886742 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbh2f\" (UniqueName: \"kubernetes.io/projected/2f1a7a84-f798-4242-b060-91a6c2f50a21-kube-api-access-fbh2f\") pod \"package-server-manager-789f6589d5-6zs9x\" (UID: \"2f1a7a84-f798-4242-b060-91a6c2f50a21\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886767 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ea105de-fb15-46c7-b937-f4738b7fd8fd-config\") pod \"service-ca-operator-777779d784-cpmj6\" (UID: \"5ea105de-fb15-46c7-b937-f4738b7fd8fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886803 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f1a7a84-f798-4242-b060-91a6c2f50a21-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6zs9x\" (UID: \"2f1a7a84-f798-4242-b060-91a6c2f50a21\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886824 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79fba350-e2c8-44c0-8151-e1e07559bfd7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 
16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886848 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-mountpoint-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886872 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fslf2\" (UniqueName: \"kubernetes.io/projected/598b5e4e-bbed-4011-ab04-f152e2905431-kube-api-access-fslf2\") pod \"ingress-canary-22v2l\" (UID: \"598b5e4e-bbed-4011-ab04-f152e2905431\") " pod="openshift-ingress-canary/ingress-canary-22v2l" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886893 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75q94\" (UniqueName: \"kubernetes.io/projected/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-kube-api-access-75q94\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886932 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65fe8661-28bd-474d-aec9-a63801393725-apiservice-cert\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886951 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7eef861-8bb0-429e-b359-372213f7769b-signing-key\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.886975 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-config-volume\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-client-ca\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887021 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5tdn\" (UniqueName: \"kubernetes.io/projected/65fe8661-28bd-474d-aec9-a63801393725-kube-api-access-q5tdn\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887047 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e132f7-f4ea-426c-b094-b5eea4577d5f-secret-volume\") pod 
\"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887066 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfbnm\" (UniqueName: \"kubernetes.io/projected/3708be10-919d-4500-8325-90de029cbb66-kube-api-access-lfbnm\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887085 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-socket-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887119 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7eef861-8bb0-429e-b359-372213f7769b-signing-cabundle\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887144 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/996b39af-143d-4f4a-affc-b726debbfb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-67x7d\" (UID: \"996b39af-143d-4f4a-affc-b726debbfb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887168 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsqk2\" (UniqueName: \"kubernetes.io/projected/79fba350-e2c8-44c0-8151-e1e07559bfd7-kube-api-access-hsqk2\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887187 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e132f7-f4ea-426c-b094-b5eea4577d5f-config-volume\") pod \"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887208 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxb8d\" (UniqueName: \"kubernetes.io/projected/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-kube-api-access-dxb8d\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887228 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bfgq\" (UniqueName: \"kubernetes.io/projected/f2e132f7-f4ea-426c-b094-b5eea4577d5f-kube-api-access-5bfgq\") pod \"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 
16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-registration-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887260 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-csi-data-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887282 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887301 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/65fe8661-28bd-474d-aec9-a63801393725-tmpfs\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887328 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-node-bootstrap-token\") pod \"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887348 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkrj7\" (UniqueName: \"kubernetes.io/projected/5ea105de-fb15-46c7-b937-f4738b7fd8fd-kube-api-access-zkrj7\") pod \"service-ca-operator-777779d784-cpmj6\" (UID: \"5ea105de-fb15-46c7-b937-f4738b7fd8fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887369 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-plugins-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887390 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7002a42-041f-4b43-a998-659fa4a3da84-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887410 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvw2w\" (UniqueName: 
\"kubernetes.io/projected/996b39af-143d-4f4a-affc-b726debbfb0c-kube-api-access-rvw2w\") pod \"control-plane-machine-set-operator-78cbb6b69f-67x7d\" (UID: \"996b39af-143d-4f4a-affc-b726debbfb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887461 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79fba350-e2c8-44c0-8151-e1e07559bfd7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887488 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-certs\") pod \"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887505 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ea105de-fb15-46c7-b937-f4738b7fd8fd-serving-cert\") pod \"service-ca-operator-777779d784-cpmj6\" (UID: \"5ea105de-fb15-46c7-b937-f4738b7fd8fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887523 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65fe8661-28bd-474d-aec9-a63801393725-webhook-cert\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887541 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-metrics-tls\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887562 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-config\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887584 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwk4c\" (UniqueName: \"kubernetes.io/projected/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-kube-api-access-kwk4c\") pod \"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887607 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccd4g\" (UniqueName: \"kubernetes.io/projected/d7eef861-8bb0-429e-b359-372213f7769b-kube-api-access-ccd4g\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" 
Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887627 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/598b5e4e-bbed-4011-ab04-f152e2905431-cert\") pod \"ingress-canary-22v2l\" (UID: \"598b5e4e-bbed-4011-ab04-f152e2905431\") " pod="openshift-ingress-canary/ingress-canary-22v2l" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887646 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpz9m\" (UniqueName: \"kubernetes.io/projected/875c68c0-137f-466e-97ff-3c2f4bf90401-kube-api-access-rpz9m\") pod \"migrator-59844c95c7-lcrs9\" (UID: \"875c68c0-137f-466e-97ff-3c2f4bf90401\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.887669 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxq7w\" (UniqueName: \"kubernetes.io/projected/a7002a42-041f-4b43-a998-659fa4a3da84-kube-api-access-fxq7w\") pod \"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: E1205 16:18:06.887953 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.387927326 +0000 UTC m=+147.626600838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.889840 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v765\" (UniqueName: \"kubernetes.io/projected/24d8224d-0fe6-47f3-ae6b-14ea099aa69e-kube-api-access-9v765\") pod \"machine-config-operator-74547568cd-d9gdk\" (UID: \"24d8224d-0fe6-47f3-ae6b-14ea099aa69e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.889975 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ea105de-fb15-46c7-b937-f4738b7fd8fd-config\") pod \"service-ca-operator-777779d784-cpmj6\" (UID: \"5ea105de-fb15-46c7-b937-f4738b7fd8fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.890239 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-socket-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.890707 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-client-ca\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.890850 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7eef861-8bb0-429e-b359-372213f7769b-signing-cabundle\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.895101 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd"] Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.895494 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f1a7a84-f798-4242-b060-91a6c2f50a21-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6zs9x\" (UID: \"2f1a7a84-f798-4242-b060-91a6c2f50a21\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.896281 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79fba350-e2c8-44c0-8151-e1e07559bfd7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.896334 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e132f7-f4ea-426c-b094-b5eea4577d5f-secret-volume\") pod \"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.896392 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-mountpoint-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.896537 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-plugins-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.896805 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-registration-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.897172 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7002a42-041f-4b43-a998-659fa4a3da84-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.897541 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/65fe8661-28bd-474d-aec9-a63801393725-tmpfs\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.897635 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3708be10-919d-4500-8325-90de029cbb66-csi-data-dir\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.902358 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-config\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.910349 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-config-volume\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.910650 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7002a42-041f-4b43-a998-659fa4a3da84-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.911336 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.912615 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e132f7-f4ea-426c-b094-b5eea4577d5f-config-volume\") pod \"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.923653 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/598b5e4e-bbed-4011-ab04-f152e2905431-cert\") pod \"ingress-canary-22v2l\" (UID: \"598b5e4e-bbed-4011-ab04-f152e2905431\") " pod="openshift-ingress-canary/ingress-canary-22v2l" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.924112 4756 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-metrics-tls\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.924840 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/996b39af-143d-4f4a-affc-b726debbfb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-67x7d\" (UID: \"996b39af-143d-4f4a-affc-b726debbfb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.924893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-node-bootstrap-token\") pod \"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.925009 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7eef861-8bb0-429e-b359-372213f7769b-signing-key\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.925280 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65fe8661-28bd-474d-aec9-a63801393725-apiservice-cert\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.926167 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-bound-sa-token\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.927261 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-certs\") pod \"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.931241 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79fba350-e2c8-44c0-8151-e1e07559bfd7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.935856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65fe8661-28bd-474d-aec9-a63801393725-webhook-cert\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:06 crc 
kubenswrapper[4756]: I1205 16:18:06.940847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ea105de-fb15-46c7-b937-f4738b7fd8fd-serving-cert\") pod \"service-ca-operator-777779d784-cpmj6\" (UID: \"5ea105de-fb15-46c7-b937-f4738b7fd8fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.943736 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" event={"ID":"317dead3-6446-45a7-a256-4ae88df539aa","Type":"ContainerStarted","Data":"6c4251dcc9b75edf3392ec0443fa66cb53fcc3f511d53d750ed9f96c62e92b90"} Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.943793 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" event={"ID":"317dead3-6446-45a7-a256-4ae88df539aa","Type":"ContainerStarted","Data":"304f1339c09a6ad174d53d7f1621ed9956229bdac723ed088b2020456d5e9476"} Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.950355 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxq7w\" (UniqueName: \"kubernetes.io/projected/a7002a42-041f-4b43-a998-659fa4a3da84-kube-api-access-fxq7w\") pod \"kube-storage-version-migrator-operator-b67b599dd-7zphw\" (UID: \"a7002a42-041f-4b43-a998-659fa4a3da84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.952116 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6cf8cd42fd61dddb52787f7433e8dbf0429bd83bc9b85e9dfa197c18456451fb"} Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.952160 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f53894001958e4643985fd1643cdd95ddbe02eaf52d2e33f3da74952c701f1f9"} Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.960274 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"33562aa9e83260de2d10cae9006e1124971f50ba994bf5ed3aff640d468ca691"} Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.960319 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bdbe8716b116526192744a9e083fa80b17e01d1a33e60e1b19db8ef9a02373bb"} Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.970468 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxb8d\" (UniqueName: \"kubernetes.io/projected/8cb1ac56-8d3f-4dce-be45-b9ef2e859e05-kube-api-access-dxb8d\") pod \"dns-default-6kmfx\" (UID: \"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05\") " pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.979819 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9nxrs" 
event={"ID":"93f05eb8-490a-4d15-8c38-dd158bd4a91f","Type":"ContainerStarted","Data":"98d9c5cda5e814a65990e320beba226ba4be23a1c84a2338cd67553c439292dc"} Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.980154 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.983438 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbh2f\" (UniqueName: \"kubernetes.io/projected/2f1a7a84-f798-4242-b060-91a6c2f50a21-kube-api-access-fbh2f\") pod \"package-server-manager-789f6589d5-6zs9x\" (UID: \"2f1a7a84-f798-4242-b060-91a6c2f50a21\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.991523 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"006478453d140104da5957428867a3af4e97f5f4c8a5c579b830cf3317b15aaf"} Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.991993 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:18:06 crc kubenswrapper[4756]: I1205 16:18:06.992090 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:06 crc kubenswrapper[4756]: E1205 16:18:06.992753 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.492726156 +0000 UTC m=+147.731399838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:06.998917 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" event={"ID":"ed2725c8-948a-4757-8e92-e365e49e1f3f","Type":"ContainerStarted","Data":"bfd69fb37f5f3214b749129f246f224293b0af17bf4be6b8e22a4d30199786b6"} Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.004327 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5tdn\" (UniqueName: \"kubernetes.io/projected/65fe8661-28bd-474d-aec9-a63801393725-kube-api-access-q5tdn\") pod \"packageserver-d55dfcdfc-dsrxv\" (UID: \"65fe8661-28bd-474d-aec9-a63801393725\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.007603 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.027743 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvw2w\" (UniqueName: \"kubernetes.io/projected/996b39af-143d-4f4a-affc-b726debbfb0c-kube-api-access-rvw2w\") pod \"control-plane-machine-set-operator-78cbb6b69f-67x7d\" (UID: \"996b39af-143d-4f4a-affc-b726debbfb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.052380 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.062949 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75q94\" (UniqueName: \"kubernetes.io/projected/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-kube-api-access-75q94\") pod \"route-controller-manager-6576b87f9c-q296n\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.070019 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfbnm\" (UniqueName: \"kubernetes.io/projected/3708be10-919d-4500-8325-90de029cbb66-kube-api-access-lfbnm\") pod \"csi-hostpathplugin-wckzw\" (UID: \"3708be10-919d-4500-8325-90de029cbb66\") " pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.092340 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fslf2\" (UniqueName: \"kubernetes.io/projected/598b5e4e-bbed-4011-ab04-f152e2905431-kube-api-access-fslf2\") pod \"ingress-canary-22v2l\" (UID: \"598b5e4e-bbed-4011-ab04-f152e2905431\") " pod="openshift-ingress-canary/ingress-canary-22v2l" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.095033 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.095872 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.595847928 +0000 UTC m=+147.834521440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.107440 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkrj7\" (UniqueName: \"kubernetes.io/projected/5ea105de-fb15-46c7-b937-f4738b7fd8fd-kube-api-access-zkrj7\") pod \"service-ca-operator-777779d784-cpmj6\" (UID: \"5ea105de-fb15-46c7-b937-f4738b7fd8fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.119787 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.121893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bfgq\" (UniqueName: \"kubernetes.io/projected/f2e132f7-f4ea-426c-b094-b5eea4577d5f-kube-api-access-5bfgq\") pod \"collect-profiles-29415855-9cgsd\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.138183 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsqk2\" (UniqueName: \"kubernetes.io/projected/79fba350-e2c8-44c0-8151-e1e07559bfd7-kube-api-access-hsqk2\") pod \"openshift-apiserver-operator-796bbdcf4f-v4kkz\" (UID: \"79fba350-e2c8-44c0-8151-e1e07559bfd7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.152038 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.158740 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.171826 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.173843 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.180703 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwk4c\" (UniqueName: \"kubernetes.io/projected/ec9fa804-6e0c-4375-b1c4-7a467da3a43a-kube-api-access-kwk4c\") pod \"machine-config-server-8jwvd\" (UID: \"ec9fa804-6e0c-4375-b1c4-7a467da3a43a\") " pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.181405 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.184285 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.184749 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.194644 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccd4g\" (UniqueName: \"kubernetes.io/projected/d7eef861-8bb0-429e-b359-372213f7769b-kube-api-access-ccd4g\") pod \"service-ca-9c57cc56f-stdrz\" (UID: \"d7eef861-8bb0-429e-b359-372213f7769b\") " pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.196800 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.198866 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.199213 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.699197768 +0000 UTC m=+147.937871290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.199609 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.204705 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpz9m\" (UniqueName: \"kubernetes.io/projected/875c68c0-137f-466e-97ff-3c2f4bf90401-kube-api-access-rpz9m\") pod \"migrator-59844c95c7-lcrs9\" (UID: \"875c68c0-137f-466e-97ff-3c2f4bf90401\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.248647 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wckzw" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.265472 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.265600 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8jwvd" Dec 05 16:18:07 crc kubenswrapper[4756]: W1205 16:18:07.267791 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ad5d1b1_4e12_430f_91f1_70e8fd975a09.slice/crio-f350c4f1ea4a92516c903e51d4d24a7145a1a701e9224607ecc8ecf7eec03b0e WatchSource:0}: Error finding container f350c4f1ea4a92516c903e51d4d24a7145a1a701e9224607ecc8ecf7eec03b0e: Status 404 returned error can't find the container with id f350c4f1ea4a92516c903e51d4d24a7145a1a701e9224607ecc8ecf7eec03b0e Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.267823 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.268742 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.273559 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mlthg"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.274938 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-22v2l" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.298307 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.299219 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.299371 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.799347106 +0000 UTC m=+148.038020618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.299454 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.299741 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 16:18:07.799730327 +0000 UTC m=+148.038403829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.357150 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z9xhd"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.367691 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nfdlb"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.381048 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wld6x"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.385688 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.400118 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.400897 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dpqm9"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.400943 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.401042 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.901024628 +0000 UTC m=+148.139698140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.401186 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.401982 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:07.901966664 +0000 UTC m=+148.140640176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.414268 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hqh5j"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.424124 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.433075 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.445469 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" Dec 05 16:18:07 crc kubenswrapper[4756]: W1205 16:18:07.449951 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6e932c7_ad81_4830_b212_61bdde0cec5b.slice/crio-99dc1fc1ec52df06b3ea8cdfbd90db45a3bd455a044ba43663990aef9de80b2d WatchSource:0}: Error finding container 99dc1fc1ec52df06b3ea8cdfbd90db45a3bd455a044ba43663990aef9de80b2d: Status 404 returned error can't find the container with id 99dc1fc1ec52df06b3ea8cdfbd90db45a3bd455a044ba43663990aef9de80b2d Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.465300 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.493968 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.501952 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.502374 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.00236091 +0000 UTC m=+148.241034422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: W1205 16:18:07.502554 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda229dbb4_017d_4ef2_bcd0_6c5585e3c497.slice/crio-98b01ff2f118ceb40297925845fb909d0ec1f1e56b494bf0724f5e6da65071ea WatchSource:0}: Error finding container 98b01ff2f118ceb40297925845fb909d0ec1f1e56b494bf0724f5e6da65071ea: Status 404 returned error can't find the container with id 98b01ff2f118ceb40297925845fb909d0ec1f1e56b494bf0724f5e6da65071ea Dec 05 16:18:07 crc kubenswrapper[4756]: W1205 16:18:07.513575 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb62b4b_b2c4_46c8_af81_b234e5f3bf71.slice/crio-834f77a8190ff068e41472ae7e9ff92ff415381c506e525461b9feaefb6a9b32 WatchSource:0}: Error finding container 834f77a8190ff068e41472ae7e9ff92ff415381c506e525461b9feaefb6a9b32: Status 404 returned error can't find the container with id 834f77a8190ff068e41472ae7e9ff92ff415381c506e525461b9feaefb6a9b32 Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.604395 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.604762 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.104751112 +0000 UTC m=+148.343424624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.704983 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.705134 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 16:18:08.205111246 +0000 UTC m=+148.443784758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.705223 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.705552 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.205536058 +0000 UTC m=+148.444209570 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.806250 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.806483 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.306456788 +0000 UTC m=+148.545130300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.806705 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.807033 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.307019314 +0000 UTC m=+148.545692816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.814410 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8whp"] Dec 05 16:18:07 crc kubenswrapper[4756]: W1205 16:18:07.857160 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67cf5e8a_2d35_4db9_bcb6_68492f4be191.slice/crio-8636ff4b753c0c885e1207bb62b72b4943c6ff51c0226bb1528f33d709ae103e WatchSource:0}: Error finding container 8636ff4b753c0c885e1207bb62b72b4943c6ff51c0226bb1528f33d709ae103e: Status 404 returned error can't find the container with id 8636ff4b753c0c885e1207bb62b72b4943c6ff51c0226bb1528f33d709ae103e Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.866495 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w5k56"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.898632 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.908485 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:07 crc kubenswrapper[4756]: E1205 16:18:07.908836 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 16:18:08.40881531 +0000 UTC m=+148.647488822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.956841 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm"] Dec 05 16:18:07 crc kubenswrapper[4756]: I1205 16:18:07.988298 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.001558 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.009858 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.010312 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.510280945 +0000 UTC m=+148.748954457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.015659 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.016688 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" event={"ID":"a9694101-a510-49c6-8c1a-4869c9619b6a","Type":"ContainerStarted","Data":"41f5cb22d7da2fd3c1f92560926f81191913ef30adb976b40c02a0efede28182"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.019377 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9nxrs" event={"ID":"93f05eb8-490a-4d15-8c38-dd158bd4a91f","Type":"ContainerStarted","Data":"d3048f6456deb0e62613985c7f50caeccd286e51eb0e7fea9e5b26ac2bd6cd99"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.019723 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-9nxrs" Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.021936 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" event={"ID":"597869bc-03e8-40fd-8f80-c156102a3dc7","Type":"ContainerStarted","Data":"c12581e9f364d91554747754e6e4cc43c733d6fee96db8213cde236f4504f146"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.023372 4756 patch_prober.go:28] interesting pod/console-operator-58897d9998-9nxrs container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.023424 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9nxrs" podUID="93f05eb8-490a-4d15-8c38-dd158bd4a91f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.031271 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" event={"ID":"6bcdf9c6-be18-4bd1-a335-a8479496f678","Type":"ContainerStarted","Data":"f05120158aae8f153bdf711d34015fc3c2a64fdb31ba59097aadb7eb39bcccd7"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.035599 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8jwvd" event={"ID":"ec9fa804-6e0c-4375-b1c4-7a467da3a43a","Type":"ContainerStarted","Data":"a4218102727658fff16a3372adad79093024d80b94a21244daca0404839db44d"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.055826 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" 
event={"ID":"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71","Type":"ContainerStarted","Data":"834f77a8190ff068e41472ae7e9ff92ff415381c506e525461b9feaefb6a9b32"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.055862 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" event={"ID":"0839e735-d517-4f9d-b742-45f842a9d9cc","Type":"ContainerStarted","Data":"21e2ed4c7a13bef76a6bb6dc9e8932c4ababd500528a6eb42f3eef662b3e213f"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.055874 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" event={"ID":"d2ff7989-b26b-4c66-991a-fecafc6f38b9","Type":"ContainerStarted","Data":"2f1b3c7d1de1e4f8212b95bd915ec8d3f48adcfbd8bbada36edcb20947ccedde"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.055883 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" event={"ID":"59db5196-70c4-4bd6-8406-3589d4e44599","Type":"ContainerStarted","Data":"1a6544a83e67cb11d26b72d760f609232b065614fd52080bee54a8258a243339"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.059792 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" event={"ID":"cba515e6-fcd1-4310-9630-70ca7c90c213","Type":"ContainerStarted","Data":"86366d17ab12df0583483e157a880080f0e926b82ecae6ed88ddad6bb3871822"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.062702 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" event={"ID":"7a6b3333-726a-45d8-98e0-70a523f6ac6e","Type":"ContainerStarted","Data":"8b54cbe610a6f3afa53e85d1a163345cf70617dbff0dd3b7d6f796277c5cb70a"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.063768 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" event={"ID":"e438b5e2-50cf-4f45-a46f-5e4ee62633b4","Type":"ContainerStarted","Data":"2c64b2d9615868baf1c99fcce37e0e74a35456a9b9c960940160cad77fdfdd8a"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.066104 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" event={"ID":"a229dbb4-017d-4ef2-bcd0-6c5585e3c497","Type":"ContainerStarted","Data":"98b01ff2f118ceb40297925845fb909d0ec1f1e56b494bf0724f5e6da65071ea"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.081539 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wld6x" event={"ID":"351d65a8-44a7-4f84-be48-1b5204422e8b","Type":"ContainerStarted","Data":"95347d207c5ac003e2d08bf80cb965dcf4854e2ed9a69c2dc911bfc09d7f3da4"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.084716 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"09a41b1289d6ed1b017f357f440fb3d78bfe4791a34db6db214424d324f1a580"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.096878 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pt8fk" 
event={"ID":"6ad5d1b1-4e12-430f-91f1-70e8fd975a09","Type":"ContainerStarted","Data":"f350c4f1ea4a92516c903e51d4d24a7145a1a701e9224607ecc8ecf7eec03b0e"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.102908 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z9xhd" event={"ID":"6013c2d1-fe3a-4d8d-ba47-132ad91a6a34","Type":"ContainerStarted","Data":"10bb400c96a2d9e20f5c818efd033216b3093a5dfe4b3896877c4f24616aa659"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.103995 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" event={"ID":"67cf5e8a-2d35-4db9-bcb6-68492f4be191","Type":"ContainerStarted","Data":"8636ff4b753c0c885e1207bb62b72b4943c6ff51c0226bb1528f33d709ae103e"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.106879 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" event={"ID":"996b39af-143d-4f4a-affc-b726debbfb0c","Type":"ContainerStarted","Data":"bc989c109affb90b2092f9d1c8d5494bb41fc2a429e28e5a6e496ef2bb5afff4"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.112645 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.113183 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.613161751 +0000 UTC m=+148.851835263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.113699 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" event={"ID":"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da","Type":"ContainerStarted","Data":"406ff94e64592cc4008ef72a633ed12eb2eb9ec316e34e83ce52ec8fa232a06d"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.118700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" event={"ID":"b5856492-e637-42d0-b018-6f205e03e897","Type":"ContainerStarted","Data":"8fe9f0aec15351831324b8e12be6d738d9ed2c47047cd223629f913c3fab4947"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.122705 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" event={"ID":"c6e932c7-ad81-4830-b212-61bdde0cec5b","Type":"ContainerStarted","Data":"99dc1fc1ec52df06b3ea8cdfbd90db45a3bd455a044ba43663990aef9de80b2d"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.124098 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" event={"ID":"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0","Type":"ContainerStarted","Data":"7a18d80f79fb368317e7918ac218e198585c03de89ff021382fdad287adc10d2"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.125024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" event={"ID":"cf4b45a0-f5a8-4d66-9d82-19e31c680447","Type":"ContainerStarted","Data":"267e9b992d6de50fa4f80cbc811b1b58d78cac5b926609c6f5df26e5802cc6ab"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.125652 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" event={"ID":"dac87fdb-fe64-40be-81f6-e525e378a0c7","Type":"ContainerStarted","Data":"0475ff9b0512ecc9d14786c36b279f046efdf3e969a273a12999685ac82443bf"} Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.214620 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.215066 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.715053659 +0000 UTC m=+148.953727171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: W1205 16:18:08.278308 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea105de_fb15_46c7_b937_f4738b7fd8fd.slice/crio-1b8d8f2fd85a46aebd31b2534e8929028cf9fc741be78f6cabfd0535254ed3f5 WatchSource:0}: Error finding container 1b8d8f2fd85a46aebd31b2534e8929028cf9fc741be78f6cabfd0535254ed3f5: Status 404 returned error can't find the container with id 1b8d8f2fd85a46aebd31b2534e8929028cf9fc741be78f6cabfd0535254ed3f5 Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.317900 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.318313 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.818284375 +0000 UTC m=+149.056957887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.321702 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.324636 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.824614454 +0000 UTC m=+149.063287966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.379170 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.380154 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.424027 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.424208 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.924175626 +0000 UTC m=+149.162849158 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.428849 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.429532 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:08.929516217 +0000 UTC m=+149.168189739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: W1205 16:18:08.445367 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24d8224d_0fe6_47f3_ae6b_14ea099aa69e.slice/crio-1afc76ad6a6910957fcd3f4bab06e6c69ebc96dde904f58a955f6a909172e7b3 WatchSource:0}: Error finding container 1afc76ad6a6910957fcd3f4bab06e6c69ebc96dde904f58a955f6a909172e7b3: Status 404 returned error can't find the container with id 1afc76ad6a6910957fcd3f4bab06e6c69ebc96dde904f58a955f6a909172e7b3 Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.468401 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.503703 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.506356 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-22v2l"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.523776 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.529865 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.530715 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.030674324 +0000 UTC m=+149.269347836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.572302 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kmfx"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.598994 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.610106 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wckzw"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.631970 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.632335 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.132320515 +0000 UTC m=+149.370994027 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.711888 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-stdrz"] Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.733079 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.735986 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.235966633 +0000 UTC m=+149.474640145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.758012 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9nxrs" podStartSLOduration=124.757990577 podStartE2EDuration="2m4.757990577s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:08.756676259 +0000 UTC m=+148.995349781" watchObservedRunningTime="2025-12-05 16:18:08.757990577 +0000 UTC m=+148.996664089" Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.836893 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.837280 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.337263903 +0000 UTC m=+149.575937415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:08 crc kubenswrapper[4756]: W1205 16:18:08.935155 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79fba350_e2c8_44c0_8151_e1e07559bfd7.slice/crio-e67353c519f722a8c144e7cdca9e8a376ff8d05a96377accc7511a26dbb3ef51 WatchSource:0}: Error finding container e67353c519f722a8c144e7cdca9e8a376ff8d05a96377accc7511a26dbb3ef51: Status 404 returned error can't find the container with id e67353c519f722a8c144e7cdca9e8a376ff8d05a96377accc7511a26dbb3ef51 Dec 05 16:18:08 crc kubenswrapper[4756]: I1205 16:18:08.938121 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:08 crc kubenswrapper[4756]: E1205 16:18:08.938873 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 16:18:09.438854192 +0000 UTC m=+149.677527724 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.039780 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.040095 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.540082211 +0000 UTC m=+149.778755723 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.141156 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.141571 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.641542526 +0000 UTC m=+149.880216038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.141596 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" event={"ID":"d7eef861-8bb0-429e-b359-372213f7769b","Type":"ContainerStarted","Data":"354a212abba0f03f7388c44b7b318ed921e82c907cee3e159e69ac5cbe878038"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.143963 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.144380 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.644364636 +0000 UTC m=+149.883038148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.153844 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pt8fk" event={"ID":"6ad5d1b1-4e12-430f-91f1-70e8fd975a09","Type":"ContainerStarted","Data":"f77c59eb7fd8bafc7886f57dc2751d697211650eb036901c8e5d203c6de09d90"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.188381 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pt8fk" podStartSLOduration=125.188360993 podStartE2EDuration="2m5.188360993s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.18824176 +0000 UTC m=+149.426915282" watchObservedRunningTime="2025-12-05 16:18:09.188360993 +0000 UTC m=+149.427034505" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.207813 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" event={"ID":"ed2725c8-948a-4757-8e92-e365e49e1f3f","Type":"ContainerStarted","Data":"862b8e66a63c0a16273005f18ec80dcebe99fcbdb090b7dd25d76c4190bb32fa"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.229871 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" 
event={"ID":"d2ff7989-b26b-4c66-991a-fecafc6f38b9","Type":"ContainerStarted","Data":"c81e390aa415d5a7d0b455155963e2f1e236fdbef270df72848892a40ecd261f"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.245467 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.245681 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.745649727 +0000 UTC m=+149.984323239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.245949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.248068 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.748047155 +0000 UTC m=+149.986720727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.248725 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-zlcmz" podStartSLOduration=125.248707444 podStartE2EDuration="2m5.248707444s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.242819787 +0000 UTC m=+149.481493309" watchObservedRunningTime="2025-12-05 16:18:09.248707444 +0000 UTC m=+149.487380966" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.260842 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" event={"ID":"317dead3-6446-45a7-a256-4ae88df539aa","Type":"ContainerStarted","Data":"06ca901b8af984f8d466037b7e43898cab704d95f9415b767f70b75e05839566"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.267947 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" event={"ID":"f2e132f7-f4ea-426c-b094-b5eea4577d5f","Type":"ContainerStarted","Data":"bf6df1aaa6c133c751332b9e613d785f1b2cc6de6de053163519deb7e91a67ef"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.270237 4756 generic.go:334] "Generic (PLEG): container finished" podID="cf4b45a0-f5a8-4d66-9d82-19e31c680447" containerID="b456265ecc989e3fcd21ab8b6d1ed59a03509b93aeaca4fa8cd7477fd41466d2" exitCode=0 Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.270298 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" event={"ID":"cf4b45a0-f5a8-4d66-9d82-19e31c680447","Type":"ContainerDied","Data":"b456265ecc989e3fcd21ab8b6d1ed59a03509b93aeaca4fa8cd7477fd41466d2"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.279135 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2rb" podStartSLOduration=125.279115485 podStartE2EDuration="2m5.279115485s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.278036415 +0000 UTC m=+149.516709947" watchObservedRunningTime="2025-12-05 16:18:09.279115485 +0000 UTC m=+149.517788997" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.324364 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z9xhd" event={"ID":"6013c2d1-fe3a-4d8d-ba47-132ad91a6a34","Type":"ContainerStarted","Data":"513d09f71c201061a1cd9520bf9579dedd208a91f6b0eafc14d3ed5b9a829c9a"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.326117 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-z9xhd" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.340447 4756 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zn2v6" podStartSLOduration=125.340411043 podStartE2EDuration="2m5.340411043s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.339724173 +0000 UTC m=+149.578397695" watchObservedRunningTime="2025-12-05 16:18:09.340411043 +0000 UTC m=+149.579084555" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.343379 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-z9xhd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.343467 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z9xhd" podUID="6013c2d1-fe3a-4d8d-ba47-132ad91a6a34" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.352228 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.354977 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.854954235 +0000 UTC m=+150.093627777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.355715 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" event={"ID":"7a6b3333-726a-45d8-98e0-70a523f6ac6e","Type":"ContainerStarted","Data":"9426f5628c87d6b59fa3e4b2ab13c4067c009497574b835062403a63bcf8868c"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.371399 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-z9xhd" podStartSLOduration=125.37138323 podStartE2EDuration="2m5.37138323s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.369875768 +0000 UTC m=+149.608549280" watchObservedRunningTime="2025-12-05 16:18:09.37138323 +0000 UTC m=+149.610056742" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.386344 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" event={"ID":"2f1a7a84-f798-4242-b060-91a6c2f50a21","Type":"ContainerStarted","Data":"1b1b3c7112771965382a2792a18b183cd736ecf983e6341cce654511088d232f"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.402049 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q9vxd" podStartSLOduration=125.402025929 podStartE2EDuration="2m5.402025929s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.392984293 +0000 UTC m=+149.631657815" watchObservedRunningTime="2025-12-05 16:18:09.402025929 +0000 UTC m=+149.640699441" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.419488 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" event={"ID":"dac87fdb-fe64-40be-81f6-e525e378a0c7","Type":"ContainerStarted","Data":"c40c5688ccb587570a95dc3414442222b867568ad011bc66a0783ba57538ead3"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.420265 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.439992 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" event={"ID":"6bcdf9c6-be18-4bd1-a335-a8479496f678","Type":"ContainerStarted","Data":"79b1869d5a81d1181f23b98b10a449f991cbcacc2af06d8a5b8b1bc26656f5e6"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.448323 4756 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-k7kwr container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 
10.217.0.27:8443: connect: connection refused" start-of-body= Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.448371 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" podUID="dac87fdb-fe64-40be-81f6-e525e378a0c7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.459773 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.461839 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:09.961788563 +0000 UTC m=+150.200462075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.467722 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wckzw" event={"ID":"3708be10-919d-4500-8325-90de029cbb66","Type":"ContainerStarted","Data":"1c7646832479e8b29b0172047477a6f97e7f1f18990d5f4c79f55b60ab072fa2"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.560602 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.561084 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.061069196 +0000 UTC m=+150.299742708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.570162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" event={"ID":"cba515e6-fcd1-4310-9630-70ca7c90c213","Type":"ContainerStarted","Data":"504f479f68da71447bd1e194076de35d0138021235882f3495f9b37af1194b42"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.609183 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" event={"ID":"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8","Type":"ContainerStarted","Data":"619ab76ee16d01a2eec4fdc118aef8893a35882b67ad63336f3e9039a100fdbc"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.640777 4756 generic.go:334] "Generic (PLEG): container finished" podID="0839e735-d517-4f9d-b742-45f842a9d9cc" containerID="a6686ecf514537a1baa9114b79debd5942afd3448c7963c7dd9de8b3ae7386cf" exitCode=0 Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.640852 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" event={"ID":"0839e735-d517-4f9d-b742-45f842a9d9cc","Type":"ContainerDied","Data":"a6686ecf514537a1baa9114b79debd5942afd3448c7963c7dd9de8b3ae7386cf"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.666516 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.675104 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" event={"ID":"c6e932c7-ad81-4830-b212-61bdde0cec5b","Type":"ContainerStarted","Data":"d00618529dfcbaf2d4730a4d9f41be089cf8f73e7758348927dfada62b50d61f"} Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.679575 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.179549265 +0000 UTC m=+150.418222777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.716357 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr" podStartSLOduration=125.716336507 podStartE2EDuration="2m5.716336507s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.480708169 +0000 UTC m=+149.719381691" watchObservedRunningTime="2025-12-05 16:18:09.716336507 +0000 UTC m=+149.955010029" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.735861 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" event={"ID":"24d8224d-0fe6-47f3-ae6b-14ea099aa69e","Type":"ContainerStarted","Data":"1afc76ad6a6910957fcd3f4bab06e6c69ebc96dde904f58a955f6a909172e7b3"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.752139 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" event={"ID":"5ea105de-fb15-46c7-b937-f4738b7fd8fd","Type":"ContainerStarted","Data":"1b8d8f2fd85a46aebd31b2534e8929028cf9fc741be78f6cabfd0535254ed3f5"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.768932 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.769676 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.269646468 +0000 UTC m=+150.508319980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.780522 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gn5x4" podStartSLOduration=125.780498576 podStartE2EDuration="2m5.780498576s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.776310527 +0000 UTC m=+150.014984039" watchObservedRunningTime="2025-12-05 16:18:09.780498576 +0000 UTC m=+150.019172088" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.800782 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" event={"ID":"e438b5e2-50cf-4f45-a46f-5e4ee62633b4","Type":"ContainerStarted","Data":"23ad000b8473206ad433be31b5527280a573510b8349e7cb71317ea7a8e7e775"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.870447 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" event={"ID":"65fe8661-28bd-474d-aec9-a63801393725","Type":"ContainerStarted","Data":"b210f60779c8d477fe111de7b44f82d30f821db594a197f905c6d6176167e19d"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.873093 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.882765 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.884130 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.384112692 +0000 UTC m=+150.622786254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.898106 4756 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dsrxv container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.898158 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" podUID="65fe8661-28bd-474d-aec9-a63801393725" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.911870 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" event={"ID":"79fba350-e2c8-44c0-8151-e1e07559bfd7","Type":"ContainerStarted","Data":"e67353c519f722a8c144e7cdca9e8a376ff8d05a96377accc7511a26dbb3ef51"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.930980 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" podStartSLOduration=125.93096215 podStartE2EDuration="2m5.93096215s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.819777819 +0000 UTC m=+150.058451351" watchObservedRunningTime="2025-12-05 16:18:09.93096215 +0000 UTC m=+150.169635662" Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.981129 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" event={"ID":"9c56993d-85b5-46f9-9897-4e371ee09769","Type":"ContainerStarted","Data":"d67f37c6ea0eb9283142509515356386cf61f0ae793eb045b8430de2c3ba1d3c"} Dec 05 16:18:09 crc kubenswrapper[4756]: I1205 16:18:09.985866 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:09 crc kubenswrapper[4756]: E1205 16:18:09.987094 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.48706764 +0000 UTC m=+150.725741152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.014256 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8jwvd" event={"ID":"ec9fa804-6e0c-4375-b1c4-7a467da3a43a","Type":"ContainerStarted","Data":"8c981f024857cf14f389cd72f6a7f4d36f8a7e91b6dcd6536276fa002890b5d2"} Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.060928 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" podStartSLOduration=126.060907023 podStartE2EDuration="2m6.060907023s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:09.939735999 +0000 UTC m=+150.178409511" watchObservedRunningTime="2025-12-05 16:18:10.060907023 +0000 UTC m=+150.299580545" Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.061054 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8jwvd" podStartSLOduration=6.061048757 podStartE2EDuration="6.061048757s" podCreationTimestamp="2025-12-05 16:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:10.0586894 +0000 UTC m=+150.297362912" watchObservedRunningTime="2025-12-05 16:18:10.061048757 +0000 UTC m=+150.299722279" Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.087175 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.088039 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.588019261 +0000 UTC m=+150.826692773 (durationBeforeRetry 500ms). 
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.127313 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kmfx" event={"ID":"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05","Type":"ContainerStarted","Data":"71cfac0f7369b45630718486280475ed38c5615569bbb8309b6f9c37f03c82a5"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.127362 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" event={"ID":"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b","Type":"ContainerStarted","Data":"e9511a2910254d1dc24de1bb30e56008c509ded3ff0c9cc260ac10b8d3b12fdc"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.129291 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pt8fk"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.129312 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.144584 4756 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zmrlm container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.144776 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" podUID="0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.152158 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 16:18:10 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld
Dec 05 16:18:10 crc kubenswrapper[4756]: [+]process-running ok
Dec 05 16:18:10 crc kubenswrapper[4756]: healthz check failed
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.152233 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.174221 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wld6x" event={"ID":"351d65a8-44a7-4f84-be48-1b5204422e8b","Type":"ContainerStarted","Data":"807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.195034 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.195740 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.695725654 +0000 UTC m=+150.934399166 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.213257 4756 generic.go:334] "Generic (PLEG): container finished" podID="a9694101-a510-49c6-8c1a-4869c9619b6a" containerID="622eed7b46e5537ca95486082530b22ff508d99a433e386fa5b2dcba21aa308a" exitCode=0
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.213346 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" event={"ID":"a9694101-a510-49c6-8c1a-4869c9619b6a","Type":"ContainerDied","Data":"622eed7b46e5537ca95486082530b22ff508d99a433e386fa5b2dcba21aa308a"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.242214 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" event={"ID":"b5856492-e637-42d0-b018-6f205e03e897","Type":"ContainerStarted","Data":"708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.243514 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.287183 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.301285 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.302577 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.802562492 +0000 UTC m=+151.041236004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.338681 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" event={"ID":"875c68c0-137f-466e-97ff-3c2f4bf90401","Type":"ContainerStarted","Data":"7edb323cd3b0425878f8a564c87d420ec2305af3c538056ad14f9399429eb58c"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.375445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" event={"ID":"67cf5e8a-2d35-4db9-bcb6-68492f4be191","Type":"ContainerStarted","Data":"116227a6e7006bd15fc3c4166b95468cf979ce33a4d10cc05d9d4f9d8f2e58d9"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.375978 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.383147 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" event={"ID":"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71","Type":"ContainerStarted","Data":"9d4d16bfb099d5c746bd8b74719dd2b8b1094722bf9d499db69c847cbaad1ccf"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.419941 4756 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dpqm9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.421349 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.421859 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:10.921844973 +0000 UTC m=+151.160518485 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.425609 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" podUID="67cf5e8a-2d35-4db9-bcb6-68492f4be191" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.441699 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" event={"ID":"a7002a42-041f-4b43-a998-659fa4a3da84","Type":"ContainerStarted","Data":"1e9e4e0434e72ed66702fe244df53f8bd71878ced06e8418ef9f573a0de518cc"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.489158 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-22v2l" event={"ID":"598b5e4e-bbed-4011-ab04-f152e2905431","Type":"ContainerStarted","Data":"29443a4e66e775ec69ef9d0c1830ede1506d909f1816fb6e47e17e2e4376ac99"}
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.523402 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.525275 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.025263084 +0000 UTC m=+151.263936596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.576241 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9nxrs"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.625594 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.625746 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.125720611 +0000 UTC m=+151.364394123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.625844 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.626183 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.126166534 +0000 UTC m=+151.364840046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.728792 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.729159 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.229144922 +0000 UTC m=+151.467818434 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.830005 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.830449 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.330432873 +0000 UTC m=+151.569106385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
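The nestedpendingoperations entries above show the retry gate behind these errors: each failed volume operation is locked out until a deadline ("No retries permitted until ... (durationBeforeRetry 500ms)"), after which the reconciler tries again, roughly every half second in this window. A minimal sketch of that pattern under assumed values (the 500ms initial delay is from the log; the doubling and the cap are illustrative, not the kubelet's exact policy):

```go
// Sketch of a per-operation retry gate: a failure marks the operation
// "no retries permitted until lastError + delay". Names and growth/cap
// values are illustrative only.
package main

import (
	"errors"
	"fmt"
	"time"
)

type retryGate struct {
	delay     time.Duration // durationBeforeRetry in the log
	notBefore time.Time     // "No retries permitted until ..."
}

func (g *retryGate) attempt(op func() error, now time.Time) error {
	if now.Before(g.notBefore) {
		return fmt.Errorf("no retries permitted until %s", g.notBefore)
	}
	if err := op(); err != nil {
		if g.delay == 0 {
			g.delay = 500 * time.Millisecond // initial backoff, as logged
		} else if g.delay < 2*time.Minute {
			g.delay *= 2 // exponential growth with a cap (assumed values)
		}
		g.notBefore = now.Add(g.delay)
		return err
	}
	g.delay = 0 // success resets the backoff
	return nil
}

func main() {
	gate := &retryGate{}
	mount := func() error { return errors.New("driver name kubevirt.io.hostpath-provisioner not found") }
	for i := 0; i < 3; i++ {
		fmt.Println(gate.attempt(mount, time.Now()))
	}
}
```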
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.844505 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-22v2l" podStartSLOduration=6.844488251 podStartE2EDuration="6.844488251s" podCreationTimestamp="2025-12-05 16:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:10.814844291 +0000 UTC m=+151.053517813" watchObservedRunningTime="2025-12-05 16:18:10.844488251 +0000 UTC m=+151.083161763"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.907614 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" podStartSLOduration=126.90759926 podStartE2EDuration="2m6.90759926s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:10.906652663 +0000 UTC m=+151.145326175" watchObservedRunningTime="2025-12-05 16:18:10.90759926 +0000 UTC m=+151.146272772"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.908394 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" podStartSLOduration=126.908388492 podStartE2EDuration="2m6.908388492s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:10.843808962 +0000 UTC m=+151.082482484" watchObservedRunningTime="2025-12-05 16:18:10.908388492 +0000 UTC m=+151.147062004"
Dec 05 16:18:10 crc kubenswrapper[4756]: I1205 16:18:10.933862 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:10 crc kubenswrapper[4756]: E1205 16:18:10.934238 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.434224155 +0000 UTC m=+151.672897667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.020942 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" podStartSLOduration=127.020922412 podStartE2EDuration="2m7.020922412s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.019204533 +0000 UTC m=+151.257878055" watchObservedRunningTime="2025-12-05 16:18:11.020922412 +0000 UTC m=+151.259595924"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.036991 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.037260 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.537249905 +0000 UTC m=+151.775923407 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.134380 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 16:18:11 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld
Dec 05 16:18:11 crc kubenswrapper[4756]: [+]process-running ok
Dec 05 16:18:11 crc kubenswrapper[4756]: healthz check failed
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.134443 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.140390 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.141061 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.641042926 +0000 UTC m=+151.879716438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.172393 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" podStartSLOduration=127.172375924 podStartE2EDuration="2m7.172375924s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.144790822 +0000 UTC m=+151.383464344" watchObservedRunningTime="2025-12-05 16:18:11.172375924 +0000 UTC m=+151.411049436"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.245984 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.249857 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.74984166 +0000 UTC m=+151.988515172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.350302 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.350696 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.850677658 +0000 UTC m=+152.089351170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
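The prober entries follow the same shape each time: the kubelet issues an HTTP GET against the container's probe endpoint, and either a transport error ("connect: connection refused") or an unexpected status (the router's startup probe returning 500, with the [-]/[+] check breakdown as the body) marks the probe failed. A rough stand-in for that check, using a URL taken from the log; the helper is illustrative, not the kubelet's actual prober:

```go
// Sketch of an HTTP probe check: GET the probe URL, treat connection errors
// and non-2xx/3xx statuses as failures, keep the start of the body for logging.
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func probeHTTP(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("probe failed: %w", err) // e.g. connect: connection refused
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 1024)) // the "start-of-body" in the log
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d, body: %s", resp.StatusCode, body)
	}
	return nil
}

func main() {
	// marketplace-operator readiness URL from the entries above
	fmt.Println(probeHTTP("http://10.217.0.16:8080/healthz"))
}
```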
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.457157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.457536 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:11.957522736 +0000 UTC m=+152.196196248 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.534677 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" event={"ID":"9c56993d-85b5-46f9-9897-4e371ee09769","Type":"ContainerStarted","Data":"e3b0284b06588e2f05e515171b0c840798590651ed97931136e74753c432db8e"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.534735 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" event={"ID":"9c56993d-85b5-46f9-9897-4e371ee09769","Type":"ContainerStarted","Data":"eb4a11ddd8aab576e7c855cdcf4f81d8e0f4ec2513386c7dfd89dddf25ada6e4"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.540124 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" event={"ID":"1bd595eb-93e9-4ecf-9aad-b57eb902bcf0","Type":"ContainerStarted","Data":"8d555b8c7b9f2558fb17ded94f3b47fbe466e21fc6456d27c19a02d78ad6e069"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.556605 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wld6x" podStartSLOduration=127.556588224 podStartE2EDuration="2m7.556588224s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.408378983 +0000 UTC m=+151.647052505" watchObservedRunningTime="2025-12-05 16:18:11.556588224 +0000 UTC m=+151.795261736"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.557677 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zs5pc" podStartSLOduration=127.557669434 podStartE2EDuration="2m7.557669434s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.554304869 +0000 UTC m=+151.792978401" watchObservedRunningTime="2025-12-05 16:18:11.557669434 +0000 UTC m=+151.796342946"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.557810 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.558141 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.058119217 +0000 UTC m=+152.296792729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.561042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" event={"ID":"6bcdf9c6-be18-4bd1-a335-a8479496f678","Type":"ContainerStarted","Data":"4fd2a5a2c9678d54e0f032aafaa2cf18f774e290cc94d6631df8b3bd550da60d"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.563540 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" event={"ID":"597869bc-03e8-40fd-8f80-c156102a3dc7","Type":"ContainerStarted","Data":"fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.565002 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.567528 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" event={"ID":"65fe8661-28bd-474d-aec9-a63801393725","Type":"ContainerStarted","Data":"c5d334293edf4e6fd5e117795c1cbb421aa95484f0a9af9a2732be0111b4716e"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.570089 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" event={"ID":"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8","Type":"ContainerStarted","Data":"e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.571049 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.574952 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.581165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" event={"ID":"875c68c0-137f-466e-97ff-3c2f4bf90401","Type":"ContainerStarted","Data":"c98846426b4aa602ee8f74e00a85e1950bb9d5800924137cc64156fbb977cd6c"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.581219 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" event={"ID":"875c68c0-137f-466e-97ff-3c2f4bf90401","Type":"ContainerStarted","Data":"3a94a9d9d9ae61435c5d9b45c26727b0ef1c5833d172928e4838a344f82e6a32"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.585314 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" event={"ID":"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da","Type":"ContainerStarted","Data":"88a32af82f73601de325de5be38d2f331213e660c763fd6343213d65ee53eaaa"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.585370 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" event={"ID":"8b44c87a-bbab-469a-b5d7-2c8a0a9f05da","Type":"ContainerStarted","Data":"9e89b542860cb9fad8a79b716543495a4030e14cc39066fce405b78c84bebecd"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.588543 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" event={"ID":"24d8224d-0fe6-47f3-ae6b-14ea099aa69e","Type":"ContainerStarted","Data":"436d9f530ac8524c7072830b26af5bf5ff6bae57313d24896f2af5646635f842"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.588578 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" event={"ID":"24d8224d-0fe6-47f3-ae6b-14ea099aa69e","Type":"ContainerStarted","Data":"b71617b266e79663b2f08d19bd8d768d8f3f8132d35521841bf79acaf984c3d5"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.590863 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-w5k56" podStartSLOduration=127.590850665 podStartE2EDuration="2m7.590850665s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.59034583 +0000 UTC m=+151.829019342" watchObservedRunningTime="2025-12-05 16:18:11.590850665 +0000 UTC m=+151.829524187"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.592675 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cpmj6" event={"ID":"5ea105de-fb15-46c7-b937-f4738b7fd8fd","Type":"ContainerStarted","Data":"7ea6023d43ea8648ee396b54c658d51042bd995f680b01f3fec34a0a6ce04159"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.595737 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm" event={"ID":"0dc1aa3a-bdf5-423b-bb6e-ef83b6b2a14b","Type":"ContainerStarted","Data":"677f6103e50fbe97a75bfdea8df08ac94ea25c104df42a9cf1511bb93b5c3d78"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.601521 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-22v2l" event={"ID":"598b5e4e-bbed-4011-ab04-f152e2905431","Type":"ContainerStarted","Data":"5e723018821bd1a46862f524b745fb395ed763f3083d0ab2e0e8e0a65098ac88"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.610730 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zmrlm"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.611884 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcrs9" podStartSLOduration=127.61186636 podStartE2EDuration="2m7.61186636s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.610835901 +0000 UTC m=+151.849509433" watchObservedRunningTime="2025-12-05 16:18:11.61186636 +0000 UTC m=+151.850539882"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.612246 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" event={"ID":"59db5196-70c4-4bd6-8406-3589d4e44599","Type":"ContainerStarted","Data":"385e0aab4ae0b3f145a1876bf333791abf3c1d976452d5f0c845f586d0d584d3"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.615985 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" event={"ID":"e438b5e2-50cf-4f45-a46f-5e4ee62633b4","Type":"ContainerStarted","Data":"d26409e4773a269bc5c5fd6a2168168f081d9140a149b1ee39c04de207289f3a"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.626926 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" event={"ID":"f2e132f7-f4ea-426c-b094-b5eea4577d5f","Type":"ContainerStarted","Data":"5bf7e4ce4c2ad1fa74046b0bbb6176cfd3dfff53529b073ffe77e626a33292ba"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.628833 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" event={"ID":"0eb62b4b-b2c4-46c8-af81-b234e5f3bf71","Type":"ContainerStarted","Data":"46922205ffc2d7e3c1ab4e80d8978280295faf96574bea099b90a11550a0994c"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.630841 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kmfx" event={"ID":"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05","Type":"ContainerStarted","Data":"107c53546f2069afc559c2d2a255785a21116c51dc7df2044f434d063f16e85a"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.635028 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t6pc7" podStartSLOduration=127.635014466 podStartE2EDuration="2m7.635014466s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.63338968 +0000 UTC m=+151.872063202" watchObservedRunningTime="2025-12-05 16:18:11.635014466 +0000 UTC m=+151.873687978"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.642544 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" event={"ID":"cf4b45a0-f5a8-4d66-9d82-19e31c680447","Type":"ContainerStarted","Data":"449220bc1734e3433f1cbcedb1531b0b08fb0e5b1d899c0f98095d6ee9e4bab3"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.645007 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" event={"ID":"cba515e6-fcd1-4310-9630-70ca7c90c213","Type":"ContainerStarted","Data":"434fbfb6b7acc4efcf078c61daa65ab4e8ce463ee8fd6b483876856af46b8c0d"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.659670 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.662896 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.162880016 +0000 UTC m=+152.401553518 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.669121 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9gdk" podStartSLOduration=127.669100932 podStartE2EDuration="2m7.669100932s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.659378167 +0000 UTC m=+151.898051679" watchObservedRunningTime="2025-12-05 16:18:11.669100932 +0000 UTC m=+151.907774444"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.672618 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-crlcr"]
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.673724 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crlcr"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.676014 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.677251 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" event={"ID":"79fba350-e2c8-44c0-8151-e1e07559bfd7","Type":"ContainerStarted","Data":"a2bba5dcdc803faddb8d03fe70ba357569165b71744fb81b6830e5f670b183b5"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.682674 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crlcr"]
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.687579 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" podStartSLOduration=127.687564626 podStartE2EDuration="2m7.687564626s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.68700147 +0000 UTC m=+151.925674992" watchObservedRunningTime="2025-12-05 16:18:11.687564626 +0000 UTC m=+151.926238138"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.689900 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" event={"ID":"2f1a7a84-f798-4242-b060-91a6c2f50a21","Type":"ContainerStarted","Data":"08e5f3c4ebd7d48016a6fbc6bddcc9fff3514e74717260218487f284b36fb681"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.689930 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" event={"ID":"2f1a7a84-f798-4242-b060-91a6c2f50a21","Type":"ContainerStarted","Data":"35de1f7ace3fde56d2790d0a35d55e9997816fd65ef84d6f1e738e06e127b3b7"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.690259 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.691217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" event={"ID":"a229dbb4-017d-4ef2-bcd0-6c5585e3c497","Type":"ContainerStarted","Data":"fd5fb81726bab2eff311c2ffa308bfeed06e21d0f5bf5a6c31f7f68da6f2f7c0"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.698858 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" event={"ID":"0839e735-d517-4f9d-b742-45f842a9d9cc","Type":"ContainerStarted","Data":"9b3fc8c2e8f32e30adef0b12019bfb13d44afe2abf9364bb085586d2fe0823ba"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.699503 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.700584 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" event={"ID":"996b39af-143d-4f4a-affc-b726debbfb0c","Type":"ContainerStarted","Data":"0ff30aef4e23cd979ba3e3b069145aab2fa9b7d04b3f5fbdffb5af4935bb66cb"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.705106 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" event={"ID":"d7eef861-8bb0-429e-b359-372213f7769b","Type":"ContainerStarted","Data":"2de4e7cdfd51d342dd085f8c3d2074bd6375b5501648600dc07783bd16e3947a"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.716650 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6twhh" podStartSLOduration=127.71663535 podStartE2EDuration="2m7.71663535s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.715144277 +0000 UTC m=+151.953817799" watchObservedRunningTime="2025-12-05 16:18:11.71663535 +0000 UTC m=+151.955308852"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.718342 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-z9xhd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body=
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.718389 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z9xhd" podUID="6013c2d1-fe3a-4d8d-ba47-132ad91a6a34" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.718774 4756 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dpqm9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.718818 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" podUID="67cf5e8a-2d35-4db9-bcb6-68492f4be191" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.718972 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7zphw" event={"ID":"a7002a42-041f-4b43-a998-659fa4a3da84","Type":"ContainerStarted","Data":"f564744486ec54fcc1fc92e65cb5bd0a04f024061947746a3c8b6ed9b455d2c6"}
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.734306 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k7kwr"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.762472 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.762761 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-catalog-content\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.762896 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-759hf\" (UniqueName: \"kubernetes.io/projected/95c208d2-1ef6-4716-8053-ccbabcd7fe59-kube-api-access-759hf\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.763040 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-utilities\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr"
Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.764229 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.264207268 +0000 UTC m=+152.502880780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.768023 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" podStartSLOduration=127.768008076 podStartE2EDuration="2m7.768008076s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.74309667 +0000 UTC m=+151.981770192" watchObservedRunningTime="2025-12-05 16:18:11.768008076 +0000 UTC m=+152.006681588"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.786592 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9xwp8" podStartSLOduration=127.786572462 podStartE2EDuration="2m7.786572462s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.77062522 +0000 UTC m=+152.009298732" watchObservedRunningTime="2025-12-05 16:18:11.786572462 +0000 UTC m=+152.025245984"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.788799 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6zc4l" podStartSLOduration=127.788792025 podStartE2EDuration="2m7.788792025s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.785574464 +0000 UTC m=+152.024247986" watchObservedRunningTime="2025-12-05 16:18:11.788792025 +0000 UTC m=+152.027465537"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.801893 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v4kkz" podStartSLOduration=127.801879896 podStartE2EDuration="2m7.801879896s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.799659723 +0000 UTC m=+152.038333235" watchObservedRunningTime="2025-12-05 16:18:11.801879896 +0000 UTC m=+152.040553398"
Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.824847 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" podStartSLOduration=127.824829126 podStartE2EDuration="2m7.824829126s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.813915537 +0000 UTC m=+152.052589059" watchObservedRunningTime="2025-12-05 16:18:11.824829126 +0000 UTC m=+152.063502638"
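The pod_startup_latency_tracker entries above are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and since firstStartedPulling/lastFinishedPulling are the zero time (no image pulls observed for these pods), podStartSLOduration matches it. A quick check against the kube-apiserver-operator entry, using only values from the log:

```go
// Verify the logged startup duration: watchObservedRunningTime minus
// podCreationTimestamp should reproduce podStartSLOduration=127.788792025.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-12-05 16:16:04 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-12-05 16:18:11.788792025 +0000 UTC")
	fmt.Println(observed.Sub(created).Seconds()) // 127.788792025, i.e. 2m7.788792025s
}
```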
Need to start a new one" pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.861576 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.864255 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-catalog-content\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.864576 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-759hf\" (UniqueName: \"kubernetes.io/projected/95c208d2-1ef6-4716-8053-ccbabcd7fe59-kube-api-access-759hf\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.864658 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.864824 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-utilities\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.865681 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-catalog-content\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.866136 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.366124937 +0000 UTC m=+152.604798449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.867945 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-utilities\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.879170 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nfdlb" podStartSLOduration=127.879129435 podStartE2EDuration="2m7.879129435s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.870745647 +0000 UTC m=+152.109419159" watchObservedRunningTime="2025-12-05 16:18:11.879129435 +0000 UTC m=+152.117802947" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.879245 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6h66"] Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.886802 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" podStartSLOduration=127.886786602 podStartE2EDuration="2m7.886786602s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.885983569 +0000 UTC m=+152.124657081" watchObservedRunningTime="2025-12-05 16:18:11.886786602 +0000 UTC m=+152.125460114" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.893216 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-759hf\" (UniqueName: \"kubernetes.io/projected/95c208d2-1ef6-4716-8053-ccbabcd7fe59-kube-api-access-759hf\") pod \"community-operators-crlcr\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.912987 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-67x7d" podStartSLOduration=127.912971984 podStartE2EDuration="2m7.912971984s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.911185144 +0000 UTC m=+152.149858666" watchObservedRunningTime="2025-12-05 16:18:11.912971984 +0000 UTC m=+152.151645496" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.924648 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-stdrz" podStartSLOduration=127.924629445 podStartE2EDuration="2m7.924629445s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.922680209 +0000 UTC m=+152.161353721" watchObservedRunningTime="2025-12-05 16:18:11.924629445 +0000 UTC m=+152.163302957" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.941384 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mlthg" podStartSLOduration=127.941367209 podStartE2EDuration="2m7.941367209s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.940891066 +0000 UTC m=+152.179564598" watchObservedRunningTime="2025-12-05 16:18:11.941367209 +0000 UTC m=+152.180040721" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.966975 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.967230 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsk6s\" (UniqueName: \"kubernetes.io/projected/1bd03756-cb80-40f8-b5eb-c936c80c6c21-kube-api-access-jsk6s\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.967261 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-utilities\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.967284 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-catalog-content\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:11 crc kubenswrapper[4756]: E1205 16:18:11.967465 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.467445048 +0000 UTC m=+152.706118560 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:11 crc kubenswrapper[4756]: I1205 16:18:11.972917 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.001954 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.038081 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8cjkz" podStartSLOduration=128.03806334 podStartE2EDuration="2m8.03806334s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:11.977811082 +0000 UTC m=+152.216484604" watchObservedRunningTime="2025-12-05 16:18:12.03806334 +0000 UTC m=+152.276736852" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.039141 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" podStartSLOduration=128.03913249 podStartE2EDuration="2m8.03913249s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:12.032827571 +0000 UTC m=+152.271501083" watchObservedRunningTime="2025-12-05 16:18:12.03913249 +0000 UTC m=+152.277805992" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.068450 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.068519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsk6s\" (UniqueName: \"kubernetes.io/projected/1bd03756-cb80-40f8-b5eb-c936c80c6c21-kube-api-access-jsk6s\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.068540 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-utilities\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.068562 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-catalog-content\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.069265 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.569243703 +0000 UTC m=+152.807917265 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.069466 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-catalog-content\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.069797 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-utilities\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.107967 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nqkjr"] Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.109288 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.129702 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:12 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:12 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:12 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.129762 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.136167 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsk6s\" (UniqueName: \"kubernetes.io/projected/1bd03756-cb80-40f8-b5eb-c936c80c6c21-kube-api-access-jsk6s\") pod \"certified-operators-w6h66\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.138721 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqkjr"] Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.169995 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.170594 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.670572585 +0000 UTC m=+152.909246097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.170730 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.171288 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.671277575 +0000 UTC m=+152.909951097 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.177719 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.271482 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.271721 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmlgl\" (UniqueName: \"kubernetes.io/projected/bd889233-fc28-41f6-806f-3f7ab7670bca-kube-api-access-xmlgl\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.271781 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-catalog-content\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.271799 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-utilities\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.271896 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.771881847 +0000 UTC m=+153.010555359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.325445 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qndg5"] Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.330828 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.341656 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qndg5"] Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.402511 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.402567 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9n9k\" (UniqueName: \"kubernetes.io/projected/9805087f-2ab0-4780-9ee1-37f58467e8a2-kube-api-access-g9n9k\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.402610 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-catalog-content\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.402643 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-utilities\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.402683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmlgl\" (UniqueName: \"kubernetes.io/projected/bd889233-fc28-41f6-806f-3f7ab7670bca-kube-api-access-xmlgl\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.402747 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-catalog-content\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.402772 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-utilities\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.403361 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:12.903342812 +0000 UTC m=+153.142016324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.417428 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-utilities\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.420509 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-catalog-content\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.431495 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmlgl\" (UniqueName: \"kubernetes.io/projected/bd889233-fc28-41f6-806f-3f7ab7670bca-kube-api-access-xmlgl\") pod \"community-operators-nqkjr\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") " pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.456654 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dsrxv" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.469548 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqkjr" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.504270 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.504637 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.004615162 +0000 UTC m=+153.243288674 (durationBeforeRetry 500ms). 
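
Note the retry cadence in these entries: each failed volume operation is parked by nestedpendingoperations with "No retries permitted until" half a second in the future, and every failure in this excerpt reports the initial 500ms window (the kubelet backs off exponentially per operation key; here the loop keeps failing at the first step). A rough sketch of a per-operation backoff of that shape (the constants and structure are illustrative, not the kubelet's exact implementation):

package main

import (
	"fmt"
	"time"
)

// opBackoff holds per-operation retry state, loosely modeled on the
// kubelet's exponential backoff for volume operations.
type opBackoff struct {
	delay time.Duration
}

const (
	initialDelay = 500 * time.Millisecond // matches durationBeforeRetry above
	maxDelay     = 2 * time.Minute        // illustrative cap
)

// fail records a failure and returns the earliest permitted retry time.
func (b *opBackoff) fail(now time.Time) time.Time {
	if b.delay == 0 {
		b.delay = initialDelay
	} else {
		b.delay *= 2
		if b.delay > maxDelay {
			b.delay = maxDelay
		}
	}
	return now.Add(b.delay)
}

func main() {
	var b opBackoff
	now := time.Now()
	for i := 0; i < 4; i++ {
		fmt.Println("no retries permitted until", b.fail(now))
	}
}
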
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.504929 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.505018 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9n9k\" (UniqueName: \"kubernetes.io/projected/9805087f-2ab0-4780-9ee1-37f58467e8a2-kube-api-access-g9n9k\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.505092 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-catalog-content\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.505169 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-utilities\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.505602 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-utilities\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.505886 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.005877878 +0000 UTC m=+153.244551390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.506560 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-catalog-content\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.606708 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.607237 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.10721111 +0000 UTC m=+153.345884622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.621353 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9n9k\" (UniqueName: \"kubernetes.io/projected/9805087f-2ab0-4780-9ee1-37f58467e8a2-kube-api-access-g9n9k\") pod \"certified-operators-qndg5\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") " pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.622089 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.622505 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.122490343 +0000 UTC m=+153.361163855 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.627974 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.724920 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.725707 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.225693038 +0000 UTC m=+153.464366550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.726146 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qndg5" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.759265 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wckzw" event={"ID":"3708be10-919d-4500-8325-90de029cbb66","Type":"ContainerStarted","Data":"ce4231ad69b1a017bd87a2ead7c8ce576ad13175092d5be3eccd7522c15743b0"} Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.760723 4756 generic.go:334] "Generic (PLEG): container finished" podID="f2e132f7-f4ea-426c-b094-b5eea4577d5f" containerID="5bf7e4ce4c2ad1fa74046b0bbb6176cfd3dfff53529b073ffe77e626a33292ba" exitCode=0 Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.760776 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" event={"ID":"f2e132f7-f4ea-426c-b094-b5eea4577d5f","Type":"ContainerDied","Data":"5bf7e4ce4c2ad1fa74046b0bbb6176cfd3dfff53529b073ffe77e626a33292ba"} Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.768294 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kmfx" event={"ID":"8cb1ac56-8d3f-4dce-be45-b9ef2e859e05","Type":"ContainerStarted","Data":"51d86cff3e09b77c9ad39db841b8c13fa8e8dd6276d9573ed843b599458d3d25"} Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.771244 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" event={"ID":"a9694101-a510-49c6-8c1a-4869c9619b6a","Type":"ContainerStarted","Data":"c462a434b5386d16000e1d1ee60b09e1aa15452ffc24db53e429cf9502c871c0"} Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.773623 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" event={"ID":"cf4b45a0-f5a8-4d66-9d82-19e31c680447","Type":"ContainerStarted","Data":"562585f1955e7f054bffc82cdb28bfdd8651c20032eef0bb20009b7c331139a1"} Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.777246 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-z9xhd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.777284 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z9xhd" podUID="6013c2d1-fe3a-4d8d-ba47-132ad91a6a34" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.845943 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.862883 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.362868385 +0000 UTC m=+153.601541897 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.965112 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crlcr"] Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.982149 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:12 crc kubenswrapper[4756]: E1205 16:18:12.983342 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.483323339 +0000 UTC m=+153.721996851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:12 crc kubenswrapper[4756]: I1205 16:18:12.986040 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" podStartSLOduration=128.986026716 podStartE2EDuration="2m8.986026716s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:12.982697612 +0000 UTC m=+153.221371124" watchObservedRunningTime="2025-12-05 16:18:12.986026716 +0000 UTC m=+153.224700228" Dec 05 16:18:12 crc kubenswrapper[4756]: W1205 16:18:12.986073 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c208d2_1ef6_4716_8053_ccbabcd7fe59.slice/crio-95648b16a7df680eb44faab267e704e555be37ede7b5d64806065ac42c6812ce WatchSource:0}: Error finding container 95648b16a7df680eb44faab267e704e555be37ede7b5d64806065ac42c6812ce: Status 404 returned error can't find the container with id 95648b16a7df680eb44faab267e704e555be37ede7b5d64806065ac42c6812ce Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.009061 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6h66"] Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.087695 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: 
\"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.088519 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.58850516 +0000 UTC m=+153.827178672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.127219 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:13 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:13 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:13 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.127272 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.188996 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.189363 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.689344468 +0000 UTC m=+153.928017990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.213618 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qndg5"] Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.291877 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.292184 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.792172453 +0000 UTC m=+154.030845955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.347786 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqkjr"] Dec 05 16:18:13 crc kubenswrapper[4756]: W1205 16:18:13.354670 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd889233_fc28_41f6_806f_3f7ab7670bca.slice/crio-2722033ff00e3a47df036e6bcd9eef00db9dead3b40da665e0ad662450ba575a WatchSource:0}: Error finding container 2722033ff00e3a47df036e6bcd9eef00db9dead3b40da665e0ad662450ba575a: Status 404 returned error can't find the container with id 2722033ff00e3a47df036e6bcd9eef00db9dead3b40da665e0ad662450ba575a Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.397596 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.397889 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.897870268 +0000 UTC m=+154.136543780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.499231 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.499735 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:13.999719495 +0000 UTC m=+154.238392997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.600251 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.600448 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.100390698 +0000 UTC m=+154.339064210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.600518 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.600911 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.100903623 +0000 UTC m=+154.339577135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.701816 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.701952 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.201920986 +0000 UTC m=+154.440594498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.702109 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.702473 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.202460261 +0000 UTC m=+154.441133773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.803794 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.810657 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.310629927 +0000 UTC m=+154.549303439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.811314 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.811818 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.31180425 +0000 UTC m=+154.550477762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.815918 4756 generic.go:334] "Generic (PLEG): container finished" podID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerID="8c10a5f4c7b5459b927d9c240525c0ef93b1a56a4512c9a1a766430dc21dfae3" exitCode=0 Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.815972 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqkjr" event={"ID":"bd889233-fc28-41f6-806f-3f7ab7670bca","Type":"ContainerDied","Data":"8c10a5f4c7b5459b927d9c240525c0ef93b1a56a4512c9a1a766430dc21dfae3"} Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.815997 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqkjr" event={"ID":"bd889233-fc28-41f6-806f-3f7ab7670bca","Type":"ContainerStarted","Data":"2722033ff00e3a47df036e6bcd9eef00db9dead3b40da665e0ad662450ba575a"} Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.819321 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.823703 4756 generic.go:334] "Generic (PLEG): container finished" podID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerID="fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda" exitCode=0 Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.823763 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qndg5" event={"ID":"9805087f-2ab0-4780-9ee1-37f58467e8a2","Type":"ContainerDied","Data":"fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda"} Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.823787 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qndg5" 
event={"ID":"9805087f-2ab0-4780-9ee1-37f58467e8a2","Type":"ContainerStarted","Data":"2a0146cd0cb0ac6f1494bcd945ae8bcb8358037797ecf7b89bac7e868701e666"} Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.837861 4756 generic.go:334] "Generic (PLEG): container finished" podID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerID="0eb9155162f65e0efd3e588a21e36e3bf343bf430db876646d35bf13d02a85d9" exitCode=0 Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.837955 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crlcr" event={"ID":"95c208d2-1ef6-4716-8053-ccbabcd7fe59","Type":"ContainerDied","Data":"0eb9155162f65e0efd3e588a21e36e3bf343bf430db876646d35bf13d02a85d9"} Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.837984 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crlcr" event={"ID":"95c208d2-1ef6-4716-8053-ccbabcd7fe59","Type":"ContainerStarted","Data":"95648b16a7df680eb44faab267e704e555be37ede7b5d64806065ac42c6812ce"} Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.843397 4756 generic.go:334] "Generic (PLEG): container finished" podID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerID="3a0c815fd4a5438cdc2a7f3812dd379016aeb2e8cff3cffb27b97240c8ed6a57" exitCode=0 Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.844115 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6h66" event={"ID":"1bd03756-cb80-40f8-b5eb-c936c80c6c21","Type":"ContainerDied","Data":"3a0c815fd4a5438cdc2a7f3812dd379016aeb2e8cff3cffb27b97240c8ed6a57"} Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.844155 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6h66" event={"ID":"1bd03756-cb80-40f8-b5eb-c936c80c6c21","Type":"ContainerStarted","Data":"379222f3448f5fe46508df43ab2a01561de1cea82e64d2d775cef160b62e6aee"} Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.862123 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z92m7"] Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.863323 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.865907 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.895239 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z92m7"] Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.919302 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.919570 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-catalog-content\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.919693 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-utilities\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.919748 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jfv\" (UniqueName: \"kubernetes.io/projected/956dc850-ee41-4d30-b209-2f3270c376a5-kube-api-access-z5jfv\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:18:13 crc kubenswrapper[4756]: E1205 16:18:13.920669 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.420655265 +0000 UTC m=+154.659328777 (durationBeforeRetry 500ms). 
Dec 05 16:18:13 crc kubenswrapper[4756]: I1205 16:18:13.986150 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6kmfx" podStartSLOduration=9.986133151 podStartE2EDuration="9.986133151s" podCreationTimestamp="2025-12-05 16:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:13.983779014 +0000 UTC m=+154.222452516" watchObservedRunningTime="2025-12-05 16:18:13.986133151 +0000 UTC m=+154.224806663"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.012531 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" podStartSLOduration=130.012399865 podStartE2EDuration="2m10.012399865s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:14.008067673 +0000 UTC m=+154.246741185" watchObservedRunningTime="2025-12-05 16:18:14.012399865 +0000 UTC m=+154.251073387"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.021157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-catalog-content\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.021459 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-utilities\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.021586 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jfv\" (UniqueName: \"kubernetes.io/projected/956dc850-ee41-4d30-b209-2f3270c376a5-kube-api-access-z5jfv\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.021713 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.021600 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-catalog-content\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.021912 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-utilities\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7"
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.022013 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.521998787 +0000 UTC m=+154.760672299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.059706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jfv\" (UniqueName: \"kubernetes.io/projected/956dc850-ee41-4d30-b209-2f3270c376a5-kube-api-access-z5jfv\") pod \"redhat-marketplace-z92m7\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " pod="openshift-marketplace/redhat-marketplace-z92m7"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.123304 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.123468 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.623448083 +0000 UTC m=+154.862121595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.123495 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 16:18:14 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld
Dec 05 16:18:14 crc kubenswrapper[4756]: [+]process-running ok
Dec 05 16:18:14 crc kubenswrapper[4756]: healthz check failed
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.123528 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.123582 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.123871 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.623861224 +0000 UTC m=+154.862534726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.125819 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.203263 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z92m7"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.224780 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e132f7-f4ea-426c-b094-b5eea4577d5f-secret-volume\") pod \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.224825 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e132f7-f4ea-426c-b094-b5eea4577d5f-config-volume\") pod \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.224849 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bfgq\" (UniqueName: \"kubernetes.io/projected/f2e132f7-f4ea-426c-b094-b5eea4577d5f-kube-api-access-5bfgq\") pod \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\" (UID: \"f2e132f7-f4ea-426c-b094-b5eea4577d5f\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.224972 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.225221 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.725207597 +0000 UTC m=+154.963881109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.225618 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e132f7-f4ea-426c-b094-b5eea4577d5f-config-volume" (OuterVolumeSpecName: "config-volume") pod "f2e132f7-f4ea-426c-b094-b5eea4577d5f" (UID: "f2e132f7-f4ea-426c-b094-b5eea4577d5f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.229110 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2e132f7-f4ea-426c-b094-b5eea4577d5f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f2e132f7-f4ea-426c-b094-b5eea4577d5f" (UID: "f2e132f7-f4ea-426c-b094-b5eea4577d5f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.229301 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e132f7-f4ea-426c-b094-b5eea4577d5f-kube-api-access-5bfgq" (OuterVolumeSpecName: "kube-api-access-5bfgq") pod "f2e132f7-f4ea-426c-b094-b5eea4577d5f" (UID: "f2e132f7-f4ea-426c-b094-b5eea4577d5f"). InnerVolumeSpecName "kube-api-access-5bfgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.254642 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tfrgx"] Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.254863 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e132f7-f4ea-426c-b094-b5eea4577d5f" containerName="collect-profiles" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.254876 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e132f7-f4ea-426c-b094-b5eea4577d5f" containerName="collect-profiles" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.254970 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e132f7-f4ea-426c-b094-b5eea4577d5f" containerName="collect-profiles" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.255753 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.276020 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfrgx"] Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.326238 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqjrx\" (UniqueName: \"kubernetes.io/projected/7ff82294-1c1b-4fdb-8644-d3629d847a36-kube-api-access-mqjrx\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.326586 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.326624 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-utilities\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.326683 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-catalog-content\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.326727 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f2e132f7-f4ea-426c-b094-b5eea4577d5f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.326741 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e132f7-f4ea-426c-b094-b5eea4577d5f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.326755 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bfgq\" (UniqueName: \"kubernetes.io/projected/f2e132f7-f4ea-426c-b094-b5eea4577d5f-kube-api-access-5bfgq\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.327022 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.827008552 +0000 UTC m=+155.065682064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.387358 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cxc5" Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.428168 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.428356 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-catalog-content\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.428387 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.928360905 +0000 UTC m=+155.167034417 (durationBeforeRetry 500ms). 
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.428544 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqjrx\" (UniqueName: \"kubernetes.io/projected/7ff82294-1c1b-4fdb-8644-d3629d847a36-kube-api-access-mqjrx\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.428626 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.428653 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-utilities\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.428782 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-catalog-content\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx"
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.428942 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:14.928929911 +0000 UTC m=+155.167603423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.429035 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-utilities\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.440083 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z92m7"]
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.446103 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqjrx\" (UniqueName: \"kubernetes.io/projected/7ff82294-1c1b-4fdb-8644-d3629d847a36-kube-api-access-mqjrx\") pod \"redhat-marketplace-tfrgx\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " pod="openshift-marketplace/redhat-marketplace-tfrgx"
Dec 05 16:18:14 crc kubenswrapper[4756]: W1205 16:18:14.460690 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod956dc850_ee41_4d30_b209_2f3270c376a5.slice/crio-77ec9095d21f26ea7d09940a6fbab2d80ec321c55c4b3ef7a84bc65250205c0c WatchSource:0}: Error finding container 77ec9095d21f26ea7d09940a6fbab2d80ec321c55c4b3ef7a84bc65250205c0c: Status 404 returned error can't find the container with id 77ec9095d21f26ea7d09940a6fbab2d80ec321c55c4b3ef7a84bc65250205c0c
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.531106 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.531296 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.031271781 +0000 UTC m=+155.269945293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.531349 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.531843 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.031834597 +0000 UTC m=+155.270508109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.580491 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfrgx"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.632081 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.632583 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.132567762 +0000 UTC m=+155.371241274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.733571 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.733856 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.233845363 +0000 UTC m=+155.472518875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.786848 4756 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.837330 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.837525 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.33749874 +0000 UTC m=+155.576172252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
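The plugin_watcher line above (16:18:14.786848) is the turning point: the kubelet has noticed the driver's registration socket appear under /var/lib/kubelet/plugins_registry/, which is how CSI drivers announce themselves on a node. The log goes on to show validation and registration (csi_plugin.go:100/113) against the driver endpoint /var/lib/kubelet/plugins/csi-hostpath/csi.sock. A hedged sketch of that identity handshake using the public CSI Go bindings; the socket path is taken from this log, everything else is illustrative rather than kubelet's exact code:

```go
// Sketch: call GetPluginInfo on the driver's CSI endpoint, roughly the check
// kubelet performs while validating a newly registered plugin. The returned
// name is what must match the volume spec's driver field.
package main

import (
	"context"
	"fmt"

	"github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// Endpoint path from this log; CSI traffic over local unix sockets carries no TLS.
	conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	info, err := csi.NewIdentityClient(conn).GetPluginInfo(context.Background(), &csi.GetPluginInfoRequest{})
	if err != nil {
		panic(err)
	}
	fmt.Println(info.GetName(), info.GetVendorVersion()) // e.g. kubevirt.io.hostpath-provisioner
}
```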
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.837593 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.837984 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.337976264 +0000 UTC m=+155.576649776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.880829 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wm9rv"]
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.901060 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.918951 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wm9rv"]
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.925166 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.934555 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wckzw" event={"ID":"3708be10-919d-4500-8325-90de029cbb66","Type":"ContainerStarted","Data":"7c91187c8735ab2ac2248563b3a15e3cbe8b37bf2559185cca573a199328baad"}
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.938859 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.940076 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.440053487 +0000 UTC m=+155.678727009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.940798 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.940854 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8ph\" (UniqueName: \"kubernetes.io/projected/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-kube-api-access-mb8ph\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.940900 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-catalog-content\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.940935 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-utilities\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.945515 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.946066 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd" event={"ID":"f2e132f7-f4ea-426c-b094-b5eea4577d5f","Type":"ContainerDied","Data":"bf6df1aaa6c133c751332b9e613d785f1b2cc6de6de053163519deb7e91a67ef"}
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.946117 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf6df1aaa6c133c751332b9e613d785f1b2cc6de6de053163519deb7e91a67ef"
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.946139 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfrgx"]
Dec 05 16:18:14 crc kubenswrapper[4756]: E1205 16:18:14.946307 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.446290464 +0000 UTC m=+155.684963976 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.953208 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z92m7" event={"ID":"956dc850-ee41-4d30-b209-2f3270c376a5","Type":"ContainerStarted","Data":"af26c0120b339ed597346fa0e5095b3ea3d87f7ac2614bee427d4e23f38193b6"}
Dec 05 16:18:14 crc kubenswrapper[4756]: I1205 16:18:14.953288 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z92m7" event={"ID":"956dc850-ee41-4d30-b209-2f3270c376a5","Type":"ContainerStarted","Data":"77ec9095d21f26ea7d09940a6fbab2d80ec321c55c4b3ef7a84bc65250205c0c"}
Dec 05 16:18:14 crc kubenswrapper[4756]: W1205 16:18:14.978355 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff82294_1c1b_4fdb_8644_d3629d847a36.slice/crio-2748014b90449f7622f2ad326fe5281c66e9bcc1991e6dd2bf21058712200dbd WatchSource:0}: Error finding container 2748014b90449f7622f2ad326fe5281c66e9bcc1991e6dd2bf21058712200dbd: Status 404 returned error can't find the container with id 2748014b90449f7622f2ad326fe5281c66e9bcc1991e6dd2bf21058712200dbd
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.042328 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:15 crc kubenswrapper[4756]: E1205 16:18:15.042496 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.54246608 +0000 UTC m=+155.781139602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.042580 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.042609 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8ph\" (UniqueName: \"kubernetes.io/projected/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-kube-api-access-mb8ph\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.042642 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-catalog-content\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.042663 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-utilities\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:15 crc kubenswrapper[4756]: E1205 16:18:15.042948 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.542936613 +0000 UTC m=+155.781610125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.043207 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-catalog-content\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.043300 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-utilities\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.061963 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8ph\" (UniqueName: \"kubernetes.io/projected/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-kube-api-access-mb8ph\") pod \"redhat-operators-wm9rv\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.124224 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 16:18:15 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld
Dec 05 16:18:15 crc kubenswrapper[4756]: [+]process-running ok
Dec 05 16:18:15 crc kubenswrapper[4756]: healthz check failed
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.124448 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.146207 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:15 crc kubenswrapper[4756]: E1205 16:18:15.146518 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.646485348 +0000 UTC m=+155.885158860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.148201 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:15 crc kubenswrapper[4756]: E1205 16:18:15.148800 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.648790703 +0000 UTC m=+155.887464215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:15 crc kubenswrapper[4756]: E1205 16:18:15.250391 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.74995833 +0000 UTC m=+155.988631842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.252161 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.252494 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:15 crc kubenswrapper[4756]: E1205 16:18:15.252935 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 16:18:15.752924424 +0000 UTC m=+155.991597936 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hxxk6" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.256054 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.258699 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6kt7t"]
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.260102 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.264533 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kt7t"]
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.334027 4756 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T16:18:14.786873326Z","Handler":null,"Name":""}
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.342368 4756 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.342666 4756 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.353620 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.354162 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-788p4\" (UniqueName: \"kubernetes.io/projected/a5bc48be-ea95-47a5-b804-ac4dc8df958b-kube-api-access-788p4\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.354256 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-catalog-content\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.354322 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-utilities\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.367305 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.455327 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-788p4\" (UniqueName: \"kubernetes.io/projected/a5bc48be-ea95-47a5-b804-ac4dc8df958b-kube-api-access-788p4\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.455391 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.455460 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-catalog-content\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.455527 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-utilities\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.456540 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-utilities\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.456841 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-catalog-content\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.461070 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.461134 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.479470 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-788p4\" (UniqueName: \"kubernetes.io/projected/a5bc48be-ea95-47a5-b804-ac4dc8df958b-kube-api-access-788p4\") pod \"redhat-operators-6kt7t\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") " pod="openshift-marketplace/redhat-operators-6kt7t" Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.514633 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hxxk6\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.526963 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wm9rv"] Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.588135 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kt7t" Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.651813 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.915047 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kt7t"] Dec 05 16:18:15 crc kubenswrapper[4756]: W1205 16:18:15.935633 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5bc48be_ea95_47a5_b804_ac4dc8df958b.slice/crio-b837e5e45befaa4813f144419fb81921b55bb6cd8dc49e3a044479fdcc003f11 WatchSource:0}: Error finding container b837e5e45befaa4813f144419fb81921b55bb6cd8dc49e3a044479fdcc003f11: Status 404 returned error can't find the container with id b837e5e45befaa4813f144419fb81921b55bb6cd8dc49e3a044479fdcc003f11 Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.960945 4756 generic.go:334] "Generic (PLEG): container finished" podID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerID="a1f226357f47fa0a9e6553f48ba50b538b8a9cf43b9e656fb0ce3e2136510291" exitCode=0 Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.960993 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm9rv" event={"ID":"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d","Type":"ContainerDied","Data":"a1f226357f47fa0a9e6553f48ba50b538b8a9cf43b9e656fb0ce3e2136510291"} Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.961016 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm9rv" event={"ID":"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d","Type":"ContainerStarted","Data":"b63f6a72ffca755b88b8d96fecf862eecbac6caba33efa7a0286a6c1bb7ba978"} Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.966368 4756 generic.go:334] "Generic (PLEG): container finished" podID="956dc850-ee41-4d30-b209-2f3270c376a5" containerID="af26c0120b339ed597346fa0e5095b3ea3d87f7ac2614bee427d4e23f38193b6" exitCode=0 Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.966432 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z92m7" event={"ID":"956dc850-ee41-4d30-b209-2f3270c376a5","Type":"ContainerDied","Data":"af26c0120b339ed597346fa0e5095b3ea3d87f7ac2614bee427d4e23f38193b6"} Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.971403 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kt7t" event={"ID":"a5bc48be-ea95-47a5-b804-ac4dc8df958b","Type":"ContainerStarted","Data":"b837e5e45befaa4813f144419fb81921b55bb6cd8dc49e3a044479fdcc003f11"} Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.972973 4756 generic.go:334] "Generic (PLEG): container finished" podID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerID="9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c" exitCode=0 Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.973024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfrgx" event={"ID":"7ff82294-1c1b-4fdb-8644-d3629d847a36","Type":"ContainerDied","Data":"9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c"} Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.973042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfrgx" event={"ID":"7ff82294-1c1b-4fdb-8644-d3629d847a36","Type":"ContainerStarted","Data":"2748014b90449f7622f2ad326fe5281c66e9bcc1991e6dd2bf21058712200dbd"} Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.982039 
4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wckzw" event={"ID":"3708be10-919d-4500-8325-90de029cbb66","Type":"ContainerStarted","Data":"2575011a77ca5571e1d00f4d4e4d2a75b5ba1ddb7bb189397ddbc021a6e667be"} Dec 05 16:18:15 crc kubenswrapper[4756]: I1205 16:18:15.982070 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wckzw" event={"ID":"3708be10-919d-4500-8325-90de029cbb66","Type":"ContainerStarted","Data":"0207c3eae8ba162c1c02e30cdf298cae42dbff557f04b2d0892e7111082b57d9"} Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.014247 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wckzw" podStartSLOduration=12.01423382 podStartE2EDuration="12.01423382s" podCreationTimestamp="2025-12-05 16:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:16.012185972 +0000 UTC m=+156.250859484" watchObservedRunningTime="2025-12-05 16:18:16.01423382 +0000 UTC m=+156.252907332" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.078032 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.126680 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:16 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:16 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:16 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.126729 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.142616 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hxxk6"] Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.239787 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.253956 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.257982 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.259871 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.264189 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.266122 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.372302 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.372888 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.448953 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-z9xhd container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.448999 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z9xhd" podUID="6013c2d1-fe3a-4d8d-ba47-132ad91a6a34" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.449727 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-z9xhd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.449774 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z9xhd" podUID="6013c2d1-fe3a-4d8d-ba47-132ad91a6a34" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.474126 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.474179 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.474309 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.480851 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.480903 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.488907 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.488943 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.493458 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.495680 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.511560 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.537100 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.537158 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.542224 4756 patch_prober.go:28] interesting pod/console-f9d7485db-wld6x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.542276 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wld6x" podUID="351d65a8-44a7-4f84-be48-1b5204422e8b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.679467 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.686093 4756 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:16 crc kubenswrapper[4756]: I1205 16:18:16.957219 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.014790 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" event={"ID":"9daff7ba-fe02-4b65-8db3-26fb89ff1c25","Type":"ContainerStarted","Data":"a0556efbe5aba168f5ccb0228a0cd0f138f5cc3098bee6613ca39f6e15bfb069"} Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.014843 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" event={"ID":"9daff7ba-fe02-4b65-8db3-26fb89ff1c25","Type":"ContainerStarted","Data":"07636d37957ece3848e8d468083e9438324df0bbc00cef8959144f66f23cc833"} Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.015258 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.022260 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58","Type":"ContainerStarted","Data":"58e647fc6c9fa6c046c6913d25ca064ba6229993280d1cf0902051d1e8cc393d"} Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.025632 4756 generic.go:334] "Generic (PLEG): container finished" podID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerID="8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c" exitCode=0 Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.026040 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kt7t" event={"ID":"a5bc48be-ea95-47a5-b804-ac4dc8df958b","Type":"ContainerDied","Data":"8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c"} Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.033766 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hqh5j" Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.034045 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zr8n" Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.054847 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" podStartSLOduration=133.054832473 podStartE2EDuration="2m13.054832473s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:18:17.037175433 +0000 UTC m=+157.275848955" watchObservedRunningTime="2025-12-05 16:18:17.054832473 +0000 UTC m=+157.293505985" Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.121449 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.132850 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Dec 05 16:18:17 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:17 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:17 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:17 crc kubenswrapper[4756]: I1205 16:18:17.132913 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:18 crc kubenswrapper[4756]: I1205 16:18:18.038829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58","Type":"ContainerStarted","Data":"6dcf5600c8eabf58c58716a30828eff95c090ee381c79c289e508913530e62c4"} Dec 05 16:18:18 crc kubenswrapper[4756]: I1205 16:18:18.133625 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:18 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:18 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:18 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:18 crc kubenswrapper[4756]: I1205 16:18:18.133690 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.054133 4756 generic.go:334] "Generic (PLEG): container finished" podID="0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58" containerID="6dcf5600c8eabf58c58716a30828eff95c090ee381c79c289e508913530e62c4" exitCode=0 Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.054227 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58","Type":"ContainerDied","Data":"6dcf5600c8eabf58c58716a30828eff95c090ee381c79c289e508913530e62c4"} Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.122160 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:19 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:19 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:19 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.122209 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.321315 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.322582 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.325663 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.325909 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.326116 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.430369 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312153b2-8409-49aa-a910-24f5c72bea93-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"312153b2-8409-49aa-a910-24f5c72bea93\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.430564 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312153b2-8409-49aa-a910-24f5c72bea93-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"312153b2-8409-49aa-a910-24f5c72bea93\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.532755 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312153b2-8409-49aa-a910-24f5c72bea93-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"312153b2-8409-49aa-a910-24f5c72bea93\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.532819 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312153b2-8409-49aa-a910-24f5c72bea93-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"312153b2-8409-49aa-a910-24f5c72bea93\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.532975 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312153b2-8409-49aa-a910-24f5c72bea93-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"312153b2-8409-49aa-a910-24f5c72bea93\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.748162 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312153b2-8409-49aa-a910-24f5c72bea93-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"312153b2-8409-49aa-a910-24f5c72bea93\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:19 crc kubenswrapper[4756]: I1205 16:18:19.944939 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.124145 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:20 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:20 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:20 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.124229 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.382225 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.464080 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kube-api-access\") pod \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\" (UID: \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\") " Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.464207 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kubelet-dir\") pod \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\" (UID: \"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58\") " Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.464369 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58" (UID: "0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.464523 4756 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.469731 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58" (UID: "0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.553788 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 16:18:20 crc kubenswrapper[4756]: I1205 16:18:20.566261 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:20 crc kubenswrapper[4756]: W1205 16:18:20.601750 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod312153b2_8409_49aa_a910_24f5c72bea93.slice/crio-f732caa833d04dbc297db59dcf2afeb5794d14f02607dfe86ce4e1670f0b5598 WatchSource:0}: Error finding container f732caa833d04dbc297db59dcf2afeb5794d14f02607dfe86ce4e1670f0b5598: Status 404 returned error can't find the container with id f732caa833d04dbc297db59dcf2afeb5794d14f02607dfe86ce4e1670f0b5598 Dec 05 16:18:21 crc kubenswrapper[4756]: I1205 16:18:21.090895 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 16:18:21 crc kubenswrapper[4756]: I1205 16:18:21.090978 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58","Type":"ContainerDied","Data":"58e647fc6c9fa6c046c6913d25ca064ba6229993280d1cf0902051d1e8cc393d"} Dec 05 16:18:21 crc kubenswrapper[4756]: I1205 16:18:21.091452 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e647fc6c9fa6c046c6913d25ca064ba6229993280d1cf0902051d1e8cc393d" Dec 05 16:18:21 crc kubenswrapper[4756]: I1205 16:18:21.092509 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"312153b2-8409-49aa-a910-24f5c72bea93","Type":"ContainerStarted","Data":"f732caa833d04dbc297db59dcf2afeb5794d14f02607dfe86ce4e1670f0b5598"} Dec 05 16:18:21 crc kubenswrapper[4756]: I1205 16:18:21.123272 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:21 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:21 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:21 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:21 crc kubenswrapper[4756]: I1205 16:18:21.123338 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:21 crc kubenswrapper[4756]: E1205 16:18:21.202496 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod0dc68d4f_b3cc_4d2d_bc46_d968b4b20a58.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod0dc68d4f_b3cc_4d2d_bc46_d968b4b20a58.slice/crio-58e647fc6c9fa6c046c6913d25ca064ba6229993280d1cf0902051d1e8cc393d\": RecentStats: unable to find data in memory cache]" Dec 05 16:18:22 crc kubenswrapper[4756]: I1205 16:18:22.106279 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"312153b2-8409-49aa-a910-24f5c72bea93","Type":"ContainerStarted","Data":"d939a4f1d94ca1e27c490fcda58855d5d52cdec3bdb83a3db1ca68267cbe3c9e"} Dec 05 16:18:22 crc kubenswrapper[4756]: I1205 16:18:22.125185 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:22 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:22 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:22 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:22 crc kubenswrapper[4756]: I1205 16:18:22.125252 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:22 crc kubenswrapper[4756]: I1205 16:18:22.274589 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6kmfx" Dec 05 16:18:22 crc kubenswrapper[4756]: I1205 16:18:22.296815 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:18:22 crc kubenswrapper[4756]: I1205 16:18:22.296871 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:18:23 crc kubenswrapper[4756]: I1205 16:18:23.123074 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:23 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:23 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:23 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:23 crc kubenswrapper[4756]: I1205 16:18:23.123156 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:23 crc kubenswrapper[4756]: I1205 16:18:23.123840 4756 generic.go:334] "Generic (PLEG): container finished" podID="312153b2-8409-49aa-a910-24f5c72bea93" containerID="d939a4f1d94ca1e27c490fcda58855d5d52cdec3bdb83a3db1ca68267cbe3c9e" exitCode=0 Dec 05 16:18:23 crc kubenswrapper[4756]: I1205 16:18:23.123893 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"312153b2-8409-49aa-a910-24f5c72bea93","Type":"ContainerDied","Data":"d939a4f1d94ca1e27c490fcda58855d5d52cdec3bdb83a3db1ca68267cbe3c9e"} Dec 05 16:18:24 crc kubenswrapper[4756]: I1205 16:18:24.122344 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:24 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:24 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:24 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:24 crc kubenswrapper[4756]: I1205 16:18:24.122399 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:25 crc kubenswrapper[4756]: I1205 16:18:25.121719 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:25 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:25 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:25 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:25 crc kubenswrapper[4756]: I1205 16:18:25.121989 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:26 crc kubenswrapper[4756]: I1205 16:18:26.122158 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:26 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:26 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:26 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:26 crc kubenswrapper[4756]: I1205 16:18:26.122213 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:26 crc kubenswrapper[4756]: I1205 16:18:26.454574 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-z9xhd" Dec 05 16:18:26 crc kubenswrapper[4756]: I1205 16:18:26.536795 4756 patch_prober.go:28] interesting pod/console-f9d7485db-wld6x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 05 16:18:26 crc kubenswrapper[4756]: I1205 16:18:26.537079 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wld6x" podUID="351d65a8-44a7-4f84-be48-1b5204422e8b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 05 16:18:26 crc kubenswrapper[4756]: I1205 16:18:26.659404 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " 
pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:18:26 crc kubenswrapper[4756]: I1205 16:18:26.669093 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a3ee468-140c-408e-80a1-aa8fe3fc0acc-metrics-certs\") pod \"network-metrics-daemon-8lsn9\" (UID: \"7a3ee468-140c-408e-80a1-aa8fe3fc0acc\") " pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:18:26 crc kubenswrapper[4756]: I1205 16:18:26.872585 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lsn9" Dec 05 16:18:27 crc kubenswrapper[4756]: I1205 16:18:27.125846 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 16:18:27 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Dec 05 16:18:27 crc kubenswrapper[4756]: [+]process-running ok Dec 05 16:18:27 crc kubenswrapper[4756]: healthz check failed Dec 05 16:18:27 crc kubenswrapper[4756]: I1205 16:18:27.127704 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.124367 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.127238 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pt8fk" Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.314481 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.382086 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312153b2-8409-49aa-a910-24f5c72bea93-kube-api-access\") pod \"312153b2-8409-49aa-a910-24f5c72bea93\" (UID: \"312153b2-8409-49aa-a910-24f5c72bea93\") " Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.382193 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312153b2-8409-49aa-a910-24f5c72bea93-kubelet-dir\") pod \"312153b2-8409-49aa-a910-24f5c72bea93\" (UID: \"312153b2-8409-49aa-a910-24f5c72bea93\") " Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.382496 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/312153b2-8409-49aa-a910-24f5c72bea93-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "312153b2-8409-49aa-a910-24f5c72bea93" (UID: "312153b2-8409-49aa-a910-24f5c72bea93"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.385621 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312153b2-8409-49aa-a910-24f5c72bea93-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "312153b2-8409-49aa-a910-24f5c72bea93" (UID: "312153b2-8409-49aa-a910-24f5c72bea93"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.483291 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312153b2-8409-49aa-a910-24f5c72bea93-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:28 crc kubenswrapper[4756]: I1205 16:18:28.483332 4756 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312153b2-8409-49aa-a910-24f5c72bea93-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 16:18:29 crc kubenswrapper[4756]: I1205 16:18:29.163666 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"312153b2-8409-49aa-a910-24f5c72bea93","Type":"ContainerDied","Data":"f732caa833d04dbc297db59dcf2afeb5794d14f02607dfe86ce4e1670f0b5598"} Dec 05 16:18:29 crc kubenswrapper[4756]: I1205 16:18:29.163710 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f732caa833d04dbc297db59dcf2afeb5794d14f02607dfe86ce4e1670f0b5598" Dec 05 16:18:29 crc kubenswrapper[4756]: I1205 16:18:29.163782 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 16:18:35 crc kubenswrapper[4756]: I1205 16:18:35.657620 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:18:36 crc kubenswrapper[4756]: I1205 16:18:36.543517 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:36 crc kubenswrapper[4756]: I1205 16:18:36.549675 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:18:41 crc kubenswrapper[4756]: E1205 16:18:41.379506 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 16:18:41 crc kubenswrapper[4756]: E1205 16:18:41.380380 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5jfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-z92m7_openshift-marketplace(956dc850-ee41-4d30-b209-2f3270c376a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 16:18:41 crc kubenswrapper[4756]: E1205 16:18:41.381601 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-z92m7" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" Dec 05 16:18:46 crc kubenswrapper[4756]: I1205 16:18:46.133164 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 16:18:47 crc kubenswrapper[4756]: I1205 16:18:47.797905 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6zs9x" Dec 05 16:18:52 crc kubenswrapper[4756]: I1205 16:18:52.295696 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:18:52 crc kubenswrapper[4756]: I1205 16:18:52.295796 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.121871 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 16:18:55 crc kubenswrapper[4756]: E1205 16:18:55.122335 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312153b2-8409-49aa-a910-24f5c72bea93" containerName="pruner" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 
16:18:55.122350 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="312153b2-8409-49aa-a910-24f5c72bea93" containerName="pruner" Dec 05 16:18:55 crc kubenswrapper[4756]: E1205 16:18:55.122361 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58" containerName="pruner" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.122369 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58" containerName="pruner" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.122501 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc68d4f-b3cc-4d2d-bc46-d968b4b20a58" containerName="pruner" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.122519 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="312153b2-8409-49aa-a910-24f5c72bea93" containerName="pruner" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.122969 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.126339 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.131712 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.135610 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.256456 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85782b86-3bb4-4ae4-a706-654876b73aec-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"85782b86-3bb4-4ae4-a706-654876b73aec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.256549 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85782b86-3bb4-4ae4-a706-654876b73aec-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"85782b86-3bb4-4ae4-a706-654876b73aec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.358155 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85782b86-3bb4-4ae4-a706-654876b73aec-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"85782b86-3bb4-4ae4-a706-654876b73aec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.358231 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85782b86-3bb4-4ae4-a706-654876b73aec-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"85782b86-3bb4-4ae4-a706-654876b73aec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.358409 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85782b86-3bb4-4ae4-a706-654876b73aec-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"85782b86-3bb4-4ae4-a706-654876b73aec\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.381376 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85782b86-3bb4-4ae4-a706-654876b73aec-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"85782b86-3bb4-4ae4-a706-654876b73aec\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:18:55 crc kubenswrapper[4756]: I1205 16:18:55.455120 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:18:55 crc kubenswrapper[4756]: E1205 16:18:55.580665 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 16:18:55 crc kubenswrapper[4756]: E1205 16:18:55.580999 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-759hf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-crlcr_openshift-marketplace(95c208d2-1ef6-4716-8053-ccbabcd7fe59): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 16:18:55 crc kubenswrapper[4756]: E1205 16:18:55.582197 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-crlcr" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" Dec 05 16:18:55 crc kubenswrapper[4756]: E1205 16:18:55.617824 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" 
Dec 05 16:18:55 crc kubenswrapper[4756]: E1205 16:18:55.617969 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jsk6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-w6h66_openshift-marketplace(1bd03756-cb80-40f8-b5eb-c936c80c6c21): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 16:18:55 crc kubenswrapper[4756]: E1205 16:18:55.619143 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-w6h66" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.732540 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.733975 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.736261 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.822200 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-var-lock\") pod \"installer-9-crc\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.822253 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.822289 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bfa691e-5839-4735-9378-d06063656d75-kube-api-access\") pod \"installer-9-crc\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.923457 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-var-lock\") pod \"installer-9-crc\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.923501 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.923536 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bfa691e-5839-4735-9378-d06063656d75-kube-api-access\") pod \"installer-9-crc\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.923582 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-var-lock\") pod \"installer-9-crc\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.923650 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:00 crc kubenswrapper[4756]: I1205 16:19:00.946354 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bfa691e-5839-4735-9378-d06063656d75-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"0bfa691e-5839-4735-9378-d06063656d75\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:01 crc kubenswrapper[4756]: I1205 16:19:01.054736 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.244821 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-crlcr" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.244821 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-w6h66" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.335825 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.335976 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g9n9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qndg5_openshift-marketplace(9805087f-2ab0-4780-9ee1-37f58467e8a2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.337114 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qndg5" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.348819 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.348969 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-788p4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-6kt7t_openshift-marketplace(a5bc48be-ea95-47a5-b804-ac4dc8df958b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.350142 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6kt7t" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.353869 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.354006 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mb8ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wm9rv_openshift-marketplace(5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.355180 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wm9rv" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.393205 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.393342 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmlgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nqkjr_openshift-marketplace(bd889233-fc28-41f6-806f-3f7ab7670bca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 16:19:01 crc kubenswrapper[4756]: E1205 16:19:01.395921 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nqkjr" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" Dec 05 16:19:01 crc kubenswrapper[4756]: I1205 16:19:01.754891 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8lsn9"] Dec 05 16:19:01 crc kubenswrapper[4756]: I1205 16:19:01.800802 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 16:19:01 crc kubenswrapper[4756]: W1205 16:19:01.821356 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod85782b86_3bb4_4ae4_a706_654876b73aec.slice/crio-0c5ad74f4100f6276fd76cf03df3139ab334de97a8801e52dc1d7cf702215c37 WatchSource:0}: Error finding container 0c5ad74f4100f6276fd76cf03df3139ab334de97a8801e52dc1d7cf702215c37: Status 404 returned error can't find the container with id 0c5ad74f4100f6276fd76cf03df3139ab334de97a8801e52dc1d7cf702215c37 Dec 05 16:19:01 crc kubenswrapper[4756]: W1205 16:19:01.821636 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3ee468_140c_408e_80a1_aa8fe3fc0acc.slice/crio-a9266ef56c3f7808d59f4e4bc58c92c90d570568bb39d6c45b901a77869bd7d1 WatchSource:0}: Error finding container a9266ef56c3f7808d59f4e4bc58c92c90d570568bb39d6c45b901a77869bd7d1: Status 404 returned error can't find the container with id a9266ef56c3f7808d59f4e4bc58c92c90d570568bb39d6c45b901a77869bd7d1 Dec 05 16:19:01 crc kubenswrapper[4756]: I1205 16:19:01.866182 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 16:19:01 crc 
kubenswrapper[4756]: W1205 16:19:01.879729 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0bfa691e_5839_4735_9378_d06063656d75.slice/crio-4ba278cff331963bffe360a8f071c63a514475ade7e5e2fb64b99231f91310ef WatchSource:0}: Error finding container 4ba278cff331963bffe360a8f071c63a514475ade7e5e2fb64b99231f91310ef: Status 404 returned error can't find the container with id 4ba278cff331963bffe360a8f071c63a514475ade7e5e2fb64b99231f91310ef Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.368106 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0bfa691e-5839-4735-9378-d06063656d75","Type":"ContainerStarted","Data":"c7f612dc8b7202136ee842eb5def2542d67b330ba2a6a74195cd6c4730e2aafc"} Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.368507 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0bfa691e-5839-4735-9378-d06063656d75","Type":"ContainerStarted","Data":"4ba278cff331963bffe360a8f071c63a514475ade7e5e2fb64b99231f91310ef"} Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.370494 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" event={"ID":"7a3ee468-140c-408e-80a1-aa8fe3fc0acc","Type":"ContainerStarted","Data":"dcd9b934e03a7302a9c4addf5b7c045156cf03ed3e62363eb6c0795062c381cd"} Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.370532 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" event={"ID":"7a3ee468-140c-408e-80a1-aa8fe3fc0acc","Type":"ContainerStarted","Data":"71609498d80f075b4febb13909ef911f41f79b721335f5210a1a682f9c520ed5"} Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.370543 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8lsn9" event={"ID":"7a3ee468-140c-408e-80a1-aa8fe3fc0acc","Type":"ContainerStarted","Data":"a9266ef56c3f7808d59f4e4bc58c92c90d570568bb39d6c45b901a77869bd7d1"} Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.372700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"85782b86-3bb4-4ae4-a706-654876b73aec","Type":"ContainerStarted","Data":"c46da39123932473584162221814fa081bfda06b6200a33836e363b7b74863f0"} Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.372733 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"85782b86-3bb4-4ae4-a706-654876b73aec","Type":"ContainerStarted","Data":"0c5ad74f4100f6276fd76cf03df3139ab334de97a8801e52dc1d7cf702215c37"} Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.375799 4756 generic.go:334] "Generic (PLEG): container finished" podID="956dc850-ee41-4d30-b209-2f3270c376a5" containerID="1bc8b34620d87755dbb185323ddae8813a6ba4c9db612bcd619cb68560326592" exitCode=0 Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.375873 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z92m7" event={"ID":"956dc850-ee41-4d30-b209-2f3270c376a5","Type":"ContainerDied","Data":"1bc8b34620d87755dbb185323ddae8813a6ba4c9db612bcd619cb68560326592"} Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.378267 4756 generic.go:334] "Generic (PLEG): container finished" podID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerID="781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09" 
exitCode=0 Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.379183 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfrgx" event={"ID":"7ff82294-1c1b-4fdb-8644-d3629d847a36","Type":"ContainerDied","Data":"781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09"} Dec 05 16:19:02 crc kubenswrapper[4756]: E1205 16:19:02.380361 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nqkjr" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" Dec 05 16:19:02 crc kubenswrapper[4756]: E1205 16:19:02.380842 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6kt7t" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" Dec 05 16:19:02 crc kubenswrapper[4756]: E1205 16:19:02.381729 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qndg5" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" Dec 05 16:19:02 crc kubenswrapper[4756]: E1205 16:19:02.381833 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wm9rv" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.390075 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.390052897 podStartE2EDuration="2.390052897s" podCreationTimestamp="2025-12-05 16:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:19:02.385404308 +0000 UTC m=+202.624077850" watchObservedRunningTime="2025-12-05 16:19:02.390052897 +0000 UTC m=+202.628726439" Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.425073 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=7.425048128 podStartE2EDuration="7.425048128s" podCreationTimestamp="2025-12-05 16:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:19:02.419068692 +0000 UTC m=+202.657742244" watchObservedRunningTime="2025-12-05 16:19:02.425048128 +0000 UTC m=+202.663721650" Dec 05 16:19:02 crc kubenswrapper[4756]: I1205 16:19:02.535691 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8lsn9" podStartSLOduration=178.535669364 podStartE2EDuration="2m58.535669364s" podCreationTimestamp="2025-12-05 16:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:19:02.535391526 +0000 UTC m=+202.774065058" 
watchObservedRunningTime="2025-12-05 16:19:02.535669364 +0000 UTC m=+202.774342876" Dec 05 16:19:03 crc kubenswrapper[4756]: I1205 16:19:03.393469 4756 generic.go:334] "Generic (PLEG): container finished" podID="85782b86-3bb4-4ae4-a706-654876b73aec" containerID="c46da39123932473584162221814fa081bfda06b6200a33836e363b7b74863f0" exitCode=0 Dec 05 16:19:03 crc kubenswrapper[4756]: I1205 16:19:03.394179 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"85782b86-3bb4-4ae4-a706-654876b73aec","Type":"ContainerDied","Data":"c46da39123932473584162221814fa081bfda06b6200a33836e363b7b74863f0"} Dec 05 16:19:03 crc kubenswrapper[4756]: I1205 16:19:03.398123 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z92m7" event={"ID":"956dc850-ee41-4d30-b209-2f3270c376a5","Type":"ContainerStarted","Data":"42a3a1556de1882d1dd77108e70f66eb804737ab143faecb1ee2e1981dd7a818"} Dec 05 16:19:03 crc kubenswrapper[4756]: I1205 16:19:03.401136 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfrgx" event={"ID":"7ff82294-1c1b-4fdb-8644-d3629d847a36","Type":"ContainerStarted","Data":"cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2"} Dec 05 16:19:03 crc kubenswrapper[4756]: I1205 16:19:03.425001 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tfrgx" podStartSLOduration=2.631205183 podStartE2EDuration="49.424983795s" podCreationTimestamp="2025-12-05 16:18:14 +0000 UTC" firstStartedPulling="2025-12-05 16:18:16.004987278 +0000 UTC m=+156.243660790" lastFinishedPulling="2025-12-05 16:19:02.79876589 +0000 UTC m=+203.037439402" observedRunningTime="2025-12-05 16:19:03.423619637 +0000 UTC m=+203.662293159" watchObservedRunningTime="2025-12-05 16:19:03.424983795 +0000 UTC m=+203.663657307" Dec 05 16:19:03 crc kubenswrapper[4756]: I1205 16:19:03.440315 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z92m7" podStartSLOduration=2.552285443 podStartE2EDuration="50.440295799s" podCreationTimestamp="2025-12-05 16:18:13 +0000 UTC" firstStartedPulling="2025-12-05 16:18:14.954660671 +0000 UTC m=+155.193334183" lastFinishedPulling="2025-12-05 16:19:02.842671027 +0000 UTC m=+203.081344539" observedRunningTime="2025-12-05 16:19:03.439350763 +0000 UTC m=+203.678024275" watchObservedRunningTime="2025-12-05 16:19:03.440295799 +0000 UTC m=+203.678969311" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.203577 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.203972 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.583009 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.583061 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.677179 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.767138 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.769057 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.799492 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85782b86-3bb4-4ae4-a706-654876b73aec-kube-api-access\") pod \"85782b86-3bb4-4ae4-a706-654876b73aec\" (UID: \"85782b86-3bb4-4ae4-a706-654876b73aec\") " Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.799583 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85782b86-3bb4-4ae4-a706-654876b73aec-kubelet-dir\") pod \"85782b86-3bb4-4ae4-a706-654876b73aec\" (UID: \"85782b86-3bb4-4ae4-a706-654876b73aec\") " Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.799967 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85782b86-3bb4-4ae4-a706-654876b73aec-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "85782b86-3bb4-4ae4-a706-654876b73aec" (UID: "85782b86-3bb4-4ae4-a706-654876b73aec"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.814638 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85782b86-3bb4-4ae4-a706-654876b73aec-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "85782b86-3bb4-4ae4-a706-654876b73aec" (UID: "85782b86-3bb4-4ae4-a706-654876b73aec"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.900850 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85782b86-3bb4-4ae4-a706-654876b73aec-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:04 crc kubenswrapper[4756]: I1205 16:19:04.900890 4756 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85782b86-3bb4-4ae4-a706-654876b73aec-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:05 crc kubenswrapper[4756]: I1205 16:19:05.413294 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"85782b86-3bb4-4ae4-a706-654876b73aec","Type":"ContainerDied","Data":"0c5ad74f4100f6276fd76cf03df3139ab334de97a8801e52dc1d7cf702215c37"} Dec 05 16:19:05 crc kubenswrapper[4756]: I1205 16:19:05.413350 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 16:19:05 crc kubenswrapper[4756]: I1205 16:19:05.413352 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c5ad74f4100f6276fd76cf03df3139ab334de97a8801e52dc1d7cf702215c37" Dec 05 16:19:14 crc kubenswrapper[4756]: I1205 16:19:14.255711 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:19:14 crc kubenswrapper[4756]: I1205 16:19:14.635794 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:19:15 crc kubenswrapper[4756]: I1205 16:19:15.460677 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqkjr" event={"ID":"bd889233-fc28-41f6-806f-3f7ab7670bca","Type":"ContainerStarted","Data":"32106228a801f0d43ed68ef387867d502b994c0d5400b8d4f6c859c4579a0c15"} Dec 05 16:19:15 crc kubenswrapper[4756]: I1205 16:19:15.463283 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm9rv" event={"ID":"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d","Type":"ContainerStarted","Data":"175bae53c0b41360dfffe8fbb20b408fcd6f6cba6f57662dd4134a499c704230"} Dec 05 16:19:15 crc kubenswrapper[4756]: I1205 16:19:15.465774 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kt7t" event={"ID":"a5bc48be-ea95-47a5-b804-ac4dc8df958b","Type":"ContainerStarted","Data":"43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b"} Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.474331 4756 generic.go:334] "Generic (PLEG): container finished" podID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerID="32106228a801f0d43ed68ef387867d502b994c0d5400b8d4f6c859c4579a0c15" exitCode=0 Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.474432 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqkjr" event={"ID":"bd889233-fc28-41f6-806f-3f7ab7670bca","Type":"ContainerDied","Data":"32106228a801f0d43ed68ef387867d502b994c0d5400b8d4f6c859c4579a0c15"} Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.477700 4756 generic.go:334] "Generic (PLEG): container finished" podID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerID="175bae53c0b41360dfffe8fbb20b408fcd6f6cba6f57662dd4134a499c704230" exitCode=0 Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.477790 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm9rv" event={"ID":"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d","Type":"ContainerDied","Data":"175bae53c0b41360dfffe8fbb20b408fcd6f6cba6f57662dd4134a499c704230"} Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.479834 4756 generic.go:334] "Generic (PLEG): container finished" podID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerID="11e98e6644303fe9bb529fc5ba0d3e5258979403f00df73e8d92a3ceb93d9702" exitCode=0 Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.479880 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crlcr" event={"ID":"95c208d2-1ef6-4716-8053-ccbabcd7fe59","Type":"ContainerDied","Data":"11e98e6644303fe9bb529fc5ba0d3e5258979403f00df73e8d92a3ceb93d9702"} Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.482382 4756 generic.go:334] "Generic (PLEG): container finished" podID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" 
containerID="43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b" exitCode=0 Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.482469 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kt7t" event={"ID":"a5bc48be-ea95-47a5-b804-ac4dc8df958b","Type":"ContainerDied","Data":"43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b"} Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.512946 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfrgx"] Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.513226 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tfrgx" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerName="registry-server" containerID="cri-o://cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2" gracePeriod=2 Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.863567 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.956691 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-catalog-content\") pod \"7ff82294-1c1b-4fdb-8644-d3629d847a36\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.957164 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqjrx\" (UniqueName: \"kubernetes.io/projected/7ff82294-1c1b-4fdb-8644-d3629d847a36-kube-api-access-mqjrx\") pod \"7ff82294-1c1b-4fdb-8644-d3629d847a36\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.957254 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-utilities\") pod \"7ff82294-1c1b-4fdb-8644-d3629d847a36\" (UID: \"7ff82294-1c1b-4fdb-8644-d3629d847a36\") " Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.958005 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-utilities" (OuterVolumeSpecName: "utilities") pod "7ff82294-1c1b-4fdb-8644-d3629d847a36" (UID: "7ff82294-1c1b-4fdb-8644-d3629d847a36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.964568 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff82294-1c1b-4fdb-8644-d3629d847a36-kube-api-access-mqjrx" (OuterVolumeSpecName: "kube-api-access-mqjrx") pod "7ff82294-1c1b-4fdb-8644-d3629d847a36" (UID: "7ff82294-1c1b-4fdb-8644-d3629d847a36"). InnerVolumeSpecName "kube-api-access-mqjrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:19:16 crc kubenswrapper[4756]: I1205 16:19:16.983075 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ff82294-1c1b-4fdb-8644-d3629d847a36" (UID: "7ff82294-1c1b-4fdb-8644-d3629d847a36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.058733 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.058825 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqjrx\" (UniqueName: \"kubernetes.io/projected/7ff82294-1c1b-4fdb-8644-d3629d847a36-kube-api-access-mqjrx\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.059341 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff82294-1c1b-4fdb-8644-d3629d847a36-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.488912 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm9rv" event={"ID":"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d","Type":"ContainerStarted","Data":"a594d1535fe50b0e0810f90de0d896de51a3a43797f8f58ff1bf4ff1a97f8d19"} Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.491769 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crlcr" event={"ID":"95c208d2-1ef6-4716-8053-ccbabcd7fe59","Type":"ContainerStarted","Data":"a1b7f1d8ead78b7fdbfc35ba19576984330abbc34adaa14e957409f7cf995a55"} Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.494055 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kt7t" event={"ID":"a5bc48be-ea95-47a5-b804-ac4dc8df958b","Type":"ContainerStarted","Data":"7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275"} Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.496653 4756 generic.go:334] "Generic (PLEG): container finished" podID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerID="cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2" exitCode=0 Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.496716 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfrgx" event={"ID":"7ff82294-1c1b-4fdb-8644-d3629d847a36","Type":"ContainerDied","Data":"cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2"} Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.496744 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfrgx" event={"ID":"7ff82294-1c1b-4fdb-8644-d3629d847a36","Type":"ContainerDied","Data":"2748014b90449f7622f2ad326fe5281c66e9bcc1991e6dd2bf21058712200dbd"} Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.496762 4756 scope.go:117] "RemoveContainer" containerID="cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.496890 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfrgx" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.499953 4756 generic.go:334] "Generic (PLEG): container finished" podID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerID="c4ef9b640030c84adcf3f8d83e0dacb531e12d532de70244f3adaf7816c0baf2" exitCode=0 Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.500009 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6h66" event={"ID":"1bd03756-cb80-40f8-b5eb-c936c80c6c21","Type":"ContainerDied","Data":"c4ef9b640030c84adcf3f8d83e0dacb531e12d532de70244f3adaf7816c0baf2"} Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.504346 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqkjr" event={"ID":"bd889233-fc28-41f6-806f-3f7ab7670bca","Type":"ContainerStarted","Data":"2c281d4a61f0ad2ee3cb4ff011ef7ecb929af6cc25bbdbdd33e68b4dcd60734c"} Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.513796 4756 scope.go:117] "RemoveContainer" containerID="781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.518224 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wm9rv" podStartSLOduration=2.644665846 podStartE2EDuration="1m3.518205353s" podCreationTimestamp="2025-12-05 16:18:14 +0000 UTC" firstStartedPulling="2025-12-05 16:18:16.00470229 +0000 UTC m=+156.243375802" lastFinishedPulling="2025-12-05 16:19:16.878241797 +0000 UTC m=+217.116915309" observedRunningTime="2025-12-05 16:19:17.511219509 +0000 UTC m=+217.749893031" watchObservedRunningTime="2025-12-05 16:19:17.518205353 +0000 UTC m=+217.756878885" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.535710 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6kt7t" podStartSLOduration=2.63418973 podStartE2EDuration="1m2.535694148s" podCreationTimestamp="2025-12-05 16:18:15 +0000 UTC" firstStartedPulling="2025-12-05 16:18:17.04097029 +0000 UTC m=+157.279643802" lastFinishedPulling="2025-12-05 16:19:16.942474708 +0000 UTC m=+217.181148220" observedRunningTime="2025-12-05 16:19:17.533220659 +0000 UTC m=+217.771894171" watchObservedRunningTime="2025-12-05 16:19:17.535694148 +0000 UTC m=+217.774367660" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.543745 4756 scope.go:117] "RemoveContainer" containerID="9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.560222 4756 scope.go:117] "RemoveContainer" containerID="cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.566583 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-crlcr" podStartSLOduration=3.343045558 podStartE2EDuration="1m6.566567514s" podCreationTimestamp="2025-12-05 16:18:11 +0000 UTC" firstStartedPulling="2025-12-05 16:18:13.839250008 +0000 UTC m=+154.077923520" lastFinishedPulling="2025-12-05 16:19:17.062771964 +0000 UTC m=+217.301445476" observedRunningTime="2025-12-05 16:19:17.553997525 +0000 UTC m=+217.792671027" watchObservedRunningTime="2025-12-05 16:19:17.566567514 +0000 UTC m=+217.805241026" Dec 05 16:19:17 crc kubenswrapper[4756]: E1205 16:19:17.569061 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2\": container with ID starting with cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2 not found: ID does not exist" containerID="cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.569105 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2"} err="failed to get container status \"cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2\": rpc error: code = NotFound desc = could not find container \"cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2\": container with ID starting with cb0b9d15f579efdd73b8d34d314a6ff690ab0bf46e4d61d1fc3903d2e3cb34b2 not found: ID does not exist" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.569148 4756 scope.go:117] "RemoveContainer" containerID="781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09" Dec 05 16:19:17 crc kubenswrapper[4756]: E1205 16:19:17.569510 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09\": container with ID starting with 781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09 not found: ID does not exist" containerID="781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.569543 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09"} err="failed to get container status \"781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09\": rpc error: code = NotFound desc = could not find container \"781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09\": container with ID starting with 781c55bc368a026ba5228d05fbba4298c6067ebdd93c4468d8c6e23c0e309c09 not found: ID does not exist" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.569566 4756 scope.go:117] "RemoveContainer" containerID="9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c" Dec 05 16:19:17 crc kubenswrapper[4756]: E1205 16:19:17.569977 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c\": container with ID starting with 9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c not found: ID does not exist" containerID="9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.570001 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c"} err="failed to get container status \"9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c\": rpc error: code = NotFound desc = could not find container \"9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c\": container with ID starting with 9cfa234e17d6f4bf0b1dd8561f361dd36adf6a00a5cfd33880a406bb9377043c not found: ID does not exist" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.601849 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-nqkjr" podStartSLOduration=2.460626546 podStartE2EDuration="1m5.601832872s" podCreationTimestamp="2025-12-05 16:18:12 +0000 UTC" firstStartedPulling="2025-12-05 16:18:13.819067296 +0000 UTC m=+154.057740808" lastFinishedPulling="2025-12-05 16:19:16.960273622 +0000 UTC m=+217.198947134" observedRunningTime="2025-12-05 16:19:17.596904655 +0000 UTC m=+217.835578187" watchObservedRunningTime="2025-12-05 16:19:17.601832872 +0000 UTC m=+217.840506384" Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.617911 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfrgx"] Dec 05 16:19:17 crc kubenswrapper[4756]: I1205 16:19:17.626316 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfrgx"] Dec 05 16:19:18 crc kubenswrapper[4756]: I1205 16:19:18.059069 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" path="/var/lib/kubelet/pods/7ff82294-1c1b-4fdb-8644-d3629d847a36/volumes" Dec 05 16:19:18 crc kubenswrapper[4756]: I1205 16:19:18.513049 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6h66" event={"ID":"1bd03756-cb80-40f8-b5eb-c936c80c6c21","Type":"ContainerStarted","Data":"49599143ed507b4e0b78b3bc947fe60c84566e490ae2fc67d3a6a3360f4168fa"} Dec 05 16:19:18 crc kubenswrapper[4756]: I1205 16:19:18.534981 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w6h66" podStartSLOduration=3.445376832 podStartE2EDuration="1m7.534964678s" podCreationTimestamp="2025-12-05 16:18:11 +0000 UTC" firstStartedPulling="2025-12-05 16:18:13.847738999 +0000 UTC m=+154.086412511" lastFinishedPulling="2025-12-05 16:19:17.937326845 +0000 UTC m=+218.176000357" observedRunningTime="2025-12-05 16:19:18.532195941 +0000 UTC m=+218.770869483" watchObservedRunningTime="2025-12-05 16:19:18.534964678 +0000 UTC m=+218.773638190" Dec 05 16:19:19 crc kubenswrapper[4756]: I1205 16:19:19.519013 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qndg5" event={"ID":"9805087f-2ab0-4780-9ee1-37f58467e8a2","Type":"ContainerStarted","Data":"fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a"} Dec 05 16:19:20 crc kubenswrapper[4756]: I1205 16:19:20.526070 4756 generic.go:334] "Generic (PLEG): container finished" podID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerID="fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a" exitCode=0 Dec 05 16:19:20 crc kubenswrapper[4756]: I1205 16:19:20.526125 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qndg5" event={"ID":"9805087f-2ab0-4780-9ee1-37f58467e8a2","Type":"ContainerDied","Data":"fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a"} Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.003388 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.006528 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.060416 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:19:22 crc 
kubenswrapper[4756]: I1205 16:19:22.179548 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w6h66"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.179606 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w6h66"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.218203 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w6h66"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.295376 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.295450 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.295499 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.296046 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.296144 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d" gracePeriod=600
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.469986 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nqkjr"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.470319 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nqkjr"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.522821 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nqkjr"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.578283 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nqkjr"
Dec 05 16:19:22 crc kubenswrapper[4756]: I1205 16:19:22.585089 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-crlcr"
Dec 05 16:19:23 crc kubenswrapper[4756]: I1205 16:19:23.549746 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d" exitCode=0
Dec 05 16:19:23 crc kubenswrapper[4756]: I1205 16:19:23.549827 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d"}
Dec 05 16:19:24 crc kubenswrapper[4756]: I1205 16:19:24.902596 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqkjr"]
Dec 05 16:19:25 crc kubenswrapper[4756]: I1205 16:19:25.257791 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:19:25 crc kubenswrapper[4756]: I1205 16:19:25.258315 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:19:25 crc kubenswrapper[4756]: I1205 16:19:25.300554 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:19:25 crc kubenswrapper[4756]: I1205 16:19:25.562354 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nqkjr" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerName="registry-server" containerID="cri-o://2c281d4a61f0ad2ee3cb4ff011ef7ecb929af6cc25bbdbdd33e68b4dcd60734c" gracePeriod=2
Dec 05 16:19:25 crc kubenswrapper[4756]: I1205 16:19:25.588907 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:19:25 crc kubenswrapper[4756]: I1205 16:19:25.588980 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:19:25 crc kubenswrapper[4756]: I1205 16:19:25.621859 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wm9rv"
Dec 05 16:19:25 crc kubenswrapper[4756]: I1205 16:19:25.645998 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:19:26 crc kubenswrapper[4756]: I1205 16:19:26.568633 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"8621722fac0e1b78a92a26f20fd0a72e3bdd08c5777a346841d5033b97d8ac3a"}
Dec 05 16:19:26 crc kubenswrapper[4756]: I1205 16:19:26.571449 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qndg5" event={"ID":"9805087f-2ab0-4780-9ee1-37f58467e8a2","Type":"ContainerStarted","Data":"80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21"}
Dec 05 16:19:26 crc kubenswrapper[4756]: I1205 16:19:26.576470 4756 generic.go:334] "Generic (PLEG): container finished" podID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerID="2c281d4a61f0ad2ee3cb4ff011ef7ecb929af6cc25bbdbdd33e68b4dcd60734c" exitCode=0
Dec 05 16:19:26 crc kubenswrapper[4756]: I1205 16:19:26.577060 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqkjr" event={"ID":"bd889233-fc28-41f6-806f-3f7ab7670bca","Type":"ContainerDied","Data":"2c281d4a61f0ad2ee3cb4ff011ef7ecb929af6cc25bbdbdd33e68b4dcd60734c"}
Dec 05 16:19:26 crc kubenswrapper[4756]: I1205 16:19:26.607824 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qndg5" podStartSLOduration=2.095299753 podStartE2EDuration="1m14.607805454s" podCreationTimestamp="2025-12-05 16:18:12 +0000 UTC" firstStartedPulling="2025-12-05 16:18:13.830680295 +0000 UTC m=+154.069353807" lastFinishedPulling="2025-12-05 16:19:26.343185986 +0000 UTC m=+226.581859508" observedRunningTime="2025-12-05 16:19:26.604908434 +0000 UTC m=+226.843581966" watchObservedRunningTime="2025-12-05 16:19:26.607805454 +0000 UTC m=+226.846478966"
Dec 05 16:19:26 crc kubenswrapper[4756]: I1205 16:19:26.616621 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.051506 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqkjr"
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.183368 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-utilities\") pod \"bd889233-fc28-41f6-806f-3f7ab7670bca\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") "
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.183582 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-catalog-content\") pod \"bd889233-fc28-41f6-806f-3f7ab7670bca\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") "
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.183629 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmlgl\" (UniqueName: \"kubernetes.io/projected/bd889233-fc28-41f6-806f-3f7ab7670bca-kube-api-access-xmlgl\") pod \"bd889233-fc28-41f6-806f-3f7ab7670bca\" (UID: \"bd889233-fc28-41f6-806f-3f7ab7670bca\") "
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.185720 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-utilities" (OuterVolumeSpecName: "utilities") pod "bd889233-fc28-41f6-806f-3f7ab7670bca" (UID: "bd889233-fc28-41f6-806f-3f7ab7670bca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.189217 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd889233-fc28-41f6-806f-3f7ab7670bca-kube-api-access-xmlgl" (OuterVolumeSpecName: "kube-api-access-xmlgl") pod "bd889233-fc28-41f6-806f-3f7ab7670bca" (UID: "bd889233-fc28-41f6-806f-3f7ab7670bca"). InnerVolumeSpecName "kube-api-access-xmlgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.246395 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd889233-fc28-41f6-806f-3f7ab7670bca" (UID: "bd889233-fc28-41f6-806f-3f7ab7670bca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.285132 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.285161 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmlgl\" (UniqueName: \"kubernetes.io/projected/bd889233-fc28-41f6-806f-3f7ab7670bca-kube-api-access-xmlgl\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.285173 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd889233-fc28-41f6-806f-3f7ab7670bca-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.585106 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqkjr" event={"ID":"bd889233-fc28-41f6-806f-3f7ab7670bca","Type":"ContainerDied","Data":"2722033ff00e3a47df036e6bcd9eef00db9dead3b40da665e0ad662450ba575a"}
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.585156 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqkjr"
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.585217 4756 scope.go:117] "RemoveContainer" containerID="2c281d4a61f0ad2ee3cb4ff011ef7ecb929af6cc25bbdbdd33e68b4dcd60734c"
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.612178 4756 scope.go:117] "RemoveContainer" containerID="32106228a801f0d43ed68ef387867d502b994c0d5400b8d4f6c859c4579a0c15"
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.618084 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqkjr"]
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.624692 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nqkjr"]
Dec 05 16:19:27 crc kubenswrapper[4756]: I1205 16:19:27.650254 4756 scope.go:117] "RemoveContainer" containerID="8c10a5f4c7b5459b927d9c240525c0ef93b1a56a4512c9a1a766430dc21dfae3"
Dec 05 16:19:28 crc kubenswrapper[4756]: I1205 16:19:28.056145 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" path="/var/lib/kubelet/pods/bd889233-fc28-41f6-806f-3f7ab7670bca/volumes"
Dec 05 16:19:28 crc kubenswrapper[4756]: I1205 16:19:28.299842 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kt7t"]
Dec 05 16:19:28 crc kubenswrapper[4756]: I1205 16:19:28.593552 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6kt7t" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerName="registry-server" containerID="cri-o://7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275" gracePeriod=2
Dec 05 16:19:28 crc kubenswrapper[4756]: I1205 16:19:28.986129 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.113358 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-788p4\" (UniqueName: \"kubernetes.io/projected/a5bc48be-ea95-47a5-b804-ac4dc8df958b-kube-api-access-788p4\") pod \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") "
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.113571 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-catalog-content\") pod \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") "
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.113730 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-utilities\") pod \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\" (UID: \"a5bc48be-ea95-47a5-b804-ac4dc8df958b\") "
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.114850 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-utilities" (OuterVolumeSpecName: "utilities") pod "a5bc48be-ea95-47a5-b804-ac4dc8df958b" (UID: "a5bc48be-ea95-47a5-b804-ac4dc8df958b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.122244 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bc48be-ea95-47a5-b804-ac4dc8df958b-kube-api-access-788p4" (OuterVolumeSpecName: "kube-api-access-788p4") pod "a5bc48be-ea95-47a5-b804-ac4dc8df958b" (UID: "a5bc48be-ea95-47a5-b804-ac4dc8df958b"). InnerVolumeSpecName "kube-api-access-788p4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.215983 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-788p4\" (UniqueName: \"kubernetes.io/projected/a5bc48be-ea95-47a5-b804-ac4dc8df958b-kube-api-access-788p4\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.216025 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.237867 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5bc48be-ea95-47a5-b804-ac4dc8df958b" (UID: "a5bc48be-ea95-47a5-b804-ac4dc8df958b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.317723 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc48be-ea95-47a5-b804-ac4dc8df958b-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.603306 4756 generic.go:334] "Generic (PLEG): container finished" podID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerID="7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275" exitCode=0
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.603373 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kt7t"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.603367 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kt7t" event={"ID":"a5bc48be-ea95-47a5-b804-ac4dc8df958b","Type":"ContainerDied","Data":"7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275"}
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.603484 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kt7t" event={"ID":"a5bc48be-ea95-47a5-b804-ac4dc8df958b","Type":"ContainerDied","Data":"b837e5e45befaa4813f144419fb81921b55bb6cd8dc49e3a044479fdcc003f11"}
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.603532 4756 scope.go:117] "RemoveContainer" containerID="7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.636998 4756 scope.go:117] "RemoveContainer" containerID="43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.647248 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kt7t"]
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.655574 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6kt7t"]
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.677257 4756 scope.go:117] "RemoveContainer" containerID="8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.696755 4756 scope.go:117] "RemoveContainer" containerID="7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275"
Dec 05 16:19:29 crc kubenswrapper[4756]: E1205 16:19:29.697240 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275\": container with ID starting with 7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275 not found: ID does not exist" containerID="7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.697293 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275"} err="failed to get container status \"7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275\": rpc error: code = NotFound desc = could not find container \"7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275\": container with ID starting with 7355480d9eaa6cc2511c66e81e5f3919469dfb47d9794dedb747b2031577b275 not found: ID does not exist"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.697329 4756 scope.go:117] "RemoveContainer" containerID="43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b"
Dec 05 16:19:29 crc kubenswrapper[4756]: E1205 16:19:29.697985 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b\": container with ID starting with 43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b not found: ID does not exist" containerID="43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.698014 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b"} err="failed to get container status \"43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b\": rpc error: code = NotFound desc = could not find container \"43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b\": container with ID starting with 43a0a8c4496018731bab6aa537fd70dfe39d11ff0bbe5cca1b38a8780ee73d5b not found: ID does not exist"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.698032 4756 scope.go:117] "RemoveContainer" containerID="8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c"
Dec 05 16:19:29 crc kubenswrapper[4756]: E1205 16:19:29.698493 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c\": container with ID starting with 8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c not found: ID does not exist" containerID="8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c"
Dec 05 16:19:29 crc kubenswrapper[4756]: I1205 16:19:29.698534 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c"} err="failed to get container status \"8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c\": rpc error: code = NotFound desc = could not find container \"8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c\": container with ID starting with 8f805d8387b2644ad0214ffc36db9e06ad5d9a327f52bb99829b9fcfa8a5082c not found: ID does not exist"
Dec 05 16:19:30 crc kubenswrapper[4756]: I1205 16:19:30.062534 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" path="/var/lib/kubelet/pods/a5bc48be-ea95-47a5-b804-ac4dc8df958b/volumes"
Dec 05 16:19:32 crc kubenswrapper[4756]: I1205 16:19:32.225489 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w6h66"
Dec 05 16:19:32 crc kubenswrapper[4756]: I1205 16:19:32.726666 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qndg5"
Dec 05 16:19:32 crc kubenswrapper[4756]: I1205 16:19:32.726749 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qndg5"
Dec 05 16:19:32 crc kubenswrapper[4756]: I1205 16:19:32.771152 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qndg5"
Dec 05 16:19:33 crc kubenswrapper[4756]: I1205 16:19:33.669078 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qndg5"
Dec 05 16:19:35 crc kubenswrapper[4756]: I1205 16:19:35.706249 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qndg5"]
Dec 05 16:19:35 crc kubenswrapper[4756]: I1205 16:19:35.707458 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qndg5" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerName="registry-server" containerID="cri-o://80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21" gracePeriod=2
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.061324 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8whp"]
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.111653 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qndg5"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.310021 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9n9k\" (UniqueName: \"kubernetes.io/projected/9805087f-2ab0-4780-9ee1-37f58467e8a2-kube-api-access-g9n9k\") pod \"9805087f-2ab0-4780-9ee1-37f58467e8a2\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") "
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.310108 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-catalog-content\") pod \"9805087f-2ab0-4780-9ee1-37f58467e8a2\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") "
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.310162 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-utilities\") pod \"9805087f-2ab0-4780-9ee1-37f58467e8a2\" (UID: \"9805087f-2ab0-4780-9ee1-37f58467e8a2\") "
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.311102 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-utilities" (OuterVolumeSpecName: "utilities") pod "9805087f-2ab0-4780-9ee1-37f58467e8a2" (UID: "9805087f-2ab0-4780-9ee1-37f58467e8a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.317655 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9805087f-2ab0-4780-9ee1-37f58467e8a2-kube-api-access-g9n9k" (OuterVolumeSpecName: "kube-api-access-g9n9k") pod "9805087f-2ab0-4780-9ee1-37f58467e8a2" (UID: "9805087f-2ab0-4780-9ee1-37f58467e8a2"). InnerVolumeSpecName "kube-api-access-g9n9k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.359494 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9805087f-2ab0-4780-9ee1-37f58467e8a2" (UID: "9805087f-2ab0-4780-9ee1-37f58467e8a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.411344 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.411383 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9n9k\" (UniqueName: \"kubernetes.io/projected/9805087f-2ab0-4780-9ee1-37f58467e8a2-kube-api-access-g9n9k\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.411392 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9805087f-2ab0-4780-9ee1-37f58467e8a2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.645535 4756 generic.go:334] "Generic (PLEG): container finished" podID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerID="80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21" exitCode=0
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.645584 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qndg5" event={"ID":"9805087f-2ab0-4780-9ee1-37f58467e8a2","Type":"ContainerDied","Data":"80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21"}
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.645641 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qndg5" event={"ID":"9805087f-2ab0-4780-9ee1-37f58467e8a2","Type":"ContainerDied","Data":"2a0146cd0cb0ac6f1494bcd945ae8bcb8358037797ecf7b89bac7e868701e666"}
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.645665 4756 scope.go:117] "RemoveContainer" containerID="80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.645689 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qndg5"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.668184 4756 scope.go:117] "RemoveContainer" containerID="fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.688888 4756 scope.go:117] "RemoveContainer" containerID="fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.705653 4756 scope.go:117] "RemoveContainer" containerID="80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21"
Dec 05 16:19:36 crc kubenswrapper[4756]: E1205 16:19:36.706130 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21\": container with ID starting with 80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21 not found: ID does not exist" containerID="80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.706196 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21"} err="failed to get container status \"80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21\": rpc error: code = NotFound desc = could not find container \"80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21\": container with ID starting with 80dd55c811b6a59fb4de2a4b1967790dc8f7268d46cb554a8da4171cc0d9db21 not found: ID does not exist"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.706222 4756 scope.go:117] "RemoveContainer" containerID="fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a"
Dec 05 16:19:36 crc kubenswrapper[4756]: E1205 16:19:36.706569 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a\": container with ID starting with fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a not found: ID does not exist" containerID="fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.706593 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a"} err="failed to get container status \"fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a\": rpc error: code = NotFound desc = could not find container \"fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a\": container with ID starting with fd67e4193b526e7fc0c3e6e429f22f967205c2319d371ce91efa1e59b4f69a9a not found: ID does not exist"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.706608 4756 scope.go:117] "RemoveContainer" containerID="fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda"
Dec 05 16:19:36 crc kubenswrapper[4756]: E1205 16:19:36.706879 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda\": container with ID starting with fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda not found: ID does not exist" containerID="fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.706908 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda"} err="failed to get container status \"fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda\": rpc error: code = NotFound desc = could not find container \"fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda\": container with ID starting with fe28ad0693bc96c2b04addecf03334b83d1c77ef0969145f610608351396aeda not found: ID does not exist"
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.713184 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qndg5"]
Dec 05 16:19:36 crc kubenswrapper[4756]: I1205 16:19:36.715306 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qndg5"]
Dec 05 16:19:38 crc kubenswrapper[4756]: I1205 16:19:38.062307 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" path="/var/lib/kubelet/pods/9805087f-2ab0-4780-9ee1-37f58467e8a2/volumes"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.873689 4756 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.874091 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e" gracePeriod=15
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.874264 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723" gracePeriod=15
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.874315 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907" gracePeriod=15
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.874360 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e" gracePeriod=15
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.874408 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29" gracePeriod=15
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878140 4756 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878488 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878510 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878523 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878532 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878545 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878552 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878560 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878566 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878575 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerName="extract-content"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878580 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerName="extract-content"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878589 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878596 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878604 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerName="extract-content"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878611 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerName="extract-content"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878622 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerName="extract-utilities"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878632 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerName="extract-utilities"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878642 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878648 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878657 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerName="extract-content"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878666 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerName="extract-content"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878678 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878686 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878698 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerName="extract-utilities"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878704 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerName="extract-utilities"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878713 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerName="extract-content"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878720 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerName="extract-content"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878730 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878736 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878745 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerName="extract-utilities"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878751 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerName="extract-utilities"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878760 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85782b86-3bb4-4ae4-a706-654876b73aec" containerName="pruner"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878766 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="85782b86-3bb4-4ae4-a706-654876b73aec" containerName="pruner"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878776 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878782 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878793 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerName="extract-utilities"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878800 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerName="extract-utilities"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.878809 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878815 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878916 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878929 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bc48be-ea95-47a5-b804-ac4dc8df958b" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878940 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd889233-fc28-41f6-806f-3f7ab7670bca" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878948 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878959 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9805087f-2ab0-4780-9ee1-37f58467e8a2" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878969 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff82294-1c1b-4fdb-8644-d3629d847a36" containerName="registry-server"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878978 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878986 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.878993 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="85782b86-3bb4-4ae4-a706-654876b73aec" containerName="pruner"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.879000 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.879007 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 05 16:19:39 crc kubenswrapper[4756]: E1205 16:19:39.879147 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.879155 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.881703 4756 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.882979 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.887118 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.969105 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.969169 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.969199 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.969230 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.969262 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.969283 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.969311 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:39 crc kubenswrapper[4756]: I1205 16:19:39.969333 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.070829 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071274 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071309 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071352 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071405 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071448 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071495 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071542 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071600 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071639 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.071663 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.072039 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.072134 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.072199 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.072494 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.669574 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.671238 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.671988 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723" exitCode=0
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.672101 4756 scope.go:117] "RemoveContainer" containerID="fe857c029ec52298ee76b0636de51b426dba35ac7a6fd265cc85fa0c476a9961"
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.672163 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907" exitCode=0
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.672392 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e" exitCode=0
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.672566 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29" exitCode=2
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.674382 4756 generic.go:334] "Generic (PLEG): container finished" podID="0bfa691e-5839-4735-9378-d06063656d75" containerID="c7f612dc8b7202136ee842eb5def2542d67b330ba2a6a74195cd6c4730e2aafc" exitCode=0
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.674440 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0bfa691e-5839-4735-9378-d06063656d75","Type":"ContainerDied","Data":"c7f612dc8b7202136ee842eb5def2542d67b330ba2a6a74195cd6c4730e2aafc"}
Dec 05 16:19:40 crc kubenswrapper[4756]: I1205 16:19:40.675391 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.681557 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.946371 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.947496 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.998850 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-kubelet-dir\") pod \"0bfa691e-5839-4735-9378-d06063656d75\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") "
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.998907 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0bfa691e-5839-4735-9378-d06063656d75" (UID: "0bfa691e-5839-4735-9378-d06063656d75"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.998986 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-var-lock\") pod \"0bfa691e-5839-4735-9378-d06063656d75\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") "
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.999056 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bfa691e-5839-4735-9378-d06063656d75-kube-api-access\") pod \"0bfa691e-5839-4735-9378-d06063656d75\" (UID: \"0bfa691e-5839-4735-9378-d06063656d75\") "
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.999342 4756 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:41 crc kubenswrapper[4756]: I1205 16:19:41.999807 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-var-lock" (OuterVolumeSpecName: "var-lock") pod "0bfa691e-5839-4735-9378-d06063656d75" (UID: "0bfa691e-5839-4735-9378-d06063656d75"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.008244 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bfa691e-5839-4735-9378-d06063656d75-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0bfa691e-5839-4735-9378-d06063656d75" (UID: "0bfa691e-5839-4735-9378-d06063656d75"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.104453 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bfa691e-5839-4735-9378-d06063656d75-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.104875 4756 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bfa691e-5839-4735-9378-d06063656d75-var-lock\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:42 crc kubenswrapper[4756]: E1205 16:19:42.241783 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.284545 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.285604 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.286321 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.286748 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.310639 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.310688 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.310733 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.310796 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.310823 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.310911 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.311225 4756 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.311248 4756 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.311261 4756 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.694071 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.695337 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e" exitCode=0
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.695477 4756 scope.go:117] "RemoveContainer" containerID="91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.695674 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.700307 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0bfa691e-5839-4735-9378-d06063656d75","Type":"ContainerDied","Data":"4ba278cff331963bffe360a8f071c63a514475ade7e5e2fb64b99231f91310ef"}
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.700371 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ba278cff331963bffe360a8f071c63a514475ade7e5e2fb64b99231f91310ef"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.700516 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.703739 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.704110 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.713089 4756 scope.go:117] "RemoveContainer" containerID="a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.714357 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.714871 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.730705 4756 scope.go:117] "RemoveContainer" containerID="a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.744280 4756 scope.go:117] "RemoveContainer" containerID="0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.760940 4756 scope.go:117] "RemoveContainer" containerID="82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.779796 4756 scope.go:117] "RemoveContainer" containerID="6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.806894 4756 scope.go:117] "RemoveContainer" containerID="91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723"
Dec 05 16:19:42 crc kubenswrapper[4756]: E1205 16:19:42.807506 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\": container with ID starting with 91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723 not found: ID does not exist" containerID="91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723"
Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.807551 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723"} err="failed to get container status \"91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\": rpc error: code = NotFound desc = could not find container
\"91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723\": container with ID starting with 91741685922fbd37f5022e0f4e1c1b6a4829c4f505ce142e0521168f3bf1d723 not found: ID does not exist" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.807582 4756 scope.go:117] "RemoveContainer" containerID="a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907" Dec 05 16:19:42 crc kubenswrapper[4756]: E1205 16:19:42.808478 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\": container with ID starting with a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907 not found: ID does not exist" containerID="a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.808603 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907"} err="failed to get container status \"a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\": rpc error: code = NotFound desc = could not find container \"a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907\": container with ID starting with a1f05c3cade5e413dfa6bde9ebce8ec640631f1a7941f797e9bbc72c566f7907 not found: ID does not exist" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.808702 4756 scope.go:117] "RemoveContainer" containerID="a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e" Dec 05 16:19:42 crc kubenswrapper[4756]: E1205 16:19:42.809196 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\": container with ID starting with a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e not found: ID does not exist" containerID="a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.809235 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e"} err="failed to get container status \"a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\": rpc error: code = NotFound desc = could not find container \"a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e\": container with ID starting with a321b960b80f949c10ade7088523c5fd508c2cc5f4cf832d54069cb1f6ab563e not found: ID does not exist" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.809253 4756 scope.go:117] "RemoveContainer" containerID="0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29" Dec 05 16:19:42 crc kubenswrapper[4756]: E1205 16:19:42.809882 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\": container with ID starting with 0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29 not found: ID does not exist" containerID="0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.809967 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29"} 
err="failed to get container status \"0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\": rpc error: code = NotFound desc = could not find container \"0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29\": container with ID starting with 0a8f7d9b6a9ea2d6fe676420ea7c45513a15906fa421c738f086293de3535b29 not found: ID does not exist" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.810034 4756 scope.go:117] "RemoveContainer" containerID="82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e" Dec 05 16:19:42 crc kubenswrapper[4756]: E1205 16:19:42.810437 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\": container with ID starting with 82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e not found: ID does not exist" containerID="82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.810526 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e"} err="failed to get container status \"82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\": rpc error: code = NotFound desc = could not find container \"82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e\": container with ID starting with 82758a8000534f7a12f21812d86a26c92fd6c4b669bd8c213075338da64b121e not found: ID does not exist" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.810603 4756 scope.go:117] "RemoveContainer" containerID="6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613" Dec 05 16:19:42 crc kubenswrapper[4756]: E1205 16:19:42.810991 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\": container with ID starting with 6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613 not found: ID does not exist" containerID="6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613" Dec 05 16:19:42 crc kubenswrapper[4756]: I1205 16:19:42.811096 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613"} err="failed to get container status \"6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\": rpc error: code = NotFound desc = could not find container \"6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613\": container with ID starting with 6c3fef479e189ed0f72f18cd74924ef39b9ad7ef43fc072cbfd2806ab5ac6613 not found: ID does not exist" Dec 05 16:19:44 crc kubenswrapper[4756]: I1205 16:19:44.057185 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 16:19:44 crc kubenswrapper[4756]: E1205 16:19:44.955709 4756 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 16:19:44 crc kubenswrapper[4756]: I1205 16:19:44.956348 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 16:19:44 crc kubenswrapper[4756]: E1205 16:19:44.985850 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e5e1d8cd0ce85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 16:19:44.985370245 +0000 UTC m=+245.224043757,LastTimestamp:2025-12-05 16:19:44.985370245 +0000 UTC m=+245.224043757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 16:19:45 crc kubenswrapper[4756]: E1205 16:19:45.581254 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:45 crc kubenswrapper[4756]: E1205 16:19:45.584163 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:45 crc kubenswrapper[4756]: E1205 16:19:45.585053 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:45 crc kubenswrapper[4756]: E1205 16:19:45.585506 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:45 crc kubenswrapper[4756]: E1205 16:19:45.586000 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:45 crc kubenswrapper[4756]: I1205 16:19:45.586297 4756 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 16:19:45 crc kubenswrapper[4756]: E1205 16:19:45.587008 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="200ms" Dec 05 16:19:45 crc kubenswrapper[4756]: I1205 16:19:45.724325 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924"} Dec 05 16:19:45 crc kubenswrapper[4756]: I1205 16:19:45.724373 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"37d21e15a6790a14e2dfc0e488a284dd1140599113276be9bc21dd3e126e2e0e"} Dec 05 16:19:45 crc kubenswrapper[4756]: I1205 16:19:45.724977 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:45 crc kubenswrapper[4756]: E1205 16:19:45.725011 4756 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 16:19:45 crc kubenswrapper[4756]: E1205 16:19:45.787476 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="400ms" Dec 05 16:19:46 crc kubenswrapper[4756]: E1205 16:19:46.069195 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e5e1d8cd0ce85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 16:19:44.985370245 +0000 UTC m=+245.224043757,LastTimestamp:2025-12-05 16:19:44.985370245 +0000 UTC m=+245.224043757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 16:19:46 crc kubenswrapper[4756]: E1205 16:19:46.188024 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="800ms" Dec 05 16:19:46 crc kubenswrapper[4756]: E1205 16:19:46.989298 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="1.6s" Dec 05 16:19:48 crc kubenswrapper[4756]: E1205 16:19:48.591087 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="3.2s" Dec 05 16:19:50 crc kubenswrapper[4756]: I1205 16:19:50.053519 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:51 crc kubenswrapper[4756]: E1205 16:19:51.792354 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="6.4s" Dec 05 16:19:52 crc kubenswrapper[4756]: E1205 16:19:52.122455 4756 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" volumeName="registry-storage" Dec 05 16:19:52 crc kubenswrapper[4756]: I1205 16:19:52.767368 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 16:19:52 crc kubenswrapper[4756]: I1205 16:19:52.767477 4756 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714" exitCode=1 Dec 05 16:19:52 crc kubenswrapper[4756]: I1205 16:19:52.767541 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714"} Dec 05 16:19:52 crc kubenswrapper[4756]: I1205 16:19:52.768149 4756 scope.go:117] "RemoveContainer" containerID="b908bf9cb8c6d343ac133c0784c3eabab7f22062d7adb5c4093c14857d34f714" Dec 05 16:19:52 crc kubenswrapper[4756]: I1205 16:19:52.768526 4756 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:52 crc kubenswrapper[4756]: I1205 16:19:52.769020 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:53 crc kubenswrapper[4756]: I1205 16:19:53.033818 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:19:53 crc kubenswrapper[4756]: I1205 16:19:53.243915 4756 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:19:53 crc kubenswrapper[4756]: I1205 16:19:53.779981 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 16:19:53 crc kubenswrapper[4756]: I1205 16:19:53.780058 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"293ca685f936d2691fa1332ecae1bd903614e6f19177d91e1e9a866764ef3ec3"} Dec 05 16:19:53 crc kubenswrapper[4756]: I1205 16:19:53.781068 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:53 crc kubenswrapper[4756]: I1205 16:19:53.781618 4756 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:54 crc kubenswrapper[4756]: I1205 16:19:54.049451 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:19:54 crc kubenswrapper[4756]: I1205 16:19:54.050391 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:54 crc kubenswrapper[4756]: I1205 16:19:54.050824 4756 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:54 crc kubenswrapper[4756]: I1205 16:19:54.073813 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:19:54 crc kubenswrapper[4756]: I1205 16:19:54.073857 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:19:54 crc kubenswrapper[4756]: E1205 16:19:54.074274 4756 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:19:54 crc kubenswrapper[4756]: I1205 16:19:54.074777 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:19:54 crc kubenswrapper[4756]: I1205 16:19:54.788509 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"98c1c1fbfe045eb5d441025fd950dd3838e7f3b526be817ff9d4142446452a7c"} Dec 05 16:19:55 crc kubenswrapper[4756]: I1205 16:19:55.797944 4756 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="5a6f7d0fb265711590e8a9c5b2f96fb11ab3b55a4f9490f28840cdc550ea4d10" exitCode=0 Dec 05 16:19:55 crc kubenswrapper[4756]: I1205 16:19:55.798000 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"5a6f7d0fb265711590e8a9c5b2f96fb11ab3b55a4f9490f28840cdc550ea4d10"} Dec 05 16:19:55 crc kubenswrapper[4756]: I1205 16:19:55.798358 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:19:55 crc kubenswrapper[4756]: I1205 16:19:55.798393 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:19:55 crc kubenswrapper[4756]: E1205 16:19:55.799155 4756 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:19:55 crc kubenswrapper[4756]: I1205 16:19:55.799166 4756 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:55 crc kubenswrapper[4756]: I1205 16:19:55.799847 4756 status_manager.go:851] "Failed to get status for pod" podUID="0bfa691e-5839-4735-9378-d06063656d75" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Dec 05 16:19:56 crc kubenswrapper[4756]: E1205 16:19:56.071295 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e5e1d8cd0ce85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 16:19:44.985370245 +0000 UTC m=+245.224043757,LastTimestamp:2025-12-05 
16:19:44.985370245 +0000 UTC m=+245.224043757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 16:19:56 crc kubenswrapper[4756]: I1205 16:19:56.820864 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8f5ac09fb3dc7aab31c9ad58f92c697e7a6d33007f4f8794b30258ccbd2d253e"} Dec 05 16:19:56 crc kubenswrapper[4756]: I1205 16:19:56.821191 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6a0614a331d81f8193f6d7d5b1e1b7a2d58aee404b3d5a2486049ba9926e5341"} Dec 05 16:19:56 crc kubenswrapper[4756]: I1205 16:19:56.821202 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f359dbc28422ddee059cf64731d99c17ac28397f059d0376ece380204ac8207"} Dec 05 16:19:56 crc kubenswrapper[4756]: I1205 16:19:56.821210 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f1d06b3d7b99cd8e0dce1496d23ddb7bcfa759a6a553726518d6d13b39ccae82"} Dec 05 16:19:57 crc kubenswrapper[4756]: I1205 16:19:57.828751 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"28b8fba301845b7bb910d22063e39e797ce4acf905b739dec89832580065951c"} Dec 05 16:19:57 crc kubenswrapper[4756]: I1205 16:19:57.828976 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:19:57 crc kubenswrapper[4756]: I1205 16:19:57.829805 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:19:57 crc kubenswrapper[4756]: I1205 16:19:57.829897 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:19:59 crc kubenswrapper[4756]: I1205 16:19:59.075696 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:19:59 crc kubenswrapper[4756]: I1205 16:19:59.076890 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:19:59 crc kubenswrapper[4756]: I1205 16:19:59.083972 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:19:59 crc kubenswrapper[4756]: I1205 16:19:59.116944 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:19:59 crc kubenswrapper[4756]: I1205 16:19:59.122890 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:19:59 crc kubenswrapper[4756]: I1205 16:19:59.839490 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:20:01 
crc kubenswrapper[4756]: I1205 16:20:01.091634 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" podUID="597869bc-03e8-40fd-8f80-c156102a3dc7" containerName="oauth-openshift" containerID="cri-o://fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b" gracePeriod=15 Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.471781 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588451 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-service-ca\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588533 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-session\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588577 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-login\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588634 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-dir\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588655 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-serving-cert\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588684 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-idp-0-file-data\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588727 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-provider-selection\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588765 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvfqz\" (UniqueName: \"kubernetes.io/projected/597869bc-03e8-40fd-8f80-c156102a3dc7-kube-api-access-lvfqz\") pod 
\"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588787 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-policies\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588809 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-ocp-branding-template\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588835 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-router-certs\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588860 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-cliconfig\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588887 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-error\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.588915 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-trusted-ca-bundle\") pod \"597869bc-03e8-40fd-8f80-c156102a3dc7\" (UID: \"597869bc-03e8-40fd-8f80-c156102a3dc7\") " Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.589839 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.590678 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.590698 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.590768 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.590863 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.595851 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.596157 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.596536 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.597502 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.602971 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.603284 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.604434 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/597869bc-03e8-40fd-8f80-c156102a3dc7-kube-api-access-lvfqz" (OuterVolumeSpecName: "kube-api-access-lvfqz") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "kube-api-access-lvfqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.604894 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.606043 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "597869bc-03e8-40fd-8f80-c156102a3dc7" (UID: "597869bc-03e8-40fd-8f80-c156102a3dc7"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.689867 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690160 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690255 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690330 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690446 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690526 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690617 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690693 4756 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690770 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690848 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.690919 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.691003 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvfqz\" (UniqueName: 
\"kubernetes.io/projected/597869bc-03e8-40fd-8f80-c156102a3dc7-kube-api-access-lvfqz\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.691074 4756 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/597869bc-03e8-40fd-8f80-c156102a3dc7-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.691151 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/597869bc-03e8-40fd-8f80-c156102a3dc7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.850104 4756 generic.go:334] "Generic (PLEG): container finished" podID="597869bc-03e8-40fd-8f80-c156102a3dc7" containerID="fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b" exitCode=0 Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.850166 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" event={"ID":"597869bc-03e8-40fd-8f80-c156102a3dc7","Type":"ContainerDied","Data":"fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b"} Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.850497 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" event={"ID":"597869bc-03e8-40fd-8f80-c156102a3dc7","Type":"ContainerDied","Data":"c12581e9f364d91554747754e6e4cc43c733d6fee96db8213cde236f4504f146"} Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.850241 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k8whp" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.850579 4756 scope.go:117] "RemoveContainer" containerID="fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.880714 4756 scope.go:117] "RemoveContainer" containerID="fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b" Dec 05 16:20:01 crc kubenswrapper[4756]: E1205 16:20:01.881291 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b\": container with ID starting with fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b not found: ID does not exist" containerID="fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b" Dec 05 16:20:01 crc kubenswrapper[4756]: I1205 16:20:01.881334 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b"} err="failed to get container status \"fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b\": rpc error: code = NotFound desc = could not find container \"fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b\": container with ID starting with fdf9c1fd829f6d8af32b11f5b7204be65c58a53990cc74dbda9452d63fa8c78b not found: ID does not exist" Dec 05 16:20:02 crc kubenswrapper[4756]: I1205 16:20:02.856975 4756 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:20:02 crc kubenswrapper[4756]: I1205 16:20:02.954917 4756 status_manager.go:861] "Pod was deleted and then recreated, 
skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="979d17bb-13d7-49b6-8689-43538b8a4681" Dec 05 16:20:03 crc kubenswrapper[4756]: E1205 16:20:03.174312 4756 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Dec 05 16:20:03 crc kubenswrapper[4756]: I1205 16:20:03.246931 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 16:20:03 crc kubenswrapper[4756]: E1205 16:20:03.345944 4756 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Dec 05 16:20:03 crc kubenswrapper[4756]: E1205 16:20:03.841456 4756 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Dec 05 16:20:03 crc kubenswrapper[4756]: I1205 16:20:03.864022 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:20:03 crc kubenswrapper[4756]: I1205 16:20:03.864054 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:20:03 crc kubenswrapper[4756]: I1205 16:20:03.867677 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="979d17bb-13d7-49b6-8689-43538b8a4681" Dec 05 16:20:03 crc kubenswrapper[4756]: I1205 16:20:03.868184 4756 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://f1d06b3d7b99cd8e0dce1496d23ddb7bcfa759a6a553726518d6d13b39ccae82" Dec 05 16:20:03 crc kubenswrapper[4756]: I1205 16:20:03.868214 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:20:04 crc kubenswrapper[4756]: E1205 16:20:04.020025 4756 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Dec 05 16:20:04 crc kubenswrapper[4756]: I1205 16:20:04.868966 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:20:04 crc kubenswrapper[4756]: I1205 16:20:04.869023 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2fd37c06-6aa3-4dbc-a783-e2c3d31bb680" Dec 05 16:20:04 crc kubenswrapper[4756]: I1205 16:20:04.871970 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="979d17bb-13d7-49b6-8689-43538b8a4681" Dec 05 16:20:12 crc kubenswrapper[4756]: I1205 16:20:12.369937 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 16:20:12 crc kubenswrapper[4756]: I1205 16:20:12.698396 4756 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 16:20:13 crc kubenswrapper[4756]: I1205 16:20:13.000685 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 16:20:13 crc kubenswrapper[4756]: I1205 16:20:13.425668 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 16:20:13 crc kubenswrapper[4756]: I1205 16:20:13.492333 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 16:20:13 crc kubenswrapper[4756]: I1205 16:20:13.576988 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 16:20:13 crc kubenswrapper[4756]: I1205 16:20:13.577428 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 16:20:13 crc kubenswrapper[4756]: I1205 16:20:13.782889 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.075823 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.087043 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.187728 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.453597 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.454853 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.529379 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.644997 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.669932 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.771912 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.809281 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.846854 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 16:20:14 crc kubenswrapper[4756]: I1205 16:20:14.999534 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 16:20:15 crc kubenswrapper[4756]: 
I1205 16:20:15.224879 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.609535 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.611623 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.649167 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.725584 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.773192 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.791741 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.879870 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.940253 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.943672 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 16:20:15 crc kubenswrapper[4756]: I1205 16:20:15.963313 4756 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.012437 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.089477 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.121906 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.136016 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.243878 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.260021 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.275690 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.290394 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.368263 4756 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.419837 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.586236 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.672127 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.780758 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.784642 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.833986 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.843769 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 16:20:16 crc kubenswrapper[4756]: I1205 16:20:16.865245 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.016953 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.040642 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.062335 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.079636 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.236759 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.236877 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.313027 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.335496 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.500000 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.554743 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.822578 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.893222 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.904807 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.949091 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 16:20:17 crc kubenswrapper[4756]: I1205 16:20:17.949862 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.022076 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.073249 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.083171 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.206299 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.351580 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.381172 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.400548 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.607739 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.694782 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.887324 4756 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.892275 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-k8whp"] Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.892345 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.899467 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.915606 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.915586922 podStartE2EDuration="16.915586922s" podCreationTimestamp="2025-12-05 16:20:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:18.912538827 +0000 UTC m=+279.151212389" watchObservedRunningTime="2025-12-05 16:20:18.915586922 +0000 UTC m=+279.154260444" Dec 05 16:20:18 crc kubenswrapper[4756]: I1205 16:20:18.977750 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.013881 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.040589 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.058947 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.059585 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.137515 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.160505 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.314076 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.347179 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.348721 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.384627 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.483402 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.610283 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.717008 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.759573 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.792475 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.828322 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 16:20:19 crc kubenswrapper[4756]: I1205 16:20:19.965229 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 
16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.028252 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.056911 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="597869bc-03e8-40fd-8f80-c156102a3dc7" path="/var/lib/kubelet/pods/597869bc-03e8-40fd-8f80-c156102a3dc7/volumes" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.081014 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.130273 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.181878 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.201708 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.230654 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.324744 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.362876 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.382875 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.386180 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.419067 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.449264 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.466032 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.524399 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.559845 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.673148 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.760707 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 16:20:20 crc kubenswrapper[4756]: I1205 16:20:20.800701 4756 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.089497 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.188057 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.245589 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.266985 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.316896 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.317918 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.407625 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.437704 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.594253 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.728053 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.766354 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.806116 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.840979 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.907881 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.987608 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 16:20:21 crc kubenswrapper[4756]: I1205 16:20:21.999367 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.105747 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.186200 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 16:20:22 crc 
kubenswrapper[4756]: I1205 16:20:22.270630 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.376860 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.432535 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.484395 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.501480 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.624481 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.684116 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.886672 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 16:20:22 crc kubenswrapper[4756]: I1205 16:20:22.892947 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.063041 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.132948 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.220903 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.319855 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.379630 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.414081 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.449374 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.469625 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.510756 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.514587 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 
16:20:23.541315 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.562315 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.604041 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.607719 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.650813 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.676175 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.698269 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.784613 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.854609 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.860592 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.946322 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.950119 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.969661 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 16:20:23 crc kubenswrapper[4756]: I1205 16:20:23.976743 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.065165 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.100363 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.119007 4756 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.189302 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.235358 4756 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.259579 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.323359 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.354512 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.383754 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.394485 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.424809 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.459449 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.523452 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.601243 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.673191 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.970563 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.984054 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 16:20:24 crc kubenswrapper[4756]: I1205 16:20:24.996098 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.045167 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.072106 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.077286 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.093601 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.184116 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.375115 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.429993 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.463866 4756 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.464114 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924" gracePeriod=5 Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.503845 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.604576 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.615691 4756 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.681902 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.721376 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 16:20:25 crc kubenswrapper[4756]: I1205 16:20:25.827677 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.086937 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.124327 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.197605 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.215408 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.247886 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.366738 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.399237 4756 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.402769 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.574449 4756 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.585108 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.628269 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.636893 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.718069 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.729884 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.746245 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.870512 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.950828 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.972368 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 16:20:26 crc kubenswrapper[4756]: I1205 16:20:26.996329 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.016859 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.098498 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.159163 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.174902 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.196692 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.257692 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.380259 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.450217 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-wg8nc"] Dec 05 16:20:27 crc kubenswrapper[4756]: E1205 16:20:27.450459 4756 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="597869bc-03e8-40fd-8f80-c156102a3dc7" containerName="oauth-openshift" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.450471 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="597869bc-03e8-40fd-8f80-c156102a3dc7" containerName="oauth-openshift" Dec 05 16:20:27 crc kubenswrapper[4756]: E1205 16:20:27.450482 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.450487 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 16:20:27 crc kubenswrapper[4756]: E1205 16:20:27.450500 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bfa691e-5839-4735-9378-d06063656d75" containerName="installer" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.450506 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bfa691e-5839-4735-9378-d06063656d75" containerName="installer" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.450603 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bfa691e-5839-4735-9378-d06063656d75" containerName="installer" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.450614 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.450621 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="597869bc-03e8-40fd-8f80-c156102a3dc7" containerName="oauth-openshift" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.450966 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.453497 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.453557 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.453852 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.454169 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.454802 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.454935 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.455369 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.455391 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.455534 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.456333 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.457600 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.458533 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.464721 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.468687 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.469684 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-wg8nc"] Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.472780 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.620887 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " 
pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.620931 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.620950 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fm5f\" (UniqueName: \"kubernetes.io/projected/1090d00d-42e7-4952-8109-d06ab2e0896e-kube-api-access-2fm5f\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621155 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621257 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621319 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621392 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621437 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621470 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621564 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-audit-policies\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621579 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621600 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.621625 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1090d00d-42e7-4952-8109-d06ab2e0896e-audit-dir\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.648158 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.681785 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722187 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722231 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fm5f\" (UniqueName: 
\"kubernetes.io/projected/1090d00d-42e7-4952-8109-d06ab2e0896e-kube-api-access-2fm5f\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722257 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722292 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722324 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722345 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722367 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722384 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722439 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722459 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-audit-policies\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722477 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722498 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722514 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1090d00d-42e7-4952-8109-d06ab2e0896e-audit-dir\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.722533 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.723232 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.723647 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.725394 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.725697 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1090d00d-42e7-4952-8109-d06ab2e0896e-audit-dir\") pod 
\"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.726487 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1090d00d-42e7-4952-8109-d06ab2e0896e-audit-policies\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.730352 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.730562 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.730936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.731262 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.731518 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.733306 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.734375 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: 
\"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.760127 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1090d00d-42e7-4952-8109-d06ab2e0896e-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.763605 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.768877 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fm5f\" (UniqueName: \"kubernetes.io/projected/1090d00d-42e7-4952-8109-d06ab2e0896e-kube-api-access-2fm5f\") pod \"oauth-openshift-7b964c775c-wg8nc\" (UID: \"1090d00d-42e7-4952-8109-d06ab2e0896e\") " pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.772758 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.776227 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.779870 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 16:20:27 crc kubenswrapper[4756]: I1205 16:20:27.924306 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 16:20:28 crc kubenswrapper[4756]: I1205 16:20:28.054605 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 16:20:28 crc kubenswrapper[4756]: I1205 16:20:28.221385 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 16:20:28 crc kubenswrapper[4756]: I1205 16:20:28.239296 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 16:20:28 crc kubenswrapper[4756]: I1205 16:20:28.244399 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-wg8nc"] Dec 05 16:20:28 crc kubenswrapper[4756]: I1205 16:20:28.295199 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 16:20:28 crc kubenswrapper[4756]: I1205 16:20:28.751869 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 16:20:29 crc kubenswrapper[4756]: I1205 16:20:29.034766 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" event={"ID":"1090d00d-42e7-4952-8109-d06ab2e0896e","Type":"ContainerStarted","Data":"a2e574970ca979bcefb808233d8051a924f425680c02d2fefae51ddab6e41e04"} Dec 05 16:20:29 crc kubenswrapper[4756]: I1205 16:20:29.034822 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" 
event={"ID":"1090d00d-42e7-4952-8109-d06ab2e0896e","Type":"ContainerStarted","Data":"316fd685b1329058cde9276eacc5a549bf5a8c156fbfe8c7d334ba95ca8995d0"} Dec 05 16:20:29 crc kubenswrapper[4756]: I1205 16:20:29.035344 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:29 crc kubenswrapper[4756]: I1205 16:20:29.078797 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" podStartSLOduration=53.078773368 podStartE2EDuration="53.078773368s" podCreationTimestamp="2025-12-05 16:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:29.077724859 +0000 UTC m=+289.316398371" watchObservedRunningTime="2025-12-05 16:20:29.078773368 +0000 UTC m=+289.317446900" Dec 05 16:20:29 crc kubenswrapper[4756]: I1205 16:20:29.150897 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7b964c775c-wg8nc" Dec 05 16:20:29 crc kubenswrapper[4756]: I1205 16:20:29.522085 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 16:20:29 crc kubenswrapper[4756]: I1205 16:20:29.966740 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.041085 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.041188 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.056279 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.056343 4756 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924" exitCode=137 Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.056521 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.056518 4756 scope.go:117] "RemoveContainer" containerID="039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.079176 4756 scope.go:117] "RemoveContainer" containerID="039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924" Dec 05 16:20:31 crc kubenswrapper[4756]: E1205 16:20:31.080128 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924\": container with ID starting with 039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924 not found: ID does not exist" containerID="039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.080184 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924"} err="failed to get container status \"039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924\": rpc error: code = NotFound desc = could not find container \"039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924\": container with ID starting with 039eefdbd24d88a680a9f7c1245b96ca7216ead64ee0af4e5360c9825d17d924 not found: ID does not exist" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.166467 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.166833 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.166953 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.167065 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.167194 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.166583 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.166866 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.167225 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.167267 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.168094 4756 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.168198 4756 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.168286 4756 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.168380 4756 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.177298 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:20:31 crc kubenswrapper[4756]: I1205 16:20:31.269727 4756 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:32 crc kubenswrapper[4756]: I1205 16:20:32.055974 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 16:20:36 crc kubenswrapper[4756]: I1205 16:20:36.956224 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.789985 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6h66"] Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.790274 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w6h66" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerName="registry-server" containerID="cri-o://49599143ed507b4e0b78b3bc947fe60c84566e490ae2fc67d3a6a3360f4168fa" gracePeriod=30 Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.798517 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crlcr"] Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.798792 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-crlcr" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerName="registry-server" containerID="cri-o://a1b7f1d8ead78b7fdbfc35ba19576984330abbc34adaa14e957409f7cf995a55" gracePeriod=30 Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.819517 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dpqm9"] Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.820886 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" podUID="67cf5e8a-2d35-4db9-bcb6-68492f4be191" containerName="marketplace-operator" containerID="cri-o://116227a6e7006bd15fc3c4166b95468cf979ce33a4d10cc05d9d4f9d8f2e58d9" gracePeriod=30 Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.835487 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z92m7"] Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.836027 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z92m7" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" containerName="registry-server" containerID="cri-o://42a3a1556de1882d1dd77108e70f66eb804737ab143faecb1ee2e1981dd7a818" gracePeriod=30 Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.844702 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wm9rv"] Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.845008 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wm9rv" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerName="registry-server" containerID="cri-o://a594d1535fe50b0e0810f90de0d896de51a3a43797f8f58ff1bf4ff1a97f8d19" gracePeriod=30 Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.860551 4756 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpgmr"] Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.861790 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.864044 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpgmr"] Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.954031 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c445575-d157-46b0-a496-79e9dfc47c17-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.954087 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kmjq\" (UniqueName: \"kubernetes.io/projected/6c445575-d157-46b0-a496-79e9dfc47c17-kube-api-access-4kmjq\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:37 crc kubenswrapper[4756]: I1205 16:20:37.954136 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c445575-d157-46b0-a496-79e9dfc47c17-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.055793 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c445575-d157-46b0-a496-79e9dfc47c17-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.055909 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c445575-d157-46b0-a496-79e9dfc47c17-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.055940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kmjq\" (UniqueName: \"kubernetes.io/projected/6c445575-d157-46b0-a496-79e9dfc47c17-kube-api-access-4kmjq\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.058561 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c445575-d157-46b0-a496-79e9dfc47c17-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.064264 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6c445575-d157-46b0-a496-79e9dfc47c17-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.075296 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kmjq\" (UniqueName: \"kubernetes.io/projected/6c445575-d157-46b0-a496-79e9dfc47c17-kube-api-access-4kmjq\") pod \"marketplace-operator-79b997595-fpgmr\" (UID: \"6c445575-d157-46b0-a496-79e9dfc47c17\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.116729 4756 generic.go:334] "Generic (PLEG): container finished" podID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerID="49599143ed507b4e0b78b3bc947fe60c84566e490ae2fc67d3a6a3360f4168fa" exitCode=0 Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.116818 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6h66" event={"ID":"1bd03756-cb80-40f8-b5eb-c936c80c6c21","Type":"ContainerDied","Data":"49599143ed507b4e0b78b3bc947fe60c84566e490ae2fc67d3a6a3360f4168fa"} Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.118826 4756 generic.go:334] "Generic (PLEG): container finished" podID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerID="a594d1535fe50b0e0810f90de0d896de51a3a43797f8f58ff1bf4ff1a97f8d19" exitCode=0 Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.118936 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm9rv" event={"ID":"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d","Type":"ContainerDied","Data":"a594d1535fe50b0e0810f90de0d896de51a3a43797f8f58ff1bf4ff1a97f8d19"} Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.121119 4756 generic.go:334] "Generic (PLEG): container finished" podID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerID="a1b7f1d8ead78b7fdbfc35ba19576984330abbc34adaa14e957409f7cf995a55" exitCode=0 Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.121212 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crlcr" event={"ID":"95c208d2-1ef6-4716-8053-ccbabcd7fe59","Type":"ContainerDied","Data":"a1b7f1d8ead78b7fdbfc35ba19576984330abbc34adaa14e957409f7cf995a55"} Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.123021 4756 generic.go:334] "Generic (PLEG): container finished" podID="956dc850-ee41-4d30-b209-2f3270c376a5" containerID="42a3a1556de1882d1dd77108e70f66eb804737ab143faecb1ee2e1981dd7a818" exitCode=0 Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.123084 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z92m7" event={"ID":"956dc850-ee41-4d30-b209-2f3270c376a5","Type":"ContainerDied","Data":"42a3a1556de1882d1dd77108e70f66eb804737ab143faecb1ee2e1981dd7a818"} Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.126220 4756 generic.go:334] "Generic (PLEG): container finished" podID="67cf5e8a-2d35-4db9-bcb6-68492f4be191" containerID="116227a6e7006bd15fc3c4166b95468cf979ce33a4d10cc05d9d4f9d8f2e58d9" exitCode=0 Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.126258 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" event={"ID":"67cf5e8a-2d35-4db9-bcb6-68492f4be191","Type":"ContainerDied","Data":"116227a6e7006bd15fc3c4166b95468cf979ce33a4d10cc05d9d4f9d8f2e58d9"} Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.325918 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.330217 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.336940 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.344981 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.352699 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.389387 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wm9rv" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.461069 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-utilities\") pod \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462639 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-utilities\") pod \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462678 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5jfv\" (UniqueName: \"kubernetes.io/projected/956dc850-ee41-4d30-b209-2f3270c376a5-kube-api-access-z5jfv\") pod \"956dc850-ee41-4d30-b209-2f3270c376a5\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462730 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-catalog-content\") pod \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462770 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbg62\" (UniqueName: \"kubernetes.io/projected/67cf5e8a-2d35-4db9-bcb6-68492f4be191-kube-api-access-kbg62\") pod \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462796 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-trusted-ca\") pod 
\"67cf5e8a-2d35-4db9-bcb6-68492f4be191\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462831 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-catalog-content\") pod \"956dc850-ee41-4d30-b209-2f3270c376a5\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462860 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-759hf\" (UniqueName: \"kubernetes.io/projected/95c208d2-1ef6-4716-8053-ccbabcd7fe59-kube-api-access-759hf\") pod \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462891 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsk6s\" (UniqueName: \"kubernetes.io/projected/1bd03756-cb80-40f8-b5eb-c936c80c6c21-kube-api-access-jsk6s\") pod \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\" (UID: \"1bd03756-cb80-40f8-b5eb-c936c80c6c21\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462927 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-operator-metrics\") pod \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\" (UID: \"67cf5e8a-2d35-4db9-bcb6-68492f4be191\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462954 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-utilities\") pod \"956dc850-ee41-4d30-b209-2f3270c376a5\" (UID: \"956dc850-ee41-4d30-b209-2f3270c376a5\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.462986 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-utilities\") pod \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.463014 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-catalog-content\") pod \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\" (UID: \"95c208d2-1ef6-4716-8053-ccbabcd7fe59\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.463045 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-catalog-content\") pod \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.463071 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb8ph\" (UniqueName: \"kubernetes.io/projected/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-kube-api-access-mb8ph\") pod \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\" (UID: \"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d\") " Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.463544 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-utilities" 
(OuterVolumeSpecName: "utilities") pod "5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" (UID: "5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.464694 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-utilities" (OuterVolumeSpecName: "utilities") pod "95c208d2-1ef6-4716-8053-ccbabcd7fe59" (UID: "95c208d2-1ef6-4716-8053-ccbabcd7fe59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.464920 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "67cf5e8a-2d35-4db9-bcb6-68492f4be191" (UID: "67cf5e8a-2d35-4db9-bcb6-68492f4be191"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.465724 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-utilities" (OuterVolumeSpecName: "utilities") pod "956dc850-ee41-4d30-b209-2f3270c376a5" (UID: "956dc850-ee41-4d30-b209-2f3270c376a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.467911 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c208d2-1ef6-4716-8053-ccbabcd7fe59-kube-api-access-759hf" (OuterVolumeSpecName: "kube-api-access-759hf") pod "95c208d2-1ef6-4716-8053-ccbabcd7fe59" (UID: "95c208d2-1ef6-4716-8053-ccbabcd7fe59"). InnerVolumeSpecName "kube-api-access-759hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.467951 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-kube-api-access-mb8ph" (OuterVolumeSpecName: "kube-api-access-mb8ph") pod "5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" (UID: "5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d"). InnerVolumeSpecName "kube-api-access-mb8ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.468577 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956dc850-ee41-4d30-b209-2f3270c376a5-kube-api-access-z5jfv" (OuterVolumeSpecName: "kube-api-access-z5jfv") pod "956dc850-ee41-4d30-b209-2f3270c376a5" (UID: "956dc850-ee41-4d30-b209-2f3270c376a5"). InnerVolumeSpecName "kube-api-access-z5jfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.468650 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67cf5e8a-2d35-4db9-bcb6-68492f4be191-kube-api-access-kbg62" (OuterVolumeSpecName: "kube-api-access-kbg62") pod "67cf5e8a-2d35-4db9-bcb6-68492f4be191" (UID: "67cf5e8a-2d35-4db9-bcb6-68492f4be191"). InnerVolumeSpecName "kube-api-access-kbg62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.472356 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd03756-cb80-40f8-b5eb-c936c80c6c21-kube-api-access-jsk6s" (OuterVolumeSpecName: "kube-api-access-jsk6s") pod "1bd03756-cb80-40f8-b5eb-c936c80c6c21" (UID: "1bd03756-cb80-40f8-b5eb-c936c80c6c21"). InnerVolumeSpecName "kube-api-access-jsk6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.475525 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-utilities" (OuterVolumeSpecName: "utilities") pod "1bd03756-cb80-40f8-b5eb-c936c80c6c21" (UID: "1bd03756-cb80-40f8-b5eb-c936c80c6c21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.478097 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "67cf5e8a-2d35-4db9-bcb6-68492f4be191" (UID: "67cf5e8a-2d35-4db9-bcb6-68492f4be191"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.493460 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "956dc850-ee41-4d30-b209-2f3270c376a5" (UID: "956dc850-ee41-4d30-b209-2f3270c376a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.543199 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1bd03756-cb80-40f8-b5eb-c936c80c6c21" (UID: "1bd03756-cb80-40f8-b5eb-c936c80c6c21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.546480 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95c208d2-1ef6-4716-8053-ccbabcd7fe59" (UID: "95c208d2-1ef6-4716-8053-ccbabcd7fe59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.552131 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpgmr"] Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564149 4756 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564190 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564204 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564216 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95c208d2-1ef6-4716-8053-ccbabcd7fe59-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564227 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb8ph\" (UniqueName: \"kubernetes.io/projected/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-kube-api-access-mb8ph\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564238 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564249 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564260 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5jfv\" (UniqueName: \"kubernetes.io/projected/956dc850-ee41-4d30-b209-2f3270c376a5-kube-api-access-z5jfv\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564270 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bd03756-cb80-40f8-b5eb-c936c80c6c21-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564280 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbg62\" (UniqueName: \"kubernetes.io/projected/67cf5e8a-2d35-4db9-bcb6-68492f4be191-kube-api-access-kbg62\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564290 4756 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67cf5e8a-2d35-4db9-bcb6-68492f4be191-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564300 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956dc850-ee41-4d30-b209-2f3270c376a5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 
16:20:38.564313 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-759hf\" (UniqueName: \"kubernetes.io/projected/95c208d2-1ef6-4716-8053-ccbabcd7fe59-kube-api-access-759hf\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.564326 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsk6s\" (UniqueName: \"kubernetes.io/projected/1bd03756-cb80-40f8-b5eb-c936c80c6c21-kube-api-access-jsk6s\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.613297 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" (UID: "5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:20:38 crc kubenswrapper[4756]: I1205 16:20:38.665523 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.133763 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z92m7" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.133851 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z92m7" event={"ID":"956dc850-ee41-4d30-b209-2f3270c376a5","Type":"ContainerDied","Data":"77ec9095d21f26ea7d09940a6fbab2d80ec321c55c4b3ef7a84bc65250205c0c"} Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.133899 4756 scope.go:117] "RemoveContainer" containerID="42a3a1556de1882d1dd77108e70f66eb804737ab143faecb1ee2e1981dd7a818" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.135015 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.135094 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dpqm9" event={"ID":"67cf5e8a-2d35-4db9-bcb6-68492f4be191","Type":"ContainerDied","Data":"8636ff4b753c0c885e1207bb62b72b4943c6ff51c0226bb1528f33d709ae103e"} Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.152523 4756 scope.go:117] "RemoveContainer" containerID="1bc8b34620d87755dbb185323ddae8813a6ba4c9db612bcd619cb68560326592" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.153998 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6h66" event={"ID":"1bd03756-cb80-40f8-b5eb-c936c80c6c21","Type":"ContainerDied","Data":"379222f3448f5fe46508df43ab2a01561de1cea82e64d2d775cef160b62e6aee"} Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.154014 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6h66" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.156141 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" event={"ID":"6c445575-d157-46b0-a496-79e9dfc47c17","Type":"ContainerStarted","Data":"1083d51299a5a643943ec3a8410628bcf05e8b52432646367845233b3cf93199"} Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.156193 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" event={"ID":"6c445575-d157-46b0-a496-79e9dfc47c17","Type":"ContainerStarted","Data":"39c88a46ea039c009d383356f5fe043e172e2461fd941d51474004f7b4b06c08"} Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.157599 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.161577 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.166154 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm9rv" event={"ID":"5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d","Type":"ContainerDied","Data":"b63f6a72ffca755b88b8d96fecf862eecbac6caba33efa7a0286a6c1bb7ba978"} Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.166212 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wm9rv" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.172944 4756 scope.go:117] "RemoveContainer" containerID="af26c0120b339ed597346fa0e5095b3ea3d87f7ac2614bee427d4e23f38193b6" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.174768 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crlcr" event={"ID":"95c208d2-1ef6-4716-8053-ccbabcd7fe59","Type":"ContainerDied","Data":"95648b16a7df680eb44faab267e704e555be37ede7b5d64806065ac42c6812ce"} Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.174859 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-crlcr" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.197776 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fpgmr" podStartSLOduration=2.197756546 podStartE2EDuration="2.197756546s" podCreationTimestamp="2025-12-05 16:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:39.180355043 +0000 UTC m=+299.419028575" watchObservedRunningTime="2025-12-05 16:20:39.197756546 +0000 UTC m=+299.436430058" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.200552 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z92m7"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.202188 4756 scope.go:117] "RemoveContainer" containerID="116227a6e7006bd15fc3c4166b95468cf979ce33a4d10cc05d9d4f9d8f2e58d9" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.204849 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z92m7"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.229882 4756 scope.go:117] "RemoveContainer" containerID="49599143ed507b4e0b78b3bc947fe60c84566e490ae2fc67d3a6a3360f4168fa" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.266647 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crlcr"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.271189 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-crlcr"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.275733 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dpqm9"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.278129 4756 scope.go:117] "RemoveContainer" containerID="c4ef9b640030c84adcf3f8d83e0dacb531e12d532de70244f3adaf7816c0baf2" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.280150 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dpqm9"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.283713 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6h66"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.286845 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w6h66"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.289470 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wm9rv"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.292086 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wm9rv"] Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.297744 4756 scope.go:117] "RemoveContainer" containerID="3a0c815fd4a5438cdc2a7f3812dd379016aeb2e8cff3cffb27b97240c8ed6a57" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.315184 4756 scope.go:117] "RemoveContainer" containerID="a594d1535fe50b0e0810f90de0d896de51a3a43797f8f58ff1bf4ff1a97f8d19" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.329237 4756 scope.go:117] "RemoveContainer" containerID="175bae53c0b41360dfffe8fbb20b408fcd6f6cba6f57662dd4134a499c704230" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.345205 4756 
scope.go:117] "RemoveContainer" containerID="a1f226357f47fa0a9e6553f48ba50b538b8a9cf43b9e656fb0ce3e2136510291" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.362627 4756 scope.go:117] "RemoveContainer" containerID="a1b7f1d8ead78b7fdbfc35ba19576984330abbc34adaa14e957409f7cf995a55" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.379390 4756 scope.go:117] "RemoveContainer" containerID="11e98e6644303fe9bb529fc5ba0d3e5258979403f00df73e8d92a3ceb93d9702" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.400005 4756 scope.go:117] "RemoveContainer" containerID="0eb9155162f65e0efd3e588a21e36e3bf343bf430db876646d35bf13d02a85d9" Dec 05 16:20:39 crc kubenswrapper[4756]: I1205 16:20:39.902377 4756 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 05 16:20:40 crc kubenswrapper[4756]: I1205 16:20:40.055636 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" path="/var/lib/kubelet/pods/1bd03756-cb80-40f8-b5eb-c936c80c6c21/volumes" Dec 05 16:20:40 crc kubenswrapper[4756]: I1205 16:20:40.057041 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" path="/var/lib/kubelet/pods/5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d/volumes" Dec 05 16:20:40 crc kubenswrapper[4756]: I1205 16:20:40.058320 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67cf5e8a-2d35-4db9-bcb6-68492f4be191" path="/var/lib/kubelet/pods/67cf5e8a-2d35-4db9-bcb6-68492f4be191/volumes" Dec 05 16:20:40 crc kubenswrapper[4756]: I1205 16:20:40.060027 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" path="/var/lib/kubelet/pods/956dc850-ee41-4d30-b209-2f3270c376a5/volumes" Dec 05 16:20:40 crc kubenswrapper[4756]: I1205 16:20:40.061169 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" path="/var/lib/kubelet/pods/95c208d2-1ef6-4716-8053-ccbabcd7fe59/volumes" Dec 05 16:20:52 crc kubenswrapper[4756]: I1205 16:20:52.181257 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 16:20:52 crc kubenswrapper[4756]: I1205 16:20:52.343304 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 16:20:52 crc kubenswrapper[4756]: I1205 16:20:52.582791 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420245 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7xvln"] Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420746 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420760 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420767 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420773 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420785 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerName="extract-content" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420792 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerName="extract-content" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420798 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420804 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420812 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerName="extract-utilities" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420818 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerName="extract-utilities" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420828 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" containerName="extract-utilities" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420834 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" containerName="extract-utilities" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420841 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerName="extract-utilities" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420846 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerName="extract-utilities" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420857 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerName="extract-utilities" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420863 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerName="extract-utilities" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420871 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerName="extract-content" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420876 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerName="extract-content" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420885 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67cf5e8a-2d35-4db9-bcb6-68492f4be191" containerName="marketplace-operator" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420890 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cf5e8a-2d35-4db9-bcb6-68492f4be191" containerName="marketplace-operator" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420900 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" containerName="extract-content" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420907 4756 
state_mem.go:107] "Deleted CPUSet assignment" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" containerName="extract-content" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420924 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420931 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: E1205 16:20:54.420940 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerName="extract-content" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.420946 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerName="extract-content" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.421039 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="67cf5e8a-2d35-4db9-bcb6-68492f4be191" containerName="marketplace-operator" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.421055 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd03756-cb80-40f8-b5eb-c936c80c6c21" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.421066 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c208d2-1ef6-4716-8053-ccbabcd7fe59" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.421076 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5178d75b-fae4-4b68-8f0c-45b4ac2f6f0d" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.421091 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="956dc850-ee41-4d30-b209-2f3270c376a5" containerName="registry-server" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.423723 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.432588 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.438123 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xvln"] Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.580754 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4jdj5"] Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.580967 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" podUID="b5856492-e637-42d0-b018-6f205e03e897" containerName="controller-manager" containerID="cri-o://708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119" gracePeriod=30 Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.600250 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-utilities\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.600312 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-catalog-content\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.600352 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78stt\" (UniqueName: \"kubernetes.io/projected/549ad202-d7ad-4107-bfdd-af2497550691-kube-api-access-78stt\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.623583 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c9shz"] Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.624805 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.627003 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.635114 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9shz"] Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.701128 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78stt\" (UniqueName: \"kubernetes.io/projected/549ad202-d7ad-4107-bfdd-af2497550691-kube-api-access-78stt\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.701401 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-utilities\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.701484 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-catalog-content\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.701825 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-utilities\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.702013 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-catalog-content\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.705164 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n"] Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.707922 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" podUID="c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" containerName="route-controller-manager" containerID="cri-o://e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f" gracePeriod=30 Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.720746 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78stt\" (UniqueName: \"kubernetes.io/projected/549ad202-d7ad-4107-bfdd-af2497550691-kube-api-access-78stt\") pod \"community-operators-7xvln\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.751986 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.802112 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf4e82d2-e343-4e98-8e73-80e04c22e903-catalog-content\") pod \"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.802193 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf4e82d2-e343-4e98-8e73-80e04c22e903-utilities\") pod \"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.802232 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs5jg\" (UniqueName: \"kubernetes.io/projected/bf4e82d2-e343-4e98-8e73-80e04c22e903-kube-api-access-hs5jg\") pod \"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.904529 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5jg\" (UniqueName: \"kubernetes.io/projected/bf4e82d2-e343-4e98-8e73-80e04c22e903-kube-api-access-hs5jg\") pod \"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.904591 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf4e82d2-e343-4e98-8e73-80e04c22e903-catalog-content\") pod \"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.905200 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf4e82d2-e343-4e98-8e73-80e04c22e903-utilities\") pod \"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.905834 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf4e82d2-e343-4e98-8e73-80e04c22e903-catalog-content\") pod \"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.905852 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf4e82d2-e343-4e98-8e73-80e04c22e903-utilities\") pod \"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.939291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs5jg\" (UniqueName: \"kubernetes.io/projected/bf4e82d2-e343-4e98-8e73-80e04c22e903-kube-api-access-hs5jg\") pod 
\"certified-operators-c9shz\" (UID: \"bf4e82d2-e343-4e98-8e73-80e04c22e903\") " pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:54 crc kubenswrapper[4756]: I1205 16:20:54.955998 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.005874 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdpfl\" (UniqueName: \"kubernetes.io/projected/b5856492-e637-42d0-b018-6f205e03e897-kube-api-access-jdpfl\") pod \"b5856492-e637-42d0-b018-6f205e03e897\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.006212 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5856492-e637-42d0-b018-6f205e03e897-serving-cert\") pod \"b5856492-e637-42d0-b018-6f205e03e897\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.006276 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-client-ca\") pod \"b5856492-e637-42d0-b018-6f205e03e897\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.006338 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-proxy-ca-bundles\") pod \"b5856492-e637-42d0-b018-6f205e03e897\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.006364 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-config\") pod \"b5856492-e637-42d0-b018-6f205e03e897\" (UID: \"b5856492-e637-42d0-b018-6f205e03e897\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.007027 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5856492-e637-42d0-b018-6f205e03e897" (UID: "b5856492-e637-42d0-b018-6f205e03e897"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.007078 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b5856492-e637-42d0-b018-6f205e03e897" (UID: "b5856492-e637-42d0-b018-6f205e03e897"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.007204 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-config" (OuterVolumeSpecName: "config") pod "b5856492-e637-42d0-b018-6f205e03e897" (UID: "b5856492-e637-42d0-b018-6f205e03e897"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.010816 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5856492-e637-42d0-b018-6f205e03e897-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5856492-e637-42d0-b018-6f205e03e897" (UID: "b5856492-e637-42d0-b018-6f205e03e897"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.011132 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5856492-e637-42d0-b018-6f205e03e897-kube-api-access-jdpfl" (OuterVolumeSpecName: "kube-api-access-jdpfl") pod "b5856492-e637-42d0-b018-6f205e03e897" (UID: "b5856492-e637-42d0-b018-6f205e03e897"). InnerVolumeSpecName "kube-api-access-jdpfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.011810 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.091171 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.109754 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5856492-e637-42d0-b018-6f205e03e897-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.109790 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.109799 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.109810 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5856492-e637-42d0-b018-6f205e03e897-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.109821 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdpfl\" (UniqueName: \"kubernetes.io/projected/b5856492-e637-42d0-b018-6f205e03e897-kube-api-access-jdpfl\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.211588 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-client-ca\") pod \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.211707 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75q94\" (UniqueName: \"kubernetes.io/projected/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-kube-api-access-75q94\") pod \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.211747 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-serving-cert\") pod \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.211784 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-config\") pod \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\" (UID: \"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8\") " Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.212641 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" (UID: "c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.213101 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-config" (OuterVolumeSpecName: "config") pod "c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" (UID: "c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.215259 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-kube-api-access-75q94" (OuterVolumeSpecName: "kube-api-access-75q94") pod "c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" (UID: "c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8"). InnerVolumeSpecName "kube-api-access-75q94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.216001 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" (UID: "c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.259136 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xvln"] Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.276802 4756 generic.go:334] "Generic (PLEG): container finished" podID="b5856492-e637-42d0-b018-6f205e03e897" containerID="708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119" exitCode=0 Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.276860 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.276893 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" event={"ID":"b5856492-e637-42d0-b018-6f205e03e897","Type":"ContainerDied","Data":"708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119"} Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.277019 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4jdj5" event={"ID":"b5856492-e637-42d0-b018-6f205e03e897","Type":"ContainerDied","Data":"8fe9f0aec15351831324b8e12be6d738d9ed2c47047cd223629f913c3fab4947"} Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.277051 4756 scope.go:117] "RemoveContainer" containerID="708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.278794 4756 generic.go:334] "Generic (PLEG): container finished" podID="c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" containerID="e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f" exitCode=0 Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.278871 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" event={"ID":"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8","Type":"ContainerDied","Data":"e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f"} Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.278886 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.278904 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n" event={"ID":"c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8","Type":"ContainerDied","Data":"619ab76ee16d01a2eec4fdc118aef8893a35882b67ad63336f3e9039a100fdbc"} Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.281155 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xvln" event={"ID":"549ad202-d7ad-4107-bfdd-af2497550691","Type":"ContainerStarted","Data":"cab59fe67186e58fa604e19efd6aec8829bc37c51a805190106d42b0f2dfd3b0"} Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.297115 4756 scope.go:117] "RemoveContainer" containerID="708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119" Dec 05 16:20:55 crc kubenswrapper[4756]: E1205 16:20:55.297575 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119\": container with ID starting with 708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119 not found: ID does not exist" containerID="708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.297713 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119"} err="failed to get container status \"708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119\": rpc error: code = NotFound desc = could not find container 
\"708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119\": container with ID starting with 708e55edf0cb786a5bcd19244b522351900f1aeb5c97f99dd8ee13733c753119 not found: ID does not exist" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.297825 4756 scope.go:117] "RemoveContainer" containerID="e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.313140 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.313170 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.313183 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75q94\" (UniqueName: \"kubernetes.io/projected/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-kube-api-access-75q94\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.313194 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.333898 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4jdj5"] Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.336177 4756 scope.go:117] "RemoveContainer" containerID="e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f" Dec 05 16:20:55 crc kubenswrapper[4756]: E1205 16:20:55.337010 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f\": container with ID starting with e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f not found: ID does not exist" containerID="e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.337057 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f"} err="failed to get container status \"e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f\": rpc error: code = NotFound desc = could not find container \"e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f\": container with ID starting with e1a23a8732fa23534d18ddccc53c42a3c957a8ef4567cfbf5b6b2404ccdc4b1f not found: ID does not exist" Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.338196 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4jdj5"] Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.343582 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n"] Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.346229 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q296n"] Dec 05 16:20:55 crc kubenswrapper[4756]: I1205 16:20:55.452382 4756 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-c9shz"] Dec 05 16:20:55 crc kubenswrapper[4756]: W1205 16:20:55.500571 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4e82d2_e343_4e98_8e73_80e04c22e903.slice/crio-ae9521aa8e1a3e76f34a066c206870e8d480530de2e3801ca863c08e2b7ef955 WatchSource:0}: Error finding container ae9521aa8e1a3e76f34a066c206870e8d480530de2e3801ca863c08e2b7ef955: Status 404 returned error can't find the container with id ae9521aa8e1a3e76f34a066c206870e8d480530de2e3801ca863c08e2b7ef955 Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.055991 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5856492-e637-42d0-b018-6f205e03e897" path="/var/lib/kubelet/pods/b5856492-e637-42d0-b018-6f205e03e897/volumes" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.057943 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" path="/var/lib/kubelet/pods/c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8/volumes" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.085041 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-zlmjx"] Dec 05 16:20:56 crc kubenswrapper[4756]: E1205 16:20:56.085275 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5856492-e637-42d0-b018-6f205e03e897" containerName="controller-manager" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.085288 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5856492-e637-42d0-b018-6f205e03e897" containerName="controller-manager" Dec 05 16:20:56 crc kubenswrapper[4756]: E1205 16:20:56.085302 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" containerName="route-controller-manager" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.085309 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" containerName="route-controller-manager" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.085398 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c598f4bd-a7e1-46b1-b886-2dda3dbdc0f8" containerName="route-controller-manager" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.085436 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5856492-e637-42d0-b018-6f205e03e897" containerName="controller-manager" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.085883 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.086251 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g"] Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.088821 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.089601 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.089718 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.089810 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.090464 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g"] Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.091603 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.091816 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.092330 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.092672 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.092929 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.093124 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.093269 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.093511 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.094271 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-zlmjx"] Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.094346 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.102853 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.223789 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-config\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.223850 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf9s5\" 
(UniqueName: \"kubernetes.io/projected/9a252b14-0959-4fa8-8dd7-224dbfa801b8-kube-api-access-kf9s5\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.223900 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-client-ca\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.223935 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-client-ca\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.223957 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a13e827-7142-46be-a9ec-9c5e34804160-serving-cert\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.223981 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-proxy-ca-bundles\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.224007 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5d2\" (UniqueName: \"kubernetes.io/projected/7a13e827-7142-46be-a9ec-9c5e34804160-kube-api-access-pb5d2\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.224030 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a252b14-0959-4fa8-8dd7-224dbfa801b8-serving-cert\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.224069 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-config\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.287695 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="bf4e82d2-e343-4e98-8e73-80e04c22e903" containerID="3fdffde472a87c546fa8231797bf9b157befde2de13d6ca75c3b78f95c924c86" exitCode=0 Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.287757 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9shz" event={"ID":"bf4e82d2-e343-4e98-8e73-80e04c22e903","Type":"ContainerDied","Data":"3fdffde472a87c546fa8231797bf9b157befde2de13d6ca75c3b78f95c924c86"} Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.287811 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9shz" event={"ID":"bf4e82d2-e343-4e98-8e73-80e04c22e903","Type":"ContainerStarted","Data":"ae9521aa8e1a3e76f34a066c206870e8d480530de2e3801ca863c08e2b7ef955"} Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.292731 4756 generic.go:334] "Generic (PLEG): container finished" podID="549ad202-d7ad-4107-bfdd-af2497550691" containerID="6dc4ea09ebbcc108c1d8ab55b287b53b762ac0115be10ae7fd3144eb069beecf" exitCode=0 Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.292773 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xvln" event={"ID":"549ad202-d7ad-4107-bfdd-af2497550691","Type":"ContainerDied","Data":"6dc4ea09ebbcc108c1d8ab55b287b53b762ac0115be10ae7fd3144eb069beecf"} Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.325144 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-client-ca\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.325208 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-client-ca\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.325230 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a13e827-7142-46be-a9ec-9c5e34804160-serving-cert\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.325254 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-proxy-ca-bundles\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.325284 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5d2\" (UniqueName: \"kubernetes.io/projected/7a13e827-7142-46be-a9ec-9c5e34804160-kube-api-access-pb5d2\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: 
I1205 16:20:56.325305 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a252b14-0959-4fa8-8dd7-224dbfa801b8-serving-cert\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.325711 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-config\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.325757 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-config\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.325817 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf9s5\" (UniqueName: \"kubernetes.io/projected/9a252b14-0959-4fa8-8dd7-224dbfa801b8-kube-api-access-kf9s5\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.326262 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-client-ca\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.326340 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-client-ca\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.326866 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-proxy-ca-bundles\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.327137 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-config\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.327299 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-config\") pod 
\"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.340388 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a252b14-0959-4fa8-8dd7-224dbfa801b8-serving-cert\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.342443 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5d2\" (UniqueName: \"kubernetes.io/projected/7a13e827-7142-46be-a9ec-9c5e34804160-kube-api-access-pb5d2\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.342915 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a13e827-7142-46be-a9ec-9c5e34804160-serving-cert\") pod \"route-controller-manager-7c7f6d8788-96k4g\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.351252 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf9s5\" (UniqueName: \"kubernetes.io/projected/9a252b14-0959-4fa8-8dd7-224dbfa801b8-kube-api-access-kf9s5\") pod \"controller-manager-67c9f74866-zlmjx\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.423545 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.432559 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.659716 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-zlmjx"] Dec 05 16:20:56 crc kubenswrapper[4756]: W1205 16:20:56.671437 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a252b14_0959_4fa8_8dd7_224dbfa801b8.slice/crio-ed107cc50d46ec0475cb2cabbbf9ba48ec7361cc3659ecdac25400e19de4228b WatchSource:0}: Error finding container ed107cc50d46ec0475cb2cabbbf9ba48ec7361cc3659ecdac25400e19de4228b: Status 404 returned error can't find the container with id ed107cc50d46ec0475cb2cabbbf9ba48ec7361cc3659ecdac25400e19de4228b Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.696958 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g"] Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.821361 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m8tg4"] Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.822929 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.825192 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.841612 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8tg4"] Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.933098 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fv9s\" (UniqueName: \"kubernetes.io/projected/1e18238d-9140-4421-a912-377c9c626b83-kube-api-access-9fv9s\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.933142 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e18238d-9140-4421-a912-377c9c626b83-catalog-content\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:56 crc kubenswrapper[4756]: I1205 16:20:56.933181 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e18238d-9140-4421-a912-377c9c626b83-utilities\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.024138 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-trs6s"] Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.025173 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.027522 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.034703 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fv9s\" (UniqueName: \"kubernetes.io/projected/1e18238d-9140-4421-a912-377c9c626b83-kube-api-access-9fv9s\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.034736 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e18238d-9140-4421-a912-377c9c626b83-catalog-content\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.034774 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e18238d-9140-4421-a912-377c9c626b83-utilities\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.035218 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e18238d-9140-4421-a912-377c9c626b83-utilities\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.035272 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-trs6s"] Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.035697 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e18238d-9140-4421-a912-377c9c626b83-catalog-content\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.064347 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fv9s\" (UniqueName: \"kubernetes.io/projected/1e18238d-9140-4421-a912-377c9c626b83-kube-api-access-9fv9s\") pod \"redhat-marketplace-m8tg4\" (UID: \"1e18238d-9140-4421-a912-377c9c626b83\") " pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.117094 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.135535 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnqp\" (UniqueName: \"kubernetes.io/projected/fd51e2e3-9449-49c9-9ac0-a8496f61265d-kube-api-access-wtnqp\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.135576 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd51e2e3-9449-49c9-9ac0-a8496f61265d-catalog-content\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.135724 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd51e2e3-9449-49c9-9ac0-a8496f61265d-utilities\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.146682 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.237221 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnqp\" (UniqueName: \"kubernetes.io/projected/fd51e2e3-9449-49c9-9ac0-a8496f61265d-kube-api-access-wtnqp\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.237773 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd51e2e3-9449-49c9-9ac0-a8496f61265d-catalog-content\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.237860 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd51e2e3-9449-49c9-9ac0-a8496f61265d-utilities\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.238410 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd51e2e3-9449-49c9-9ac0-a8496f61265d-utilities\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.238442 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd51e2e3-9449-49c9-9ac0-a8496f61265d-catalog-content\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.260270 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnqp\" (UniqueName: \"kubernetes.io/projected/fd51e2e3-9449-49c9-9ac0-a8496f61265d-kube-api-access-wtnqp\") pod \"redhat-operators-trs6s\" (UID: \"fd51e2e3-9449-49c9-9ac0-a8496f61265d\") " pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.299524 4756 generic.go:334] "Generic (PLEG): container finished" podID="bf4e82d2-e343-4e98-8e73-80e04c22e903" containerID="44fc5a5b9f6f524c06c6f96d299e86b681980494870f46cc4b7ff16043afb79a" exitCode=0 Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.299588 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-c9shz" event={"ID":"bf4e82d2-e343-4e98-8e73-80e04c22e903","Type":"ContainerDied","Data":"44fc5a5b9f6f524c06c6f96d299e86b681980494870f46cc4b7ff16043afb79a"} Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.303029 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xvln" event={"ID":"549ad202-d7ad-4107-bfdd-af2497550691","Type":"ContainerStarted","Data":"b5272825c9f1bfbdd67a7c3a1f7e24d45f298013f9d4789707594800ab6d7a1b"} Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.305604 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" event={"ID":"9a252b14-0959-4fa8-8dd7-224dbfa801b8","Type":"ContainerStarted","Data":"8bae1628c3853fe5d6e3d99d0d5741a3787dfca55898f5096ad77e17efa10ca5"} Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.305642 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" event={"ID":"9a252b14-0959-4fa8-8dd7-224dbfa801b8","Type":"ContainerStarted","Data":"ed107cc50d46ec0475cb2cabbbf9ba48ec7361cc3659ecdac25400e19de4228b"} Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.306220 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.307927 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" event={"ID":"7a13e827-7142-46be-a9ec-9c5e34804160","Type":"ContainerStarted","Data":"8aa6d61e37cb62b45ed4bfaa89bb2b84ff0a2ad1bf63bacb9b67f6498ffe55fe"} Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.307957 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" event={"ID":"7a13e827-7142-46be-a9ec-9c5e34804160","Type":"ContainerStarted","Data":"bf849694596f33f7fd682990fd5dba702cc74dc365dcb92e06aa2a5a1beca59a"} Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.308130 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.313102 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.340740 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.348751 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.366011 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" podStartSLOduration=3.365994569 podStartE2EDuration="3.365994569s" podCreationTimestamp="2025-12-05 16:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:57.363234582 +0000 UTC m=+317.601908124" watchObservedRunningTime="2025-12-05 16:20:57.365994569 +0000 UTC m=+317.604668091" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.455750 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" podStartSLOduration=3.45573408 podStartE2EDuration="3.45573408s" podCreationTimestamp="2025-12-05 16:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:20:57.424673088 +0000 UTC m=+317.663346600" watchObservedRunningTime="2025-12-05 16:20:57.45573408 +0000 UTC m=+317.694407592" Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.677316 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8tg4"] Dec 05 16:20:57 crc kubenswrapper[4756]: W1205 16:20:57.684741 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e18238d_9140_4421_a912_377c9c626b83.slice/crio-89dcee5014c2f3aac7e113594be9e955c5f5fdc33d11b30ff97f5794e55f0940 WatchSource:0}: Error finding container 89dcee5014c2f3aac7e113594be9e955c5f5fdc33d11b30ff97f5794e55f0940: Status 404 returned error can't find the container with id 89dcee5014c2f3aac7e113594be9e955c5f5fdc33d11b30ff97f5794e55f0940 Dec 05 16:20:57 crc kubenswrapper[4756]: I1205 16:20:57.834627 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-trs6s"] Dec 05 16:20:57 crc kubenswrapper[4756]: W1205 16:20:57.845032 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd51e2e3_9449_49c9_9ac0_a8496f61265d.slice/crio-7cd755f1866a0854925472fc67db62c0770ef3fa6d3f933379d4aff9171ce221 WatchSource:0}: Error finding container 7cd755f1866a0854925472fc67db62c0770ef3fa6d3f933379d4aff9171ce221: Status 404 returned error can't find the container with id 7cd755f1866a0854925472fc67db62c0770ef3fa6d3f933379d4aff9171ce221 Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.315652 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9shz" event={"ID":"bf4e82d2-e343-4e98-8e73-80e04c22e903","Type":"ContainerStarted","Data":"d5a64978ddb96dca5e34ea047d464e3aaa5b77ee99e0c4ed21cb063b2336a394"} Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.316955 4756 generic.go:334] "Generic (PLEG): container finished" podID="fd51e2e3-9449-49c9-9ac0-a8496f61265d" containerID="f0e3160ac6af17a57106d3dc7e49ea8109a6867c7809032defe21192167e0acb" exitCode=0 Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.317034 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trs6s" 
event={"ID":"fd51e2e3-9449-49c9-9ac0-a8496f61265d","Type":"ContainerDied","Data":"f0e3160ac6af17a57106d3dc7e49ea8109a6867c7809032defe21192167e0acb"} Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.317063 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trs6s" event={"ID":"fd51e2e3-9449-49c9-9ac0-a8496f61265d","Type":"ContainerStarted","Data":"7cd755f1866a0854925472fc67db62c0770ef3fa6d3f933379d4aff9171ce221"} Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.319763 4756 generic.go:334] "Generic (PLEG): container finished" podID="549ad202-d7ad-4107-bfdd-af2497550691" containerID="b5272825c9f1bfbdd67a7c3a1f7e24d45f298013f9d4789707594800ab6d7a1b" exitCode=0 Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.319824 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xvln" event={"ID":"549ad202-d7ad-4107-bfdd-af2497550691","Type":"ContainerDied","Data":"b5272825c9f1bfbdd67a7c3a1f7e24d45f298013f9d4789707594800ab6d7a1b"} Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.321562 4756 generic.go:334] "Generic (PLEG): container finished" podID="1e18238d-9140-4421-a912-377c9c626b83" containerID="b7964cfb0f7e97b80098088902e9301e4df6f752dbb091132db12dfedce01bb2" exitCode=0 Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.321617 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8tg4" event={"ID":"1e18238d-9140-4421-a912-377c9c626b83","Type":"ContainerDied","Data":"b7964cfb0f7e97b80098088902e9301e4df6f752dbb091132db12dfedce01bb2"} Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.321664 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8tg4" event={"ID":"1e18238d-9140-4421-a912-377c9c626b83","Type":"ContainerStarted","Data":"89dcee5014c2f3aac7e113594be9e955c5f5fdc33d11b30ff97f5794e55f0940"} Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.335926 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c9shz" podStartSLOduration=2.901511723 podStartE2EDuration="4.335905968s" podCreationTimestamp="2025-12-05 16:20:54 +0000 UTC" firstStartedPulling="2025-12-05 16:20:56.289541302 +0000 UTC m=+316.528214814" lastFinishedPulling="2025-12-05 16:20:57.723935547 +0000 UTC m=+317.962609059" observedRunningTime="2025-12-05 16:20:58.332945146 +0000 UTC m=+318.571618658" watchObservedRunningTime="2025-12-05 16:20:58.335905968 +0000 UTC m=+318.574579540" Dec 05 16:20:58 crc kubenswrapper[4756]: I1205 16:20:58.863540 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 16:20:59 crc kubenswrapper[4756]: I1205 16:20:59.329040 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xvln" event={"ID":"549ad202-d7ad-4107-bfdd-af2497550691","Type":"ContainerStarted","Data":"7a881dd99491d30d026324bc5b0aae45eea3d7100727e0fb31c3392bc7c678a8"} Dec 05 16:20:59 crc kubenswrapper[4756]: I1205 16:20:59.331323 4756 generic.go:334] "Generic (PLEG): container finished" podID="1e18238d-9140-4421-a912-377c9c626b83" containerID="89505c8368c061f00de328fb5f0b4437d6627a0253cfd0b3b9fb173a13ec6a0d" exitCode=0 Dec 05 16:20:59 crc kubenswrapper[4756]: I1205 16:20:59.331432 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8tg4" 
event={"ID":"1e18238d-9140-4421-a912-377c9c626b83","Type":"ContainerDied","Data":"89505c8368c061f00de328fb5f0b4437d6627a0253cfd0b3b9fb173a13ec6a0d"} Dec 05 16:20:59 crc kubenswrapper[4756]: I1205 16:20:59.334519 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trs6s" event={"ID":"fd51e2e3-9449-49c9-9ac0-a8496f61265d","Type":"ContainerStarted","Data":"4920d6208b52969416d4729332770cc059b63b107c9a4d2b39223826d0675ddf"} Dec 05 16:20:59 crc kubenswrapper[4756]: I1205 16:20:59.352434 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7xvln" podStartSLOduration=2.824172436 podStartE2EDuration="5.352399031s" podCreationTimestamp="2025-12-05 16:20:54 +0000 UTC" firstStartedPulling="2025-12-05 16:20:56.294269213 +0000 UTC m=+316.532942725" lastFinishedPulling="2025-12-05 16:20:58.822495808 +0000 UTC m=+319.061169320" observedRunningTime="2025-12-05 16:20:59.350263741 +0000 UTC m=+319.588937283" watchObservedRunningTime="2025-12-05 16:20:59.352399031 +0000 UTC m=+319.591072543" Dec 05 16:21:00 crc kubenswrapper[4756]: I1205 16:21:00.339796 4756 generic.go:334] "Generic (PLEG): container finished" podID="fd51e2e3-9449-49c9-9ac0-a8496f61265d" containerID="4920d6208b52969416d4729332770cc059b63b107c9a4d2b39223826d0675ddf" exitCode=0 Dec 05 16:21:00 crc kubenswrapper[4756]: I1205 16:21:00.339844 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trs6s" event={"ID":"fd51e2e3-9449-49c9-9ac0-a8496f61265d","Type":"ContainerDied","Data":"4920d6208b52969416d4729332770cc059b63b107c9a4d2b39223826d0675ddf"} Dec 05 16:21:00 crc kubenswrapper[4756]: I1205 16:21:00.342386 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8tg4" event={"ID":"1e18238d-9140-4421-a912-377c9c626b83","Type":"ContainerStarted","Data":"82599506da1f10cbc0ea147045421535b65981b70b4931056f084100b688ffc2"} Dec 05 16:21:00 crc kubenswrapper[4756]: I1205 16:21:00.395977 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m8tg4" podStartSLOduration=2.97884825 podStartE2EDuration="4.395954475s" podCreationTimestamp="2025-12-05 16:20:56 +0000 UTC" firstStartedPulling="2025-12-05 16:20:58.323182345 +0000 UTC m=+318.561855857" lastFinishedPulling="2025-12-05 16:20:59.74028857 +0000 UTC m=+319.978962082" observedRunningTime="2025-12-05 16:21:00.377986476 +0000 UTC m=+320.616659998" watchObservedRunningTime="2025-12-05 16:21:00.395954475 +0000 UTC m=+320.634627987" Dec 05 16:21:01 crc kubenswrapper[4756]: I1205 16:21:01.353985 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trs6s" event={"ID":"fd51e2e3-9449-49c9-9ac0-a8496f61265d","Type":"ContainerStarted","Data":"e404e8efd582f7938637fbc6acc0880938a8d29ebb5f52af9fa57ff1174cad5e"} Dec 05 16:21:01 crc kubenswrapper[4756]: I1205 16:21:01.374059 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-trs6s" podStartSLOduration=1.955940013 podStartE2EDuration="4.3740437s" podCreationTimestamp="2025-12-05 16:20:57 +0000 UTC" firstStartedPulling="2025-12-05 16:20:58.319167303 +0000 UTC m=+318.557840815" lastFinishedPulling="2025-12-05 16:21:00.73727099 +0000 UTC m=+320.975944502" observedRunningTime="2025-12-05 16:21:01.371761977 +0000 UTC m=+321.610435489" watchObservedRunningTime="2025-12-05 16:21:01.3740437 +0000 UTC 
m=+321.612717212" Dec 05 16:21:04 crc kubenswrapper[4756]: I1205 16:21:04.752354 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:21:04 crc kubenswrapper[4756]: I1205 16:21:04.753786 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:21:04 crc kubenswrapper[4756]: I1205 16:21:04.807739 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:21:05 crc kubenswrapper[4756]: I1205 16:21:05.012821 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:21:05 crc kubenswrapper[4756]: I1205 16:21:05.012879 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:21:05 crc kubenswrapper[4756]: I1205 16:21:05.054547 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:21:05 crc kubenswrapper[4756]: I1205 16:21:05.418498 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c9shz" Dec 05 16:21:05 crc kubenswrapper[4756]: I1205 16:21:05.425184 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7xvln" Dec 05 16:21:05 crc kubenswrapper[4756]: I1205 16:21:05.990476 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 16:21:07 crc kubenswrapper[4756]: I1205 16:21:07.147630 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:21:07 crc kubenswrapper[4756]: I1205 16:21:07.147996 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:21:07 crc kubenswrapper[4756]: I1205 16:21:07.197942 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:21:07 crc kubenswrapper[4756]: I1205 16:21:07.349933 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:21:07 crc kubenswrapper[4756]: I1205 16:21:07.349990 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:21:07 crc kubenswrapper[4756]: I1205 16:21:07.386293 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:21:07 crc kubenswrapper[4756]: I1205 16:21:07.421947 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-trs6s" Dec 05 16:21:07 crc kubenswrapper[4756]: I1205 16:21:07.432600 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m8tg4" Dec 05 16:21:14 crc kubenswrapper[4756]: I1205 16:21:14.615295 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g"] Dec 05 16:21:14 crc kubenswrapper[4756]: I1205 16:21:14.615747 4756 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" podUID="7a13e827-7142-46be-a9ec-9c5e34804160" containerName="route-controller-manager" containerID="cri-o://8aa6d61e37cb62b45ed4bfaa89bb2b84ff0a2ad1bf63bacb9b67f6498ffe55fe" gracePeriod=30 Dec 05 16:21:14 crc kubenswrapper[4756]: I1205 16:21:14.618600 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-zlmjx"] Dec 05 16:21:14 crc kubenswrapper[4756]: I1205 16:21:14.618809 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" podUID="9a252b14-0959-4fa8-8dd7-224dbfa801b8" containerName="controller-manager" containerID="cri-o://8bae1628c3853fe5d6e3d99d0d5741a3787dfca55898f5096ad77e17efa10ca5" gracePeriod=30 Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.431311 4756 generic.go:334] "Generic (PLEG): container finished" podID="7a13e827-7142-46be-a9ec-9c5e34804160" containerID="8aa6d61e37cb62b45ed4bfaa89bb2b84ff0a2ad1bf63bacb9b67f6498ffe55fe" exitCode=0 Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.431639 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" event={"ID":"7a13e827-7142-46be-a9ec-9c5e34804160","Type":"ContainerDied","Data":"8aa6d61e37cb62b45ed4bfaa89bb2b84ff0a2ad1bf63bacb9b67f6498ffe55fe"} Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.432862 4756 generic.go:334] "Generic (PLEG): container finished" podID="9a252b14-0959-4fa8-8dd7-224dbfa801b8" containerID="8bae1628c3853fe5d6e3d99d0d5741a3787dfca55898f5096ad77e17efa10ca5" exitCode=0 Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.432899 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" event={"ID":"9a252b14-0959-4fa8-8dd7-224dbfa801b8","Type":"ContainerDied","Data":"8bae1628c3853fe5d6e3d99d0d5741a3787dfca55898f5096ad77e17efa10ca5"} Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.691007 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.730076 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn"] Dec 05 16:21:15 crc kubenswrapper[4756]: E1205 16:21:15.730321 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a13e827-7142-46be-a9ec-9c5e34804160" containerName="route-controller-manager" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.730342 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a13e827-7142-46be-a9ec-9c5e34804160" containerName="route-controller-manager" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.730520 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a13e827-7142-46be-a9ec-9c5e34804160" containerName="route-controller-manager" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.730948 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.735162 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn"] Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.765510 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb5d2\" (UniqueName: \"kubernetes.io/projected/7a13e827-7142-46be-a9ec-9c5e34804160-kube-api-access-pb5d2\") pod \"7a13e827-7142-46be-a9ec-9c5e34804160\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.765583 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-config\") pod \"7a13e827-7142-46be-a9ec-9c5e34804160\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.765688 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-client-ca\") pod \"7a13e827-7142-46be-a9ec-9c5e34804160\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.765761 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a13e827-7142-46be-a9ec-9c5e34804160-serving-cert\") pod \"7a13e827-7142-46be-a9ec-9c5e34804160\" (UID: \"7a13e827-7142-46be-a9ec-9c5e34804160\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.765998 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b73b55a-5190-4be7-ad98-b9b62ca765c0-serving-cert\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.766047 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b73b55a-5190-4be7-ad98-b9b62ca765c0-client-ca\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.766084 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b73b55a-5190-4be7-ad98-b9b62ca765c0-config\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.766146 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j56gd\" (UniqueName: \"kubernetes.io/projected/8b73b55a-5190-4be7-ad98-b9b62ca765c0-kube-api-access-j56gd\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc 
kubenswrapper[4756]: I1205 16:21:15.766628 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-client-ca" (OuterVolumeSpecName: "client-ca") pod "7a13e827-7142-46be-a9ec-9c5e34804160" (UID: "7a13e827-7142-46be-a9ec-9c5e34804160"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.766649 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-config" (OuterVolumeSpecName: "config") pod "7a13e827-7142-46be-a9ec-9c5e34804160" (UID: "7a13e827-7142-46be-a9ec-9c5e34804160"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.773561 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a13e827-7142-46be-a9ec-9c5e34804160-kube-api-access-pb5d2" (OuterVolumeSpecName: "kube-api-access-pb5d2") pod "7a13e827-7142-46be-a9ec-9c5e34804160" (UID: "7a13e827-7142-46be-a9ec-9c5e34804160"). InnerVolumeSpecName "kube-api-access-pb5d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.773643 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a13e827-7142-46be-a9ec-9c5e34804160-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7a13e827-7142-46be-a9ec-9c5e34804160" (UID: "7a13e827-7142-46be-a9ec-9c5e34804160"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.816763 4756 util.go:48] "No ready sandbox for pod can be found. 
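Teardown for a deleted pod runs per volume through "operationExecutor.UnmountVolume started" (reconciler_common.go:159), "UnmountVolume.TearDown succeeded" (operation_generator.go:803), and finally "Volume detached" (reconciler_common.go:293). A sketch that checks every volume of one pod UID reached all three stages; the default UID is the route-controller-manager pod being torn down above, the message formats are taken from these lines, and stdin input is assumed:

    # Sketch: verify each volume of a deleted pod completed teardown.
    # Stage names/formats are copied from the entries above; pass the
    # pod UID as argv[1] (defaults to the pod torn down above).
    import re
    import sys

    UID = sys.argv[1] if len(sys.argv) > 1 else "7a13e827-7142-46be-a9ec-9c5e34804160"
    STAGES = {
        "unmount started": re.compile(r'UnmountVolume started for volume \\?"([^"\\]+)\\?"'),
        "teardown done": re.compile(r'TearDown succeeded for volume .*\(OuterVolumeSpecName: "([^"]+)"\)'),
        "detached": re.compile(r'Volume detached for volume \\?"([^"\\]+)\\?"'),
    }

    seen = {}
    for line in sys.stdin:
        if UID not in line:
            continue
        for stage, rx in STAGES.items():
            m = rx.search(line)
            if m:
                seen.setdefault(m.group(1), set()).add(stage)

    for vol, stages in sorted(seen.items()):
        missing = set(STAGES) - stages
        print(f"{vol}: {'complete' if not missing else 'missing ' + ', '.join(sorted(missing))}")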
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.867428 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf9s5\" (UniqueName: \"kubernetes.io/projected/9a252b14-0959-4fa8-8dd7-224dbfa801b8-kube-api-access-kf9s5\") pod \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.867684 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-config\") pod \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.867741 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a252b14-0959-4fa8-8dd7-224dbfa801b8-serving-cert\") pod \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.867774 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-client-ca\") pod \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.867810 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-proxy-ca-bundles\") pod \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\" (UID: \"9a252b14-0959-4fa8-8dd7-224dbfa801b8\") " Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.867980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j56gd\" (UniqueName: \"kubernetes.io/projected/8b73b55a-5190-4be7-ad98-b9b62ca765c0-kube-api-access-j56gd\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.868047 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b73b55a-5190-4be7-ad98-b9b62ca765c0-serving-cert\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.868065 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b73b55a-5190-4be7-ad98-b9b62ca765c0-client-ca\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.868084 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b73b55a-5190-4be7-ad98-b9b62ca765c0-config\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " 
pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.868115 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb5d2\" (UniqueName: \"kubernetes.io/projected/7a13e827-7142-46be-a9ec-9c5e34804160-kube-api-access-pb5d2\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.868125 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.868134 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a13e827-7142-46be-a9ec-9c5e34804160-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.868143 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a13e827-7142-46be-a9ec-9c5e34804160-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.869229 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b73b55a-5190-4be7-ad98-b9b62ca765c0-config\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.870672 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b73b55a-5190-4be7-ad98-b9b62ca765c0-client-ca\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.870812 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "9a252b14-0959-4fa8-8dd7-224dbfa801b8" (UID: "9a252b14-0959-4fa8-8dd7-224dbfa801b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.870891 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9a252b14-0959-4fa8-8dd7-224dbfa801b8" (UID: "9a252b14-0959-4fa8-8dd7-224dbfa801b8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.870847 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-config" (OuterVolumeSpecName: "config") pod "9a252b14-0959-4fa8-8dd7-224dbfa801b8" (UID: "9a252b14-0959-4fa8-8dd7-224dbfa801b8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.876283 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a252b14-0959-4fa8-8dd7-224dbfa801b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9a252b14-0959-4fa8-8dd7-224dbfa801b8" (UID: "9a252b14-0959-4fa8-8dd7-224dbfa801b8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.882737 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a252b14-0959-4fa8-8dd7-224dbfa801b8-kube-api-access-kf9s5" (OuterVolumeSpecName: "kube-api-access-kf9s5") pod "9a252b14-0959-4fa8-8dd7-224dbfa801b8" (UID: "9a252b14-0959-4fa8-8dd7-224dbfa801b8"). InnerVolumeSpecName "kube-api-access-kf9s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.888673 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b73b55a-5190-4be7-ad98-b9b62ca765c0-serving-cert\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.890455 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j56gd\" (UniqueName: \"kubernetes.io/projected/8b73b55a-5190-4be7-ad98-b9b62ca765c0-kube-api-access-j56gd\") pod \"route-controller-manager-77f7895c56-88vcn\" (UID: \"8b73b55a-5190-4be7-ad98-b9b62ca765c0\") " pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.968979 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.969016 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf9s5\" (UniqueName: \"kubernetes.io/projected/9a252b14-0959-4fa8-8dd7-224dbfa801b8-kube-api-access-kf9s5\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.969027 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.969036 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a252b14-0959-4fa8-8dd7-224dbfa801b8-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:15 crc kubenswrapper[4756]: I1205 16:21:15.969045 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a252b14-0959-4fa8-8dd7-224dbfa801b8-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.050366 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.440434 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" event={"ID":"9a252b14-0959-4fa8-8dd7-224dbfa801b8","Type":"ContainerDied","Data":"ed107cc50d46ec0475cb2cabbbf9ba48ec7361cc3659ecdac25400e19de4228b"} Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.440500 4756 scope.go:117] "RemoveContainer" containerID="8bae1628c3853fe5d6e3d99d0d5741a3787dfca55898f5096ad77e17efa10ca5" Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.440501 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9f74866-zlmjx" Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.445558 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" event={"ID":"7a13e827-7142-46be-a9ec-9c5e34804160","Type":"ContainerDied","Data":"bf849694596f33f7fd682990fd5dba702cc74dc365dcb92e06aa2a5a1beca59a"} Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.445658 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g" Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.473593 4756 scope.go:117] "RemoveContainer" containerID="8aa6d61e37cb62b45ed4bfaa89bb2b84ff0a2ad1bf63bacb9b67f6498ffe55fe" Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.487154 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-zlmjx"] Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.496339 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-zlmjx"] Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.509970 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g"] Dec 05 16:21:16 crc kubenswrapper[4756]: W1205 16:21:16.512076 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b73b55a_5190_4be7_ad98_b9b62ca765c0.slice/crio-3b95a8b149d4a0d7a7e4e12b884976223e77da3fa2a77a72f062e929a32dbdc2 WatchSource:0}: Error finding container 3b95a8b149d4a0d7a7e4e12b884976223e77da3fa2a77a72f062e929a32dbdc2: Status 404 returned error can't find the container with id 3b95a8b149d4a0d7a7e4e12b884976223e77da3fa2a77a72f062e929a32dbdc2 Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.512558 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c7f6d8788-96k4g"] Dec 05 16:21:16 crc kubenswrapper[4756]: I1205 16:21:16.520181 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn"] Dec 05 16:21:17 crc kubenswrapper[4756]: I1205 16:21:17.453754 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" event={"ID":"8b73b55a-5190-4be7-ad98-b9b62ca765c0","Type":"ContainerStarted","Data":"2b52f919ba608e50602e20f7cf7a60f545ff90d094fe25d81d348cb1e5670f65"} Dec 05 16:21:17 crc kubenswrapper[4756]: I1205 16:21:17.454246 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" event={"ID":"8b73b55a-5190-4be7-ad98-b9b62ca765c0","Type":"ContainerStarted","Data":"3b95a8b149d4a0d7a7e4e12b884976223e77da3fa2a77a72f062e929a32dbdc2"} Dec 05 16:21:17 crc kubenswrapper[4756]: I1205 16:21:17.454289 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:17 crc kubenswrapper[4756]: I1205 16:21:17.467706 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" Dec 05 16:21:17 crc kubenswrapper[4756]: I1205 16:21:17.483280 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77f7895c56-88vcn" podStartSLOduration=3.4832626380000002 podStartE2EDuration="3.483262638s" podCreationTimestamp="2025-12-05 16:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:21:17.480028412 +0000 UTC m=+337.718702004" watchObservedRunningTime="2025-12-05 16:21:17.483262638 +0000 UTC m=+337.721936150" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.057360 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a13e827-7142-46be-a9ec-9c5e34804160" path="/var/lib/kubelet/pods/7a13e827-7142-46be-a9ec-9c5e34804160/volumes" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.058317 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a252b14-0959-4fa8-8dd7-224dbfa801b8" path="/var/lib/kubelet/pods/9a252b14-0959-4fa8-8dd7-224dbfa801b8/volumes" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.094764 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c84f94c7d-lgn2p"] Dec 05 16:21:18 crc kubenswrapper[4756]: E1205 16:21:18.095076 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a252b14-0959-4fa8-8dd7-224dbfa801b8" containerName="controller-manager" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.095099 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a252b14-0959-4fa8-8dd7-224dbfa801b8" containerName="controller-manager" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.095253 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a252b14-0959-4fa8-8dd7-224dbfa801b8" containerName="controller-manager" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.095797 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.102048 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.105680 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.113345 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.113353 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.113467 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.114078 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.119599 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c84f94c7d-lgn2p"] Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.121711 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.196492 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06a12f6-5bae-4c42-9365-aee23140523c-serving-cert\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.196545 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-proxy-ca-bundles\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.196644 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-client-ca\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.196668 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpw6h\" (UniqueName: \"kubernetes.io/projected/a06a12f6-5bae-4c42-9365-aee23140523c-kube-api-access-gpw6h\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.196727 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-config\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.298341 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-client-ca\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.298388 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpw6h\" (UniqueName: \"kubernetes.io/projected/a06a12f6-5bae-4c42-9365-aee23140523c-kube-api-access-gpw6h\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.298426 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-config\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.298464 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06a12f6-5bae-4c42-9365-aee23140523c-serving-cert\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.298481 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-proxy-ca-bundles\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.299230 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-client-ca\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.299632 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-proxy-ca-bundles\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.299736 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06a12f6-5bae-4c42-9365-aee23140523c-config\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc 
kubenswrapper[4756]: I1205 16:21:18.303060 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06a12f6-5bae-4c42-9365-aee23140523c-serving-cert\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.316284 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpw6h\" (UniqueName: \"kubernetes.io/projected/a06a12f6-5bae-4c42-9365-aee23140523c-kube-api-access-gpw6h\") pod \"controller-manager-c84f94c7d-lgn2p\" (UID: \"a06a12f6-5bae-4c42-9365-aee23140523c\") " pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.418993 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:18 crc kubenswrapper[4756]: I1205 16:21:18.816937 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c84f94c7d-lgn2p"] Dec 05 16:21:19 crc kubenswrapper[4756]: I1205 16:21:19.463876 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" event={"ID":"a06a12f6-5bae-4c42-9365-aee23140523c","Type":"ContainerStarted","Data":"0e1d96c5162e6d592ba13e8df02f1f3dbcc36fa9b136e0d4be5da975dda89a5c"} Dec 05 16:21:19 crc kubenswrapper[4756]: I1205 16:21:19.464275 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" event={"ID":"a06a12f6-5bae-4c42-9365-aee23140523c","Type":"ContainerStarted","Data":"7153b985f19eb84128552ae540591ea7b79384a57c143db6958ff6fa3b455a03"} Dec 05 16:21:19 crc kubenswrapper[4756]: I1205 16:21:19.483313 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" podStartSLOduration=5.483293501 podStartE2EDuration="5.483293501s" podCreationTimestamp="2025-12-05 16:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:21:19.47985287 +0000 UTC m=+339.718526402" watchObservedRunningTime="2025-12-05 16:21:19.483293501 +0000 UTC m=+339.721967013" Dec 05 16:21:20 crc kubenswrapper[4756]: I1205 16:21:20.468987 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:20 crc kubenswrapper[4756]: I1205 16:21:20.474111 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c84f94c7d-lgn2p" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.615869 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w6mmm"] Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.617008 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.646892 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w6mmm"] Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.671082 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-bound-sa-token\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.671190 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.671362 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.671441 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-registry-tls\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.671475 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-registry-certificates\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.671555 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.671665 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8gnm\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-kube-api-access-x8gnm\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.671698 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-trusted-ca\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.694400 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.772953 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8gnm\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-kube-api-access-x8gnm\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.773000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-trusted-ca\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.773030 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-bound-sa-token\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.773062 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.773109 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.773129 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-registry-tls\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.773149 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-registry-certificates\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.774272 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-registry-certificates\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.775568 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-trusted-ca\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.779091 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.780901 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.780935 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-registry-tls\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.794602 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-bound-sa-token\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.798775 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8gnm\" (UniqueName: \"kubernetes.io/projected/e1bf86ba-4a34-43b6-af95-b4922ed8bc0a-kube-api-access-x8gnm\") pod \"image-registry-66df7c8f76-w6mmm\" (UID: \"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:44 crc kubenswrapper[4756]: I1205 16:21:44.941889 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:45 crc kubenswrapper[4756]: I1205 16:21:45.339268 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w6mmm"] Dec 05 16:21:45 crc kubenswrapper[4756]: I1205 16:21:45.610995 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" event={"ID":"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a","Type":"ContainerStarted","Data":"6a5c4d5cf8151b27effed42183006ce8758f9c2f60585a079ff75be5fe1475ad"} Dec 05 16:21:46 crc kubenswrapper[4756]: I1205 16:21:46.619066 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" event={"ID":"e1bf86ba-4a34-43b6-af95-b4922ed8bc0a","Type":"ContainerStarted","Data":"b4616fd5fd4d244ddd020ffbebfeccfb7130da8a26deb11fde7debbd243b0130"} Dec 05 16:21:46 crc kubenswrapper[4756]: I1205 16:21:46.619259 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:21:46 crc kubenswrapper[4756]: I1205 16:21:46.649473 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" podStartSLOduration=2.649384491 podStartE2EDuration="2.649384491s" podCreationTimestamp="2025-12-05 16:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:21:46.643880619 +0000 UTC m=+366.882554181" watchObservedRunningTime="2025-12-05 16:21:46.649384491 +0000 UTC m=+366.888058043" Dec 05 16:21:52 crc kubenswrapper[4756]: I1205 16:21:52.295078 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:21:52 crc kubenswrapper[4756]: I1205 16:21:52.295531 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:22:04 crc kubenswrapper[4756]: I1205 16:22:04.946914 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-w6mmm" Dec 05 16:22:04 crc kubenswrapper[4756]: I1205 16:22:04.998525 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hxxk6"] Dec 05 16:22:22 crc kubenswrapper[4756]: I1205 16:22:22.295466 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:22:22 crc kubenswrapper[4756]: I1205 16:22:22.296096 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 05 16:22:30 crc kubenswrapper[4756]: I1205 16:22:30.033018 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" podUID="9daff7ba-fe02-4b65-8db3-26fb89ff1c25" containerName="registry" containerID="cri-o://a0556efbe5aba168f5ccb0228a0cd0f138f5cc3098bee6613ca39f6e15bfb069" gracePeriod=30 Dec 05 16:22:32 crc kubenswrapper[4756]: I1205 16:22:32.883335 4756 generic.go:334] "Generic (PLEG): container finished" podID="9daff7ba-fe02-4b65-8db3-26fb89ff1c25" containerID="a0556efbe5aba168f5ccb0228a0cd0f138f5cc3098bee6613ca39f6e15bfb069" exitCode=0 Dec 05 16:22:32 crc kubenswrapper[4756]: I1205 16:22:32.883678 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" event={"ID":"9daff7ba-fe02-4b65-8db3-26fb89ff1c25","Type":"ContainerDied","Data":"a0556efbe5aba168f5ccb0228a0cd0f138f5cc3098bee6613ca39f6e15bfb069"} Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.508235 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.549272 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-trusted-ca\") pod \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.549324 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-tls\") pod \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.549344 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-installation-pull-secrets\") pod \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.549601 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.549653 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-certificates\") pod \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.549687 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9jtw\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-kube-api-access-v9jtw\") pod \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.549715 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-ca-trust-extracted\") pod \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.549737 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-bound-sa-token\") pod \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\" (UID: \"9daff7ba-fe02-4b65-8db3-26fb89ff1c25\") " Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.550322 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9daff7ba-fe02-4b65-8db3-26fb89ff1c25" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.550380 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9daff7ba-fe02-4b65-8db3-26fb89ff1c25" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.563611 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9daff7ba-fe02-4b65-8db3-26fb89ff1c25" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.563631 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9daff7ba-fe02-4b65-8db3-26fb89ff1c25" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.564314 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9daff7ba-fe02-4b65-8db3-26fb89ff1c25" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.565906 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-kube-api-access-v9jtw" (OuterVolumeSpecName: "kube-api-access-v9jtw") pod "9daff7ba-fe02-4b65-8db3-26fb89ff1c25" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25"). InnerVolumeSpecName "kube-api-access-v9jtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.568620 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9daff7ba-fe02-4b65-8db3-26fb89ff1c25" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.572985 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9daff7ba-fe02-4b65-8db3-26fb89ff1c25" (UID: "9daff7ba-fe02-4b65-8db3-26fb89ff1c25"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.650869 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.650900 4756 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.650913 4756 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.650925 4756 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.650933 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9jtw\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-kube-api-access-v9jtw\") on node \"crc\" DevicePath \"\"" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.650942 4756 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.650951 4756 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9daff7ba-fe02-4b65-8db3-26fb89ff1c25-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.890567 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" event={"ID":"9daff7ba-fe02-4b65-8db3-26fb89ff1c25","Type":"ContainerDied","Data":"07636d37957ece3848e8d468083e9438324df0bbc00cef8959144f66f23cc833"} Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.890626 4756 scope.go:117] "RemoveContainer" containerID="a0556efbe5aba168f5ccb0228a0cd0f138f5cc3098bee6613ca39f6e15bfb069" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.890624 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hxxk6" Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.925345 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hxxk6"] Dec 05 16:22:33 crc kubenswrapper[4756]: I1205 16:22:33.928809 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hxxk6"] Dec 05 16:22:34 crc kubenswrapper[4756]: I1205 16:22:34.060610 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9daff7ba-fe02-4b65-8db3-26fb89ff1c25" path="/var/lib/kubelet/pods/9daff7ba-fe02-4b65-8db3-26fb89ff1c25/volumes" Dec 05 16:22:52 crc kubenswrapper[4756]: I1205 16:22:52.295971 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:22:52 crc kubenswrapper[4756]: I1205 16:22:52.296638 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:22:52 crc kubenswrapper[4756]: I1205 16:22:52.296696 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:22:52 crc kubenswrapper[4756]: I1205 16:22:52.297366 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8621722fac0e1b78a92a26f20fd0a72e3bdd08c5777a346841d5033b97d8ac3a"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:22:52 crc kubenswrapper[4756]: I1205 16:22:52.297464 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://8621722fac0e1b78a92a26f20fd0a72e3bdd08c5777a346841d5033b97d8ac3a" gracePeriod=600 Dec 05 16:22:53 crc kubenswrapper[4756]: I1205 16:22:53.018753 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="8621722fac0e1b78a92a26f20fd0a72e3bdd08c5777a346841d5033b97d8ac3a" exitCode=0 Dec 05 16:22:53 crc kubenswrapper[4756]: I1205 16:22:53.018820 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"8621722fac0e1b78a92a26f20fd0a72e3bdd08c5777a346841d5033b97d8ac3a"} Dec 05 16:22:53 crc kubenswrapper[4756]: I1205 16:22:53.018869 4756 scope.go:117] "RemoveContainer" containerID="27713da853112411e4924f6e0d7db122dd6350c3f5fc2dafac243c765aa6700d" Dec 05 16:22:54 crc kubenswrapper[4756]: I1205 16:22:54.026703 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"7338622c696eeea43693c9484face505b7d076a37d939ddfdf63dc83f5acf1ba"} Dec 05 16:25:22 crc kubenswrapper[4756]: I1205 16:25:22.295951 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:25:22 crc kubenswrapper[4756]: I1205 16:25:22.296677 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:25:52 crc kubenswrapper[4756]: I1205 16:25:52.295958 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:25:52 crc kubenswrapper[4756]: I1205 16:25:52.296972 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:26:22 crc kubenswrapper[4756]: I1205 16:26:22.296043 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:26:22 crc kubenswrapper[4756]: I1205 16:26:22.296657 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:26:22 crc kubenswrapper[4756]: I1205 16:26:22.296701 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:26:22 crc kubenswrapper[4756]: I1205 16:26:22.297256 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7338622c696eeea43693c9484face505b7d076a37d939ddfdf63dc83f5acf1ba"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:26:22 crc kubenswrapper[4756]: I1205 16:26:22.297315 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://7338622c696eeea43693c9484face505b7d076a37d939ddfdf63dc83f5acf1ba" gracePeriod=600 Dec 05 16:26:23 crc kubenswrapper[4756]: I1205 16:26:23.217782 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="7338622c696eeea43693c9484face505b7d076a37d939ddfdf63dc83f5acf1ba" exitCode=0 Dec 05 16:26:23 crc kubenswrapper[4756]: I1205 16:26:23.217869 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"7338622c696eeea43693c9484face505b7d076a37d939ddfdf63dc83f5acf1ba"} Dec 05 16:26:23 crc kubenswrapper[4756]: I1205 16:26:23.218414 4756 scope.go:117] "RemoveContainer" containerID="8621722fac0e1b78a92a26f20fd0a72e3bdd08c5777a346841d5033b97d8ac3a" Dec 05 16:26:24 crc kubenswrapper[4756]: I1205 16:26:24.229121 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"af109c606f5763151a202cd55133291df026de6e76c3ec3a043528aa8e422ca0"} Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.557303 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pz5rs"] Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.558225 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovn-controller" containerID="cri-o://bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191" gracePeriod=30 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.558257 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1" gracePeriod=30 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.558336 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="nbdb" containerID="cri-o://c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6" gracePeriod=30 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.558344 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kube-rbac-proxy-node" containerID="cri-o://8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c" gracePeriod=30 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.558384 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovn-acl-logging" containerID="cri-o://24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af" gracePeriod=30 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.558310 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="sbdb" containerID="cri-o://612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36" gracePeriod=30 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.558406 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" 
podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="northd" containerID="cri-o://0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9" gracePeriod=30 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.630480 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" containerID="cri-o://d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e" gracePeriod=30 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.822233 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/2.log" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.822757 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/1.log" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.822813 4756 generic.go:334] "Generic (PLEG): container finished" podID="fb26e42b-df49-418d-8be1-8a0704cd5dfd" containerID="dac33531a016040a5246f6e69fadedf2c460a4d592313c2e4c01a9e8c3dca800" exitCode=2 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.822887 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcqf4" event={"ID":"fb26e42b-df49-418d-8be1-8a0704cd5dfd","Type":"ContainerDied","Data":"dac33531a016040a5246f6e69fadedf2c460a4d592313c2e4c01a9e8c3dca800"} Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.822948 4756 scope.go:117] "RemoveContainer" containerID="51fbf3a68f2e5d6a355b8181868e9ea812ba19b0f1b42ce32ec95e2126451436" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.823715 4756 scope.go:117] "RemoveContainer" containerID="dac33531a016040a5246f6e69fadedf2c460a4d592313c2e4c01a9e8c3dca800" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.826307 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovnkube-controller/3.log" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.828489 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovn-acl-logging/0.log" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.828905 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovn-controller/0.log" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829293 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e" exitCode=0 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829319 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6" exitCode=0 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829328 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1" exitCode=0 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829336 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" 
containerID="8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c" exitCode=0 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829343 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af" exitCode=143 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829351 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191" exitCode=143 Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829375 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e"} Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829410 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6"} Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829453 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1"} Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829462 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c"} Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829473 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af"} Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.829483 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191"} Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.929794 4756 scope.go:117] "RemoveContainer" containerID="678012ecd6a1f851dd87783e9c2fc348817b41557fcb46a8cc4a8acc313bf09c" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.986434 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovn-acl-logging/0.log" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.987122 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovn-controller/0.log" Dec 05 16:28:06 crc kubenswrapper[4756]: I1205 16:28:06.988634 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.051778 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2qh8s"] Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052075 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052102 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052112 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="northd" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052120 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="northd" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052130 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="sbdb" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052137 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="sbdb" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052150 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052158 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052166 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kube-rbac-proxy-node" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052172 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kube-rbac-proxy-node" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052183 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="nbdb" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052192 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="nbdb" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052201 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovn-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052208 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovn-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052215 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052220 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052228 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" 
containerName="ovn-acl-logging" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052234 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovn-acl-logging" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052243 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kubecfg-setup" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052249 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kubecfg-setup" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052256 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052262 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052270 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052277 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052290 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daff7ba-fe02-4b65-8db3-26fb89ff1c25" containerName="registry" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052296 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daff7ba-fe02-4b65-8db3-26fb89ff1c25" containerName="registry" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052448 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052466 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="nbdb" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052476 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052483 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052491 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052498 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="northd" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052507 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052514 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovn-acl-logging" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052520 4756 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="sbdb" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052528 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="kube-rbac-proxy-node" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052536 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovn-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052544 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9daff7ba-fe02-4b65-8db3-26fb89ff1c25" containerName="registry" Dec 05 16:28:07 crc kubenswrapper[4756]: E1205 16:28:07.052641 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052649 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.052746 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerName="ovnkube-controller" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.054555 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.110965 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-systemd-units\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111071 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l5kc\" (UniqueName: \"kubernetes.io/projected/b75aef14-f06d-44ec-8a2d-28d9661b1001-kube-api-access-6l5kc\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111114 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111141 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-node-log\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111202 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-node-log" (OuterVolumeSpecName: "node-log") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111252 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-var-lib-openvswitch\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111335 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-ovn-kubernetes\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111371 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111395 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-config\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111474 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111518 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-slash\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111585 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-slash" (OuterVolumeSpecName: "host-slash") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111650 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-netd\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111677 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-bin\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111728 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111773 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111811 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111834 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111869 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-ovn\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111894 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-openvswitch\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111895 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111912 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111931 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111946 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-netns\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.111988 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-systemd\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112016 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-env-overrides\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112029 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112080 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-script-lib\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112615 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-log-socket\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112716 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-etc-openvswitch\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112739 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-kubelet\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112545 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112665 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-log-socket" (OuterVolumeSpecName: "log-socket") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112696 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112820 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovn-node-metrics-cert\") pod \"b75aef14-f06d-44ec-8a2d-28d9661b1001\" (UID: \"b75aef14-f06d-44ec-8a2d-28d9661b1001\") " Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112777 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.112809 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113268 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-etc-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113325 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-run-netns\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113474 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-ovn\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113543 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-var-lib-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113597 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-kubelet\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113645 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-slash\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113675 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gdk\" (UniqueName: \"kubernetes.io/projected/43e079d3-5a47-41ff-951f-469f1b78e79f-kube-api-access-v2gdk\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113846 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113932 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-log-socket\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.113974 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-ovnkube-config\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114022 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-ovnkube-script-lib\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114079 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-cni-bin\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114107 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114127 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-cni-netd\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114151 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-systemd\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114181 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114215 
4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e079d3-5a47-41ff-951f-469f1b78e79f-ovn-node-metrics-cert\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114237 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-node-log\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114255 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-systemd-units\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114282 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-env-overrides\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114581 4756 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114608 4756 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114621 4756 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114634 4756 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114648 4756 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114659 4756 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114670 4756 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114681 4756 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114691 4756 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114701 4756 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114712 4756 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114724 4756 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114737 4756 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114747 4756 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114759 4756 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114772 4756 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.114783 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.120038 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.120874 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b75aef14-f06d-44ec-8a2d-28d9661b1001-kube-api-access-6l5kc" (OuterVolumeSpecName: "kube-api-access-6l5kc") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "kube-api-access-6l5kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.134152 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b75aef14-f06d-44ec-8a2d-28d9661b1001" (UID: "b75aef14-f06d-44ec-8a2d-28d9661b1001"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215440 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-ovnkube-config\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215501 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-ovnkube-script-lib\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215527 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-cni-bin\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215583 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215608 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-systemd\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215632 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-cni-netd\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215660 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215685 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e079d3-5a47-41ff-951f-469f1b78e79f-ovn-node-metrics-cert\") pod \"ovnkube-node-2qh8s\" (UID: 
\"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215712 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-node-log\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215743 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-systemd-units\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215766 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-cni-netd\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215778 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-cni-bin\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215833 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215809 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-env-overrides\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215796 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-systemd\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215864 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-systemd-units\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.215835 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-node-log\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: 
I1205 16:28:07.215876 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-etc-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216044 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-etc-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216118 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-run-netns\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216154 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-run-netns\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216163 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-ovn\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216191 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-var-lib-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216212 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-kubelet\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216231 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-slash\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216251 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gdk\" (UniqueName: \"kubernetes.io/projected/43e079d3-5a47-41ff-951f-469f1b78e79f-kube-api-access-v2gdk\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216263 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-ovn\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216274 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216302 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-log-socket\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216326 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-var-lib-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216342 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75aef14-f06d-44ec-8a2d-28d9661b1001-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216356 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l5kc\" (UniqueName: \"kubernetes.io/projected/b75aef14-f06d-44ec-8a2d-28d9661b1001-kube-api-access-6l5kc\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216368 4756 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b75aef14-f06d-44ec-8a2d-28d9661b1001-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216301 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-kubelet\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216406 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-log-socket\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216452 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-run-openvswitch\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216470 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-slash\") pod 
\"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216661 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-env-overrides\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216727 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e079d3-5a47-41ff-951f-469f1b78e79f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-ovnkube-script-lib\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.216936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e079d3-5a47-41ff-951f-469f1b78e79f-ovnkube-config\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.220818 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e079d3-5a47-41ff-951f-469f1b78e79f-ovn-node-metrics-cert\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.237025 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gdk\" (UniqueName: \"kubernetes.io/projected/43e079d3-5a47-41ff-951f-469f1b78e79f-kube-api-access-v2gdk\") pod \"ovnkube-node-2qh8s\" (UID: \"43e079d3-5a47-41ff-951f-469f1b78e79f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.380342 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:07 crc kubenswrapper[4756]: W1205 16:28:07.413011 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e079d3_5a47_41ff_951f_469f1b78e79f.slice/crio-bb0f1a30c5a874ab065e666215ced96535ec6eb2856029689af4958ea6dd9f29 WatchSource:0}: Error finding container bb0f1a30c5a874ab065e666215ced96535ec6eb2856029689af4958ea6dd9f29: Status 404 returned error can't find the container with id bb0f1a30c5a874ab065e666215ced96535ec6eb2856029689af4958ea6dd9f29 Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.843228 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovn-acl-logging/0.log" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.844745 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz5rs_b75aef14-f06d-44ec-8a2d-28d9661b1001/ovn-controller/0.log" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.845222 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36" exitCode=0 Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.845257 4756 generic.go:334] "Generic (PLEG): container finished" podID="b75aef14-f06d-44ec-8a2d-28d9661b1001" containerID="0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9" exitCode=0 Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.845327 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36"} Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.845380 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9"} Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.845393 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" event={"ID":"b75aef14-f06d-44ec-8a2d-28d9661b1001","Type":"ContainerDied","Data":"a1ecf2925c03484c8de154e4d997146fbca06b5be684279eeb3d456cc0c05cfc"} Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.845466 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pz5rs" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.845494 4756 scope.go:117] "RemoveContainer" containerID="d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.847616 4756 generic.go:334] "Generic (PLEG): container finished" podID="43e079d3-5a47-41ff-951f-469f1b78e79f" containerID="6d03fb6e5ce54eaf9a6c3a2dcb4ab0a1e2c3b21a95c31d1558f883b6928220f4" exitCode=0 Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.847675 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerDied","Data":"6d03fb6e5ce54eaf9a6c3a2dcb4ab0a1e2c3b21a95c31d1558f883b6928220f4"} Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.847692 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"bb0f1a30c5a874ab065e666215ced96535ec6eb2856029689af4958ea6dd9f29"} Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.851962 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcqf4_fb26e42b-df49-418d-8be1-8a0704cd5dfd/kube-multus/2.log" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.852048 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcqf4" event={"ID":"fb26e42b-df49-418d-8be1-8a0704cd5dfd","Type":"ContainerStarted","Data":"ec05c2a5b8432b8ebea39290188a4c01bd416d55044fc1d566cbd7962a55fd6c"} Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.872908 4756 scope.go:117] "RemoveContainer" containerID="612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.915368 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pz5rs"] Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.918331 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pz5rs"] Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.934166 4756 scope.go:117] "RemoveContainer" containerID="c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.962218 4756 scope.go:117] "RemoveContainer" containerID="0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.976872 4756 scope.go:117] "RemoveContainer" containerID="8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1" Dec 05 16:28:07 crc kubenswrapper[4756]: I1205 16:28:07.992440 4756 scope.go:117] "RemoveContainer" containerID="8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.010679 4756 scope.go:117] "RemoveContainer" containerID="24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.039969 4756 scope.go:117] "RemoveContainer" containerID="bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.061586 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b75aef14-f06d-44ec-8a2d-28d9661b1001" path="/var/lib/kubelet/pods/b75aef14-f06d-44ec-8a2d-28d9661b1001/volumes" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 
16:28:08.078877 4756 scope.go:117] "RemoveContainer" containerID="d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.097195 4756 scope.go:117] "RemoveContainer" containerID="d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.098113 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e\": container with ID starting with d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e not found: ID does not exist" containerID="d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.098173 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e"} err="failed to get container status \"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e\": rpc error: code = NotFound desc = could not find container \"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e\": container with ID starting with d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.098206 4756 scope.go:117] "RemoveContainer" containerID="612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.098819 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\": container with ID starting with 612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36 not found: ID does not exist" containerID="612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.098898 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36"} err="failed to get container status \"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\": rpc error: code = NotFound desc = could not find container \"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\": container with ID starting with 612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.098937 4756 scope.go:117] "RemoveContainer" containerID="c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.099345 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\": container with ID starting with c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6 not found: ID does not exist" containerID="c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.099388 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6"} err="failed to get container status 
\"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\": rpc error: code = NotFound desc = could not find container \"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\": container with ID starting with c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.099430 4756 scope.go:117] "RemoveContainer" containerID="0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.099748 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\": container with ID starting with 0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9 not found: ID does not exist" containerID="0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.099780 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9"} err="failed to get container status \"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\": rpc error: code = NotFound desc = could not find container \"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\": container with ID starting with 0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.099799 4756 scope.go:117] "RemoveContainer" containerID="8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.100180 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\": container with ID starting with 8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1 not found: ID does not exist" containerID="8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.100220 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1"} err="failed to get container status \"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\": rpc error: code = NotFound desc = could not find container \"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\": container with ID starting with 8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.100240 4756 scope.go:117] "RemoveContainer" containerID="8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.100698 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\": container with ID starting with 8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c not found: ID does not exist" containerID="8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.100725 4756 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c"} err="failed to get container status \"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\": rpc error: code = NotFound desc = could not find container \"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\": container with ID starting with 8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.100742 4756 scope.go:117] "RemoveContainer" containerID="24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.101071 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\": container with ID starting with 24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af not found: ID does not exist" containerID="24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.101108 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af"} err="failed to get container status \"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\": rpc error: code = NotFound desc = could not find container \"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\": container with ID starting with 24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.101149 4756 scope.go:117] "RemoveContainer" containerID="bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.101563 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\": container with ID starting with bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191 not found: ID does not exist" containerID="bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.101595 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191"} err="failed to get container status \"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\": rpc error: code = NotFound desc = could not find container \"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\": container with ID starting with bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.101618 4756 scope.go:117] "RemoveContainer" containerID="d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a" Dec 05 16:28:08 crc kubenswrapper[4756]: E1205 16:28:08.101925 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\": container with ID starting with d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a not found: ID does not exist" 
containerID="d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.101950 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a"} err="failed to get container status \"d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\": rpc error: code = NotFound desc = could not find container \"d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\": container with ID starting with d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.101965 4756 scope.go:117] "RemoveContainer" containerID="d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.102255 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e"} err="failed to get container status \"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e\": rpc error: code = NotFound desc = could not find container \"d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e\": container with ID starting with d93234e29ff75d7e51fd9d09b43ab59a7f60fa2c432745aa3e0d9de700af402e not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.102296 4756 scope.go:117] "RemoveContainer" containerID="612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.102677 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36"} err="failed to get container status \"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\": rpc error: code = NotFound desc = could not find container \"612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36\": container with ID starting with 612644bb5b20f059e5645729dfe938564e2c277c36bc1cfee45236de2d9ecd36 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.102703 4756 scope.go:117] "RemoveContainer" containerID="c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.102979 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6"} err="failed to get container status \"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\": rpc error: code = NotFound desc = could not find container \"c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6\": container with ID starting with c9396fa3d9c6ab89c168cc6dd0ca4a4245afb2b270312c279d914ce2bec5d5c6 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.102999 4756 scope.go:117] "RemoveContainer" containerID="0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.103326 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9"} err="failed to get container status \"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\": rpc error: code = NotFound desc = could not find 
container \"0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9\": container with ID starting with 0d2026dcfc35a88bf159c33542cc2cb0394a8656c45ebbe5093521edc2cccdc9 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.103351 4756 scope.go:117] "RemoveContainer" containerID="8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.103679 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1"} err="failed to get container status \"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\": rpc error: code = NotFound desc = could not find container \"8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1\": container with ID starting with 8917f89bf3d8c9b90198ac6de19c6f973803bedd24e4d976ce75eda6a027cab1 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.103705 4756 scope.go:117] "RemoveContainer" containerID="8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.103967 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c"} err="failed to get container status \"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\": rpc error: code = NotFound desc = could not find container \"8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c\": container with ID starting with 8e451d33b78ac8140c78a5eba9bcd789dd021289604c6771340338facfacd23c not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.103990 4756 scope.go:117] "RemoveContainer" containerID="24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.104257 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af"} err="failed to get container status \"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\": rpc error: code = NotFound desc = could not find container \"24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af\": container with ID starting with 24de74305bc0ea4313ddf64095d1d0f3ec0e7febead030e2d35657ef86b1e2af not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.104282 4756 scope.go:117] "RemoveContainer" containerID="bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.104634 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191"} err="failed to get container status \"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\": rpc error: code = NotFound desc = could not find container \"bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191\": container with ID starting with bfbbb41282211dcb9b94364b54b66de862a09ddbe4c1bca5aa6cf4f2cc4d5191 not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.104666 4756 scope.go:117] "RemoveContainer" containerID="d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.104956 4756 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a"} err="failed to get container status \"d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\": rpc error: code = NotFound desc = could not find container \"d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a\": container with ID starting with d9f28d9e3a458f0cf0792deffd7459bb409b7977d1e605d9067671c72a63dc7a not found: ID does not exist" Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.882582 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"6cf58df908fc347ca8565369ea78800e2a69408c533c1c11f14172569a264efd"} Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.883113 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"8e72d15c5c03e893dcf320d515b3bc1d5e834e3baa90b128e7fd9167ecfddfaa"} Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.883128 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"d1e01dd6f3be2b3128ba3c90a298b182558fbceac13cccbb7b05699e57687d27"} Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.883139 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"c6dae702b67958a249fce9344177b250c40ca39a7cc77813e741c5c1f9ba5d4e"} Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.883149 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"f1f011f28def704a5c5c01fbb52cd683270cb9033e183bde2c787ffd45b719e1"} Dec 05 16:28:08 crc kubenswrapper[4756]: I1205 16:28:08.883161 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"8b06d81c65eac7fd6c8fa3a13cca9945bf7f4ca5d04cd3ba8763a7c6352248c5"} Dec 05 16:28:10 crc kubenswrapper[4756]: I1205 16:28:10.899405 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"fea9d3cf1756bd23945f7094c38c176493b1ef1f8050d8629135c6c1db38d8b3"} Dec 05 16:28:13 crc kubenswrapper[4756]: I1205 16:28:13.924982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" event={"ID":"43e079d3-5a47-41ff-951f-469f1b78e79f","Type":"ContainerStarted","Data":"d00226471ee500b13d655f7bd37b892b52987d8fc768fafca1f3e1ad33c33c60"} Dec 05 16:28:13 crc kubenswrapper[4756]: I1205 16:28:13.925960 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:13 crc kubenswrapper[4756]: I1205 16:28:13.926099 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:13 crc kubenswrapper[4756]: I1205 16:28:13.926199 4756 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:13 crc kubenswrapper[4756]: I1205 16:28:13.958949 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:13 crc kubenswrapper[4756]: I1205 16:28:13.960203 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:13 crc kubenswrapper[4756]: I1205 16:28:13.967134 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" podStartSLOduration=6.967115955 podStartE2EDuration="6.967115955s" podCreationTimestamp="2025-12-05 16:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:28:13.964727082 +0000 UTC m=+754.203400614" watchObservedRunningTime="2025-12-05 16:28:13.967115955 +0000 UTC m=+754.205789497" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.213183 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ch5kc"] Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.214461 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.216987 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.217783 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.218793 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.218866 4756 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-7bf44" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.317838 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg8pj\" (UniqueName: \"kubernetes.io/projected/aac6ba13-41b0-4db9-9dfd-9280afe331f0-kube-api-access-kg8pj\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.317948 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aac6ba13-41b0-4db9-9dfd-9280afe331f0-node-mnt\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.317992 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aac6ba13-41b0-4db9-9dfd-9280afe331f0-crc-storage\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.419221 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aac6ba13-41b0-4db9-9dfd-9280afe331f0-node-mnt\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " 
pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.419284 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aac6ba13-41b0-4db9-9dfd-9280afe331f0-crc-storage\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.419320 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg8pj\" (UniqueName: \"kubernetes.io/projected/aac6ba13-41b0-4db9-9dfd-9280afe331f0-kube-api-access-kg8pj\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.419753 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aac6ba13-41b0-4db9-9dfd-9280afe331f0-node-mnt\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.420734 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aac6ba13-41b0-4db9-9dfd-9280afe331f0-crc-storage\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.439500 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg8pj\" (UniqueName: \"kubernetes.io/projected/aac6ba13-41b0-4db9-9dfd-9280afe331f0-kube-api-access-kg8pj\") pod \"crc-storage-crc-ch5kc\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: I1205 16:28:14.534161 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: E1205 16:28:14.566401 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ch5kc_crc-storage_aac6ba13-41b0-4db9-9dfd-9280afe331f0_0(5f1a643c4a74ea399f0ae9b25a02265e8a59029f28b618a3a7657b9ed75d8a4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 16:28:14 crc kubenswrapper[4756]: E1205 16:28:14.566512 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ch5kc_crc-storage_aac6ba13-41b0-4db9-9dfd-9280afe331f0_0(5f1a643c4a74ea399f0ae9b25a02265e8a59029f28b618a3a7657b9ed75d8a4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: E1205 16:28:14.566533 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ch5kc_crc-storage_aac6ba13-41b0-4db9-9dfd-9280afe331f0_0(5f1a643c4a74ea399f0ae9b25a02265e8a59029f28b618a3a7657b9ed75d8a4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:14 crc kubenswrapper[4756]: E1205 16:28:14.566583 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ch5kc_crc-storage(aac6ba13-41b0-4db9-9dfd-9280afe331f0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ch5kc_crc-storage(aac6ba13-41b0-4db9-9dfd-9280afe331f0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ch5kc_crc-storage_aac6ba13-41b0-4db9-9dfd-9280afe331f0_0(5f1a643c4a74ea399f0ae9b25a02265e8a59029f28b618a3a7657b9ed75d8a4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ch5kc" podUID="aac6ba13-41b0-4db9-9dfd-9280afe331f0" Dec 05 16:28:15 crc kubenswrapper[4756]: I1205 16:28:15.631979 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ch5kc"] Dec 05 16:28:15 crc kubenswrapper[4756]: I1205 16:28:15.632466 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:15 crc kubenswrapper[4756]: I1205 16:28:15.633022 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:15 crc kubenswrapper[4756]: E1205 16:28:15.654282 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ch5kc_crc-storage_aac6ba13-41b0-4db9-9dfd-9280afe331f0_0(d761a8a3dfd0657e5289beb63c6f498a4b5ed217f251a63a8611cd4f0a724622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 16:28:15 crc kubenswrapper[4756]: E1205 16:28:15.654344 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ch5kc_crc-storage_aac6ba13-41b0-4db9-9dfd-9280afe331f0_0(d761a8a3dfd0657e5289beb63c6f498a4b5ed217f251a63a8611cd4f0a724622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:15 crc kubenswrapper[4756]: E1205 16:28:15.654364 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ch5kc_crc-storage_aac6ba13-41b0-4db9-9dfd-9280afe331f0_0(d761a8a3dfd0657e5289beb63c6f498a4b5ed217f251a63a8611cd4f0a724622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:15 crc kubenswrapper[4756]: E1205 16:28:15.654411 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ch5kc_crc-storage(aac6ba13-41b0-4db9-9dfd-9280afe331f0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ch5kc_crc-storage(aac6ba13-41b0-4db9-9dfd-9280afe331f0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ch5kc_crc-storage_aac6ba13-41b0-4db9-9dfd-9280afe331f0_0(d761a8a3dfd0657e5289beb63c6f498a4b5ed217f251a63a8611cd4f0a724622): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ch5kc" podUID="aac6ba13-41b0-4db9-9dfd-9280afe331f0" Dec 05 16:28:19 crc kubenswrapper[4756]: I1205 16:28:19.501258 4756 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 16:28:27 crc kubenswrapper[4756]: I1205 16:28:27.049394 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:27 crc kubenswrapper[4756]: I1205 16:28:27.049953 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:27 crc kubenswrapper[4756]: I1205 16:28:27.288178 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ch5kc"] Dec 05 16:28:27 crc kubenswrapper[4756]: W1205 16:28:27.300593 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaac6ba13_41b0_4db9_9dfd_9280afe331f0.slice/crio-ce8d768b095173c0b0a9f63d68162ca60c3064bce94bffc201e752a09393cb3f WatchSource:0}: Error finding container ce8d768b095173c0b0a9f63d68162ca60c3064bce94bffc201e752a09393cb3f: Status 404 returned error can't find the container with id ce8d768b095173c0b0a9f63d68162ca60c3064bce94bffc201e752a09393cb3f Dec 05 16:28:27 crc kubenswrapper[4756]: I1205 16:28:27.304496 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:28:28 crc kubenswrapper[4756]: I1205 16:28:28.008718 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ch5kc" event={"ID":"aac6ba13-41b0-4db9-9dfd-9280afe331f0","Type":"ContainerStarted","Data":"ce8d768b095173c0b0a9f63d68162ca60c3064bce94bffc201e752a09393cb3f"} Dec 05 16:28:30 crc kubenswrapper[4756]: I1205 16:28:30.020538 4756 generic.go:334] "Generic (PLEG): container finished" podID="aac6ba13-41b0-4db9-9dfd-9280afe331f0" containerID="00e826abeb1e4814f50006f7d5860ca14bfe871c096fbe9022b06a2651165e25" exitCode=0 Dec 05 16:28:30 crc kubenswrapper[4756]: I1205 16:28:30.020635 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ch5kc" event={"ID":"aac6ba13-41b0-4db9-9dfd-9280afe331f0","Type":"ContainerDied","Data":"00e826abeb1e4814f50006f7d5860ca14bfe871c096fbe9022b06a2651165e25"} Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.252359 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.359470 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg8pj\" (UniqueName: \"kubernetes.io/projected/aac6ba13-41b0-4db9-9dfd-9280afe331f0-kube-api-access-kg8pj\") pod \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.359550 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aac6ba13-41b0-4db9-9dfd-9280afe331f0-node-mnt\") pod \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.359586 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aac6ba13-41b0-4db9-9dfd-9280afe331f0-crc-storage\") pod \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\" (UID: \"aac6ba13-41b0-4db9-9dfd-9280afe331f0\") " Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.359683 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aac6ba13-41b0-4db9-9dfd-9280afe331f0-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "aac6ba13-41b0-4db9-9dfd-9280afe331f0" (UID: "aac6ba13-41b0-4db9-9dfd-9280afe331f0"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.359918 4756 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aac6ba13-41b0-4db9-9dfd-9280afe331f0-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.363924 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac6ba13-41b0-4db9-9dfd-9280afe331f0-kube-api-access-kg8pj" (OuterVolumeSpecName: "kube-api-access-kg8pj") pod "aac6ba13-41b0-4db9-9dfd-9280afe331f0" (UID: "aac6ba13-41b0-4db9-9dfd-9280afe331f0"). InnerVolumeSpecName "kube-api-access-kg8pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.372141 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac6ba13-41b0-4db9-9dfd-9280afe331f0-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "aac6ba13-41b0-4db9-9dfd-9280afe331f0" (UID: "aac6ba13-41b0-4db9-9dfd-9280afe331f0"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.462125 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg8pj\" (UniqueName: \"kubernetes.io/projected/aac6ba13-41b0-4db9-9dfd-9280afe331f0-kube-api-access-kg8pj\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:31 crc kubenswrapper[4756]: I1205 16:28:31.462188 4756 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aac6ba13-41b0-4db9-9dfd-9280afe331f0-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:32 crc kubenswrapper[4756]: I1205 16:28:32.035282 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ch5kc" event={"ID":"aac6ba13-41b0-4db9-9dfd-9280afe331f0","Type":"ContainerDied","Data":"ce8d768b095173c0b0a9f63d68162ca60c3064bce94bffc201e752a09393cb3f"} Dec 05 16:28:32 crc kubenswrapper[4756]: I1205 16:28:32.035532 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce8d768b095173c0b0a9f63d68162ca60c3064bce94bffc201e752a09393cb3f" Dec 05 16:28:32 crc kubenswrapper[4756]: I1205 16:28:32.035383 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ch5kc" Dec 05 16:28:37 crc kubenswrapper[4756]: I1205 16:28:37.405493 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2qh8s" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.502458 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5"] Dec 05 16:28:39 crc kubenswrapper[4756]: E1205 16:28:39.502707 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac6ba13-41b0-4db9-9dfd-9280afe331f0" containerName="storage" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.502722 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac6ba13-41b0-4db9-9dfd-9280afe331f0" containerName="storage" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.502843 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac6ba13-41b0-4db9-9dfd-9280afe331f0" containerName="storage" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.503727 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.511775 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.514528 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5"] Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.565919 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbbff\" (UniqueName: \"kubernetes.io/projected/68caf581-c3d0-49bb-9fff-c574269dbcf6-kube-api-access-sbbff\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.565975 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.566027 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.667508 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbbff\" (UniqueName: \"kubernetes.io/projected/68caf581-c3d0-49bb-9fff-c574269dbcf6-kube-api-access-sbbff\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.668229 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.668375 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.669012 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.669156 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.691022 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbbff\" (UniqueName: \"kubernetes.io/projected/68caf581-c3d0-49bb-9fff-c574269dbcf6-kube-api-access-sbbff\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:39 crc kubenswrapper[4756]: I1205 16:28:39.831016 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:40 crc kubenswrapper[4756]: I1205 16:28:40.225954 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5"] Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.079625 4756 generic.go:334] "Generic (PLEG): container finished" podID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerID="9307cdd899f1fa12d0acd5977bf5320dd8ded187f6f5bee8f11b788a026dc551" exitCode=0 Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.079720 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" event={"ID":"68caf581-c3d0-49bb-9fff-c574269dbcf6","Type":"ContainerDied","Data":"9307cdd899f1fa12d0acd5977bf5320dd8ded187f6f5bee8f11b788a026dc551"} Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.079759 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" event={"ID":"68caf581-c3d0-49bb-9fff-c574269dbcf6","Type":"ContainerStarted","Data":"77cbd155e9500a2568836761e29285733a45b66aac920f119b1c995a744f33ed"} Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.863154 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5c29z"] Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.868001 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.868882 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5c29z"] Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.898436 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-catalog-content\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.898503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-utilities\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:41 crc kubenswrapper[4756]: I1205 16:28:41.898542 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsgt2\" (UniqueName: \"kubernetes.io/projected/3964b617-ce29-4325-b97a-0f484096cbbb-kube-api-access-dsgt2\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:42 crc kubenswrapper[4756]: I1205 16:28:42.000216 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-catalog-content\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:42 crc kubenswrapper[4756]: I1205 16:28:42.000672 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-utilities\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:42 crc kubenswrapper[4756]: I1205 16:28:42.000813 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgt2\" (UniqueName: \"kubernetes.io/projected/3964b617-ce29-4325-b97a-0f484096cbbb-kube-api-access-dsgt2\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:42 crc kubenswrapper[4756]: I1205 16:28:42.000869 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-catalog-content\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:42 crc kubenswrapper[4756]: I1205 16:28:42.001400 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-utilities\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:42 crc kubenswrapper[4756]: I1205 16:28:42.029565 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dsgt2\" (UniqueName: \"kubernetes.io/projected/3964b617-ce29-4325-b97a-0f484096cbbb-kube-api-access-dsgt2\") pod \"redhat-operators-5c29z\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:42 crc kubenswrapper[4756]: I1205 16:28:42.184716 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:42 crc kubenswrapper[4756]: I1205 16:28:42.445618 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5c29z"] Dec 05 16:28:43 crc kubenswrapper[4756]: I1205 16:28:43.092217 4756 generic.go:334] "Generic (PLEG): container finished" podID="3964b617-ce29-4325-b97a-0f484096cbbb" containerID="d7e2969fa082c50c1169bb94cc5010785ee8479ebec143b314386ecae831b300" exitCode=0 Dec 05 16:28:43 crc kubenswrapper[4756]: I1205 16:28:43.093633 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c29z" event={"ID":"3964b617-ce29-4325-b97a-0f484096cbbb","Type":"ContainerDied","Data":"d7e2969fa082c50c1169bb94cc5010785ee8479ebec143b314386ecae831b300"} Dec 05 16:28:43 crc kubenswrapper[4756]: I1205 16:28:43.093796 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c29z" event={"ID":"3964b617-ce29-4325-b97a-0f484096cbbb","Type":"ContainerStarted","Data":"2fa3c382ce9f2a88f8f550523f77e928c90c082df966d97059a67572abddccd2"} Dec 05 16:28:43 crc kubenswrapper[4756]: I1205 16:28:43.097095 4756 generic.go:334] "Generic (PLEG): container finished" podID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerID="3baf329551623de2028b7c1170b94d147abf1b1137be181c9f771990b189be06" exitCode=0 Dec 05 16:28:43 crc kubenswrapper[4756]: I1205 16:28:43.097132 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" event={"ID":"68caf581-c3d0-49bb-9fff-c574269dbcf6","Type":"ContainerDied","Data":"3baf329551623de2028b7c1170b94d147abf1b1137be181c9f771990b189be06"} Dec 05 16:28:44 crc kubenswrapper[4756]: I1205 16:28:44.105814 4756 generic.go:334] "Generic (PLEG): container finished" podID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerID="743dd34eacba60361954c3004c50f9c6d3eb514aa8b0ec39c375d1033e4e7495" exitCode=0 Dec 05 16:28:44 crc kubenswrapper[4756]: I1205 16:28:44.105868 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" event={"ID":"68caf581-c3d0-49bb-9fff-c574269dbcf6","Type":"ContainerDied","Data":"743dd34eacba60361954c3004c50f9c6d3eb514aa8b0ec39c375d1033e4e7495"} Dec 05 16:28:44 crc kubenswrapper[4756]: I1205 16:28:44.108819 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c29z" event={"ID":"3964b617-ce29-4325-b97a-0f484096cbbb","Type":"ContainerStarted","Data":"92a1202bc90ad02dc8edf00203c9f29a32214423e59ca6491dbd8bf327723f7e"} Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.116275 4756 generic.go:334] "Generic (PLEG): container finished" podID="3964b617-ce29-4325-b97a-0f484096cbbb" containerID="92a1202bc90ad02dc8edf00203c9f29a32214423e59ca6491dbd8bf327723f7e" exitCode=0 Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.116370 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c29z" 
event={"ID":"3964b617-ce29-4325-b97a-0f484096cbbb","Type":"ContainerDied","Data":"92a1202bc90ad02dc8edf00203c9f29a32214423e59ca6491dbd8bf327723f7e"} Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.449137 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.552917 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbbff\" (UniqueName: \"kubernetes.io/projected/68caf581-c3d0-49bb-9fff-c574269dbcf6-kube-api-access-sbbff\") pod \"68caf581-c3d0-49bb-9fff-c574269dbcf6\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.552981 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-util\") pod \"68caf581-c3d0-49bb-9fff-c574269dbcf6\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.553018 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-bundle\") pod \"68caf581-c3d0-49bb-9fff-c574269dbcf6\" (UID: \"68caf581-c3d0-49bb-9fff-c574269dbcf6\") " Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.553705 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-bundle" (OuterVolumeSpecName: "bundle") pod "68caf581-c3d0-49bb-9fff-c574269dbcf6" (UID: "68caf581-c3d0-49bb-9fff-c574269dbcf6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.558893 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68caf581-c3d0-49bb-9fff-c574269dbcf6-kube-api-access-sbbff" (OuterVolumeSpecName: "kube-api-access-sbbff") pod "68caf581-c3d0-49bb-9fff-c574269dbcf6" (UID: "68caf581-c3d0-49bb-9fff-c574269dbcf6"). InnerVolumeSpecName "kube-api-access-sbbff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.569847 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-util" (OuterVolumeSpecName: "util") pod "68caf581-c3d0-49bb-9fff-c574269dbcf6" (UID: "68caf581-c3d0-49bb-9fff-c574269dbcf6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.654312 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.654347 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbbff\" (UniqueName: \"kubernetes.io/projected/68caf581-c3d0-49bb-9fff-c574269dbcf6-kube-api-access-sbbff\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:45 crc kubenswrapper[4756]: I1205 16:28:45.654363 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68caf581-c3d0-49bb-9fff-c574269dbcf6-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:46 crc kubenswrapper[4756]: I1205 16:28:46.123322 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" event={"ID":"68caf581-c3d0-49bb-9fff-c574269dbcf6","Type":"ContainerDied","Data":"77cbd155e9500a2568836761e29285733a45b66aac920f119b1c995a744f33ed"} Dec 05 16:28:46 crc kubenswrapper[4756]: I1205 16:28:46.123380 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77cbd155e9500a2568836761e29285733a45b66aac920f119b1c995a744f33ed" Dec 05 16:28:46 crc kubenswrapper[4756]: I1205 16:28:46.123387 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5" Dec 05 16:28:46 crc kubenswrapper[4756]: I1205 16:28:46.127166 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c29z" event={"ID":"3964b617-ce29-4325-b97a-0f484096cbbb","Type":"ContainerStarted","Data":"778220d8f3ec45fc57368cc8abed250de61a11099cc0302afa3153197287fd64"} Dec 05 16:28:46 crc kubenswrapper[4756]: I1205 16:28:46.144791 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5c29z" podStartSLOduration=2.711454483 podStartE2EDuration="5.144774514s" podCreationTimestamp="2025-12-05 16:28:41 +0000 UTC" firstStartedPulling="2025-12-05 16:28:43.094320001 +0000 UTC m=+783.332993503" lastFinishedPulling="2025-12-05 16:28:45.527640022 +0000 UTC m=+785.766313534" observedRunningTime="2025-12-05 16:28:46.143602694 +0000 UTC m=+786.382276216" watchObservedRunningTime="2025-12-05 16:28:46.144774514 +0000 UTC m=+786.383448026" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.167231 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz"] Dec 05 16:28:49 crc kubenswrapper[4756]: E1205 16:28:49.167817 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerName="extract" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.167835 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerName="extract" Dec 05 16:28:49 crc kubenswrapper[4756]: E1205 16:28:49.167858 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerName="pull" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.167865 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerName="pull" Dec 05 16:28:49 crc 
kubenswrapper[4756]: E1205 16:28:49.167875 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerName="util" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.167885 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerName="util" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.168010 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="68caf581-c3d0-49bb-9fff-c574269dbcf6" containerName="extract" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.168521 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.174120 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2s6jv" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.174365 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.178026 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.182863 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz"] Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.196875 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t72k\" (UniqueName: \"kubernetes.io/projected/2c336167-a213-4d0d-a396-512a7a80deda-kube-api-access-7t72k\") pod \"nmstate-operator-5b5b58f5c8-jpztz\" (UID: \"2c336167-a213-4d0d-a396-512a7a80deda\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.297558 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t72k\" (UniqueName: \"kubernetes.io/projected/2c336167-a213-4d0d-a396-512a7a80deda-kube-api-access-7t72k\") pod \"nmstate-operator-5b5b58f5c8-jpztz\" (UID: \"2c336167-a213-4d0d-a396-512a7a80deda\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.314478 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t72k\" (UniqueName: \"kubernetes.io/projected/2c336167-a213-4d0d-a396-512a7a80deda-kube-api-access-7t72k\") pod \"nmstate-operator-5b5b58f5c8-jpztz\" (UID: \"2c336167-a213-4d0d-a396-512a7a80deda\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.485906 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz" Dec 05 16:28:49 crc kubenswrapper[4756]: I1205 16:28:49.719483 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz"] Dec 05 16:28:49 crc kubenswrapper[4756]: W1205 16:28:49.746761 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c336167_a213_4d0d_a396_512a7a80deda.slice/crio-e2db18cf0bb61c136cd0a061f0ce9f312d437b4c625832533a999b1c5a1cb213 WatchSource:0}: Error finding container e2db18cf0bb61c136cd0a061f0ce9f312d437b4c625832533a999b1c5a1cb213: Status 404 returned error can't find the container with id e2db18cf0bb61c136cd0a061f0ce9f312d437b4c625832533a999b1c5a1cb213 Dec 05 16:28:50 crc kubenswrapper[4756]: I1205 16:28:50.155003 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz" event={"ID":"2c336167-a213-4d0d-a396-512a7a80deda","Type":"ContainerStarted","Data":"e2db18cf0bb61c136cd0a061f0ce9f312d437b4c625832533a999b1c5a1cb213"} Dec 05 16:28:52 crc kubenswrapper[4756]: I1205 16:28:52.190632 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:52 crc kubenswrapper[4756]: I1205 16:28:52.190692 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:52 crc kubenswrapper[4756]: I1205 16:28:52.263499 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:52 crc kubenswrapper[4756]: I1205 16:28:52.296060 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:28:52 crc kubenswrapper[4756]: I1205 16:28:52.296129 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:28:53 crc kubenswrapper[4756]: I1205 16:28:53.209866 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:54 crc kubenswrapper[4756]: I1205 16:28:54.176468 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz" event={"ID":"2c336167-a213-4d0d-a396-512a7a80deda","Type":"ContainerStarted","Data":"04bff355929daeb812e9ae441d81f90185edafd190ff2b986b8c6a0482989949"} Dec 05 16:28:54 crc kubenswrapper[4756]: I1205 16:28:54.193612 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-jpztz" podStartSLOduration=1.6800863449999999 podStartE2EDuration="5.19359152s" podCreationTimestamp="2025-12-05 16:28:49 +0000 UTC" firstStartedPulling="2025-12-05 16:28:49.749498976 +0000 UTC m=+789.988172488" lastFinishedPulling="2025-12-05 16:28:53.263004151 +0000 UTC m=+793.501677663" observedRunningTime="2025-12-05 16:28:54.19203058 +0000 UTC m=+794.430704112" watchObservedRunningTime="2025-12-05 
16:28:54.19359152 +0000 UTC m=+794.432265032" Dec 05 16:28:54 crc kubenswrapper[4756]: I1205 16:28:54.844016 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5c29z"] Dec 05 16:28:55 crc kubenswrapper[4756]: I1205 16:28:55.182496 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5c29z" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" containerName="registry-server" containerID="cri-o://778220d8f3ec45fc57368cc8abed250de61a11099cc0302afa3153197287fd64" gracePeriod=2 Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.189119 4756 generic.go:334] "Generic (PLEG): container finished" podID="3964b617-ce29-4325-b97a-0f484096cbbb" containerID="778220d8f3ec45fc57368cc8abed250de61a11099cc0302afa3153197287fd64" exitCode=0 Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.189169 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c29z" event={"ID":"3964b617-ce29-4325-b97a-0f484096cbbb","Type":"ContainerDied","Data":"778220d8f3ec45fc57368cc8abed250de61a11099cc0302afa3153197287fd64"} Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.635441 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.792198 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsgt2\" (UniqueName: \"kubernetes.io/projected/3964b617-ce29-4325-b97a-0f484096cbbb-kube-api-access-dsgt2\") pod \"3964b617-ce29-4325-b97a-0f484096cbbb\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.792603 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-catalog-content\") pod \"3964b617-ce29-4325-b97a-0f484096cbbb\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.792694 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-utilities\") pod \"3964b617-ce29-4325-b97a-0f484096cbbb\" (UID: \"3964b617-ce29-4325-b97a-0f484096cbbb\") " Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.794701 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-utilities" (OuterVolumeSpecName: "utilities") pod "3964b617-ce29-4325-b97a-0f484096cbbb" (UID: "3964b617-ce29-4325-b97a-0f484096cbbb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.800933 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3964b617-ce29-4325-b97a-0f484096cbbb-kube-api-access-dsgt2" (OuterVolumeSpecName: "kube-api-access-dsgt2") pod "3964b617-ce29-4325-b97a-0f484096cbbb" (UID: "3964b617-ce29-4325-b97a-0f484096cbbb"). InnerVolumeSpecName "kube-api-access-dsgt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.894621 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.894660 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsgt2\" (UniqueName: \"kubernetes.io/projected/3964b617-ce29-4325-b97a-0f484096cbbb-kube-api-access-dsgt2\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.900308 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3964b617-ce29-4325-b97a-0f484096cbbb" (UID: "3964b617-ce29-4325-b97a-0f484096cbbb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:28:56 crc kubenswrapper[4756]: I1205 16:28:56.995241 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3964b617-ce29-4325-b97a-0f484096cbbb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:28:57 crc kubenswrapper[4756]: I1205 16:28:57.196301 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c29z" event={"ID":"3964b617-ce29-4325-b97a-0f484096cbbb","Type":"ContainerDied","Data":"2fa3c382ce9f2a88f8f550523f77e928c90c082df966d97059a67572abddccd2"} Dec 05 16:28:57 crc kubenswrapper[4756]: I1205 16:28:57.196364 4756 scope.go:117] "RemoveContainer" containerID="778220d8f3ec45fc57368cc8abed250de61a11099cc0302afa3153197287fd64" Dec 05 16:28:57 crc kubenswrapper[4756]: I1205 16:28:57.196379 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5c29z" Dec 05 16:28:57 crc kubenswrapper[4756]: I1205 16:28:57.211522 4756 scope.go:117] "RemoveContainer" containerID="92a1202bc90ad02dc8edf00203c9f29a32214423e59ca6491dbd8bf327723f7e" Dec 05 16:28:57 crc kubenswrapper[4756]: I1205 16:28:57.223608 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5c29z"] Dec 05 16:28:57 crc kubenswrapper[4756]: I1205 16:28:57.228251 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5c29z"] Dec 05 16:28:57 crc kubenswrapper[4756]: I1205 16:28:57.249744 4756 scope.go:117] "RemoveContainer" containerID="d7e2969fa082c50c1169bb94cc5010785ee8479ebec143b314386ecae831b300" Dec 05 16:28:58 crc kubenswrapper[4756]: I1205 16:28:58.057841 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" path="/var/lib/kubelet/pods/3964b617-ce29-4325-b97a-0f484096cbbb/volumes" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.415105 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86"] Dec 05 16:28:59 crc kubenswrapper[4756]: E1205 16:28:59.415637 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" containerName="extract-content" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.415654 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" containerName="extract-content" Dec 05 16:28:59 crc kubenswrapper[4756]: E1205 16:28:59.415668 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" containerName="registry-server" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.415676 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" containerName="registry-server" Dec 05 16:28:59 crc kubenswrapper[4756]: E1205 16:28:59.415687 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" containerName="extract-utilities" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.415695 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" containerName="extract-utilities" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.415814 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3964b617-ce29-4325-b97a-0f484096cbbb" containerName="registry-server" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.416438 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.418955 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-jzjpj" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.429475 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86"] Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.469759 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4"] Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.481102 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.487395 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.494427 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sfl86"] Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.495552 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.499579 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4"] Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.526065 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntrh9\" (UniqueName: \"kubernetes.io/projected/92193823-fb9a-4291-a9cd-5d25f4a566cc-kube-api-access-ntrh9\") pod \"nmstate-metrics-7f946cbc9-jbg86\" (UID: \"92193823-fb9a-4291-a9cd-5d25f4a566cc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.578584 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr"] Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.579730 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.617017 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.617205 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.617597 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9ngt6" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.627041 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-dbus-socket\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.627113 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntrh9\" (UniqueName: \"kubernetes.io/projected/92193823-fb9a-4291-a9cd-5d25f4a566cc-kube-api-access-ntrh9\") pod \"nmstate-metrics-7f946cbc9-jbg86\" (UID: \"92193823-fb9a-4291-a9cd-5d25f4a566cc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.627251 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-nmstate-lock\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.627442 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzv7\" (UniqueName: 
\"kubernetes.io/projected/81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a-kube-api-access-qrzv7\") pod \"nmstate-webhook-5f6d4c5ccb-z2lv4\" (UID: \"81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.627651 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-ovs-socket\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.627682 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89844\" (UniqueName: \"kubernetes.io/projected/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-kube-api-access-89844\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.627708 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-z2lv4\" (UID: \"81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.643312 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr"] Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.670460 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntrh9\" (UniqueName: \"kubernetes.io/projected/92193823-fb9a-4291-a9cd-5d25f4a566cc-kube-api-access-ntrh9\") pod \"nmstate-metrics-7f946cbc9-jbg86\" (UID: \"92193823-fb9a-4291-a9cd-5d25f4a566cc\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.728758 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvf7w\" (UniqueName: \"kubernetes.io/projected/f2ec05fa-c742-4ba4-830e-571af72430c0-kube-api-access-jvf7w\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.728835 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-dbus-socket\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.728858 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f2ec05fa-c742-4ba4-830e-571af72430c0-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.728889 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f2ec05fa-c742-4ba4-830e-571af72430c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.728935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-nmstate-lock\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.728987 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-nmstate-lock\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.729025 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzv7\" (UniqueName: \"kubernetes.io/projected/81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a-kube-api-access-qrzv7\") pod \"nmstate-webhook-5f6d4c5ccb-z2lv4\" (UID: \"81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.729059 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-ovs-socket\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.729075 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89844\" (UniqueName: \"kubernetes.io/projected/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-kube-api-access-89844\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.729091 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-z2lv4\" (UID: \"81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.729201 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-dbus-socket\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.729194 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-ovs-socket\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.735981 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-z2lv4\" (UID: \"81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.755196 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.756225 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89844\" (UniqueName: \"kubernetes.io/projected/c9f8a111-e6d4-440a-a3d4-4f3010304ba9-kube-api-access-89844\") pod \"nmstate-handler-sfl86\" (UID: \"c9f8a111-e6d4-440a-a3d4-4f3010304ba9\") " pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.756934 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzv7\" (UniqueName: \"kubernetes.io/projected/81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a-kube-api-access-qrzv7\") pod \"nmstate-webhook-5f6d4c5ccb-z2lv4\" (UID: \"81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.802643 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.821686 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.831472 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvf7w\" (UniqueName: \"kubernetes.io/projected/f2ec05fa-c742-4ba4-830e-571af72430c0-kube-api-access-jvf7w\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.831568 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f2ec05fa-c742-4ba4-830e-571af72430c0-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.831629 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ec05fa-c742-4ba4-830e-571af72430c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: E1205 16:28:59.831775 4756 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 05 16:28:59 crc kubenswrapper[4756]: E1205 16:28:59.831849 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2ec05fa-c742-4ba4-830e-571af72430c0-plugin-serving-cert podName:f2ec05fa-c742-4ba4-830e-571af72430c0 nodeName:}" failed. No retries permitted until 2025-12-05 16:29:00.331818625 +0000 UTC m=+800.570492137 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/f2ec05fa-c742-4ba4-830e-571af72430c0-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-pqqkr" (UID: "f2ec05fa-c742-4ba4-830e-571af72430c0") : secret "plugin-serving-cert" not found Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.832403 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f2ec05fa-c742-4ba4-830e-571af72430c0-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.869686 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvf7w\" (UniqueName: \"kubernetes.io/projected/f2ec05fa-c742-4ba4-830e-571af72430c0-kube-api-access-jvf7w\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:28:59 crc kubenswrapper[4756]: W1205 16:28:59.892335 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f8a111_e6d4_440a_a3d4_4f3010304ba9.slice/crio-bea167363cbfe454d249003f4db683d5c03eefe171b393756cb07cca32de9bdd WatchSource:0}: Error finding container bea167363cbfe454d249003f4db683d5c03eefe171b393756cb07cca32de9bdd: Status 404 returned error can't find the container with id bea167363cbfe454d249003f4db683d5c03eefe171b393756cb07cca32de9bdd Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.893913 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-646b8b7948-zpgwc"] Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.894807 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:28:59 crc kubenswrapper[4756]: I1205 16:28:59.923791 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-646b8b7948-zpgwc"] Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.033177 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr9w2\" (UniqueName: \"kubernetes.io/projected/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-kube-api-access-nr9w2\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.033241 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-serving-cert\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.033290 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-oauth-config\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.033330 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-trusted-ca-bundle\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.033366 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-config\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.033383 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-service-ca\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.033407 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-oauth-serving-cert\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.084981 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86"] Dec 05 16:29:00 crc kubenswrapper[4756]: W1205 16:29:00.091144 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92193823_fb9a_4291_a9cd_5d25f4a566cc.slice/crio-8c85d7e376541bcf6b833961a773f3d439aaea43fb689e9c2673185b197a8a53 WatchSource:0}: Error finding container 8c85d7e376541bcf6b833961a773f3d439aaea43fb689e9c2673185b197a8a53: Status 404 returned error can't find the container with id 8c85d7e376541bcf6b833961a773f3d439aaea43fb689e9c2673185b197a8a53 Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.135071 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-serving-cert\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.135131 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-oauth-config\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.135184 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-trusted-ca-bundle\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.135240 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-config\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.135275 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-service-ca\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.135317 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-oauth-serving-cert\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.135336 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr9w2\" (UniqueName: \"kubernetes.io/projected/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-kube-api-access-nr9w2\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.135979 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-config\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" 
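The entries above show the kubelet volume reconciler's normal three-step cadence for each new pod: VerifyControllerAttachedVolume registers the desired mount, MountVolume starts it, and MountVolume.SetUp reports success. The one failure in this window is instructive: at 16:28:59.831775 secret.go cannot find the plugin-serving-cert secret, so nestedpendingoperations.go re-queues the mount with a 500ms backoff (durationBeforeRetry) instead of failing the pod, and the retry at 16:29:00.345382 below succeeds. The Go sketch that follows is a minimal, self-contained way to surface such failures from a saved journal; the kubelet.log file name and the regular expression are illustrative assumptions, not part of any kubelet or OpenShift tooling, and the exact message format can differ between kubelet versions.

// mountfailures.go — scan a kubelet journal dump (e.g. journalctl -u kubelet > kubelet.log)
// and print every MountVolume.SetUp failure together with its reason.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	f, err := os.Open("kubelet.log") // assumed dump location
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	// Matches lines like:
	//   Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: ...)
	//   pod "..." (UID: "...") : secret "plugin-serving-cert" not found
	failRe := regexp.MustCompile(`MountVolume\.SetUp failed for volume "([^"]+)".*\) : (.+)$`)

	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := failRe.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("volume=%q reason=%s\n", m[1], m[2])
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}

On this log the sketch would print a single line for plugin-serving-cert with reason secret "plugin-serving-cert" not found. The Failed to process watch event ... Status 404 warnings just above are a different and usually harmless race: cAdvisor begins watching a crio-* cgroup while the runtime is still creating the container.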
Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.136186 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-trusted-ca-bundle\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.136934 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-oauth-serving-cert\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.137068 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-service-ca\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.139366 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-oauth-config\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.140964 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-console-serving-cert\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.152337 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr9w2\" (UniqueName: \"kubernetes.io/projected/e4f98ecd-8e11-4387-a950-c4a8359fc6f2-kube-api-access-nr9w2\") pod \"console-646b8b7948-zpgwc\" (UID: \"e4f98ecd-8e11-4387-a950-c4a8359fc6f2\") " pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.215179 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sfl86" event={"ID":"c9f8a111-e6d4-440a-a3d4-4f3010304ba9","Type":"ContainerStarted","Data":"bea167363cbfe454d249003f4db683d5c03eefe171b393756cb07cca32de9bdd"} Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.215634 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.217379 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" event={"ID":"92193823-fb9a-4291-a9cd-5d25f4a566cc","Type":"ContainerStarted","Data":"8c85d7e376541bcf6b833961a773f3d439aaea43fb689e9c2673185b197a8a53"} Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.335050 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4"] Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.339008 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ec05fa-c742-4ba4-830e-571af72430c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.345382 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ec05fa-c742-4ba4-830e-571af72430c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pqqkr\" (UID: \"f2ec05fa-c742-4ba4-830e-571af72430c0\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.422857 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-646b8b7948-zpgwc"] Dec 05 16:29:00 crc kubenswrapper[4756]: W1205 16:29:00.432502 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4f98ecd_8e11_4387_a950_c4a8359fc6f2.slice/crio-9d85edac1c0279912fa856240d21f8170eb43d9119af2205723173c7d6c8ee3f WatchSource:0}: Error finding container 9d85edac1c0279912fa856240d21f8170eb43d9119af2205723173c7d6c8ee3f: Status 404 returned error can't find the container with id 9d85edac1c0279912fa856240d21f8170eb43d9119af2205723173c7d6c8ee3f Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.526873 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" Dec 05 16:29:00 crc kubenswrapper[4756]: I1205 16:29:00.757935 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr"] Dec 05 16:29:01 crc kubenswrapper[4756]: I1205 16:29:01.226609 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-646b8b7948-zpgwc" event={"ID":"e4f98ecd-8e11-4387-a950-c4a8359fc6f2","Type":"ContainerStarted","Data":"d05071d751395ce12e2aa0cb6c52201c3b87122c470eee9dda0601c8e21f8888"} Dec 05 16:29:01 crc kubenswrapper[4756]: I1205 16:29:01.226697 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-646b8b7948-zpgwc" event={"ID":"e4f98ecd-8e11-4387-a950-c4a8359fc6f2","Type":"ContainerStarted","Data":"9d85edac1c0279912fa856240d21f8170eb43d9119af2205723173c7d6c8ee3f"} Dec 05 16:29:01 crc kubenswrapper[4756]: I1205 16:29:01.229234 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" event={"ID":"f2ec05fa-c742-4ba4-830e-571af72430c0","Type":"ContainerStarted","Data":"a6fead12f131b3e8da980563d56d97e6fd1ff3019ebb5058b6a58a23339f57d6"} Dec 05 16:29:01 crc kubenswrapper[4756]: I1205 16:29:01.230608 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" event={"ID":"81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a","Type":"ContainerStarted","Data":"37db3e39ebd403ac0dba4ea4eac17dd5b86e3b04d15d811de6de643364bf884c"} Dec 05 16:29:01 crc kubenswrapper[4756]: I1205 16:29:01.251240 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-646b8b7948-zpgwc" podStartSLOduration=2.251209011 podStartE2EDuration="2.251209011s" podCreationTimestamp="2025-12-05 16:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:29:01.246030709 +0000 UTC m=+801.484704261" watchObservedRunningTime="2025-12-05 16:29:01.251209011 +0000 UTC m=+801.489882533" Dec 05 16:29:04 crc kubenswrapper[4756]: I1205 16:29:04.260494 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" event={"ID":"f2ec05fa-c742-4ba4-830e-571af72430c0","Type":"ContainerStarted","Data":"af24256d54d6d99c0ce6fa5888e42f1a8cd0b57b15230a80cc02d63020d8b580"} Dec 05 16:29:04 crc kubenswrapper[4756]: I1205 16:29:04.268253 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" event={"ID":"81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a","Type":"ContainerStarted","Data":"43daf75be51d5da553dec0192cb3cc0b31df503f9c52d146c9df21a02b2b64d4"} Dec 05 16:29:04 crc kubenswrapper[4756]: I1205 16:29:04.268527 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:29:04 crc kubenswrapper[4756]: I1205 16:29:04.270624 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sfl86" event={"ID":"c9f8a111-e6d4-440a-a3d4-4f3010304ba9","Type":"ContainerStarted","Data":"94b959c4b4318a6001e232152e75385512c946213846acc38d068ecab0e6c5fa"} Dec 05 16:29:04 crc kubenswrapper[4756]: I1205 16:29:04.273266 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" 
event={"ID":"92193823-fb9a-4291-a9cd-5d25f4a566cc","Type":"ContainerStarted","Data":"cb19890b2d3852be951005d45778ba182575ce66358e67fa892e07d34e0334bc"} Dec 05 16:29:04 crc kubenswrapper[4756]: I1205 16:29:04.281289 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pqqkr" podStartSLOduration=2.46407222 podStartE2EDuration="5.281261744s" podCreationTimestamp="2025-12-05 16:28:59 +0000 UTC" firstStartedPulling="2025-12-05 16:29:00.763612453 +0000 UTC m=+801.002285965" lastFinishedPulling="2025-12-05 16:29:03.580801977 +0000 UTC m=+803.819475489" observedRunningTime="2025-12-05 16:29:04.280164666 +0000 UTC m=+804.518838188" watchObservedRunningTime="2025-12-05 16:29:04.281261744 +0000 UTC m=+804.519935256" Dec 05 16:29:04 crc kubenswrapper[4756]: I1205 16:29:04.311914 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" podStartSLOduration=2.060130265 podStartE2EDuration="5.311883215s" podCreationTimestamp="2025-12-05 16:28:59 +0000 UTC" firstStartedPulling="2025-12-05 16:29:00.346951754 +0000 UTC m=+800.585625266" lastFinishedPulling="2025-12-05 16:29:03.598704704 +0000 UTC m=+803.837378216" observedRunningTime="2025-12-05 16:29:04.309637878 +0000 UTC m=+804.548311410" watchObservedRunningTime="2025-12-05 16:29:04.311883215 +0000 UTC m=+804.550556727" Dec 05 16:29:04 crc kubenswrapper[4756]: I1205 16:29:04.822607 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:29:07 crc kubenswrapper[4756]: I1205 16:29:07.305175 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" event={"ID":"92193823-fb9a-4291-a9cd-5d25f4a566cc","Type":"ContainerStarted","Data":"f32444625e5834365d4ceccf4f79b67e8eecd5f9b3bc69902d582e2a3c4bd8d0"} Dec 05 16:29:07 crc kubenswrapper[4756]: I1205 16:29:07.329270 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sfl86" podStartSLOduration=4.640774636 podStartE2EDuration="8.329245235s" podCreationTimestamp="2025-12-05 16:28:59 +0000 UTC" firstStartedPulling="2025-12-05 16:28:59.899497141 +0000 UTC m=+800.138170653" lastFinishedPulling="2025-12-05 16:29:03.58796774 +0000 UTC m=+803.826641252" observedRunningTime="2025-12-05 16:29:04.330839659 +0000 UTC m=+804.569513171" watchObservedRunningTime="2025-12-05 16:29:07.329245235 +0000 UTC m=+807.567918747" Dec 05 16:29:07 crc kubenswrapper[4756]: I1205 16:29:07.329926 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-jbg86" podStartSLOduration=1.672728126 podStartE2EDuration="8.329920603s" podCreationTimestamp="2025-12-05 16:28:59 +0000 UTC" firstStartedPulling="2025-12-05 16:29:00.093185882 +0000 UTC m=+800.331859394" lastFinishedPulling="2025-12-05 16:29:06.750378359 +0000 UTC m=+806.989051871" observedRunningTime="2025-12-05 16:29:07.325077409 +0000 UTC m=+807.563750921" watchObservedRunningTime="2025-12-05 16:29:07.329920603 +0000 UTC m=+807.568594115" Dec 05 16:29:09 crc kubenswrapper[4756]: I1205 16:29:09.852176 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sfl86" Dec 05 16:29:10 crc kubenswrapper[4756]: I1205 16:29:10.216263 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:10 crc kubenswrapper[4756]: I1205 16:29:10.216337 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:10 crc kubenswrapper[4756]: I1205 16:29:10.225340 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:10 crc kubenswrapper[4756]: I1205 16:29:10.330395 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-646b8b7948-zpgwc" Dec 05 16:29:10 crc kubenswrapper[4756]: I1205 16:29:10.405975 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wld6x"] Dec 05 16:29:19 crc kubenswrapper[4756]: I1205 16:29:19.810232 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2lv4" Dec 05 16:29:22 crc kubenswrapper[4756]: I1205 16:29:22.296064 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:29:22 crc kubenswrapper[4756]: I1205 16:29:22.296130 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:29:31 crc kubenswrapper[4756]: I1205 16:29:31.986016 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk"] Dec 05 16:29:31 crc kubenswrapper[4756]: I1205 16:29:31.988190 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:31 crc kubenswrapper[4756]: I1205 16:29:31.990434 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 16:29:31 crc kubenswrapper[4756]: I1205 16:29:31.992821 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk"] Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.099534 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.099770 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jwtv\" (UniqueName: \"kubernetes.io/projected/c1e72e52-3661-4e92-93a2-a23afb4f2560-kube-api-access-7jwtv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.099877 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.201103 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jwtv\" (UniqueName: \"kubernetes.io/projected/c1e72e52-3661-4e92-93a2-a23afb4f2560-kube-api-access-7jwtv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.201612 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.201841 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.202308 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.202620 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.227273 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jwtv\" (UniqueName: \"kubernetes.io/projected/c1e72e52-3661-4e92-93a2-a23afb4f2560-kube-api-access-7jwtv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.307776 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:32 crc kubenswrapper[4756]: I1205 16:29:32.543024 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk"] Dec 05 16:29:33 crc kubenswrapper[4756]: I1205 16:29:33.497387 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerID="36f5961dc62582d8c89e279a66e111f4f8e74100df9500492407e26ec7dbba6e" exitCode=0 Dec 05 16:29:33 crc kubenswrapper[4756]: I1205 16:29:33.497503 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" event={"ID":"c1e72e52-3661-4e92-93a2-a23afb4f2560","Type":"ContainerDied","Data":"36f5961dc62582d8c89e279a66e111f4f8e74100df9500492407e26ec7dbba6e"} Dec 05 16:29:33 crc kubenswrapper[4756]: I1205 16:29:33.497566 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" event={"ID":"c1e72e52-3661-4e92-93a2-a23afb4f2560","Type":"ContainerStarted","Data":"8fc3e6d6caff79caf34ebcdcf3d344abf6568ce7a7192f7863078367bdec988f"} Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.452626 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wld6x" podUID="351d65a8-44a7-4f84-be48-1b5204422e8b" containerName="console" containerID="cri-o://807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369" gracePeriod=15 Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.522455 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerID="8990b947fab918c938f645ee4f3c5f7e81d7ffc5b0c2cc83fc20eaae550d1009" exitCode=0 Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.522511 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" 
event={"ID":"c1e72e52-3661-4e92-93a2-a23afb4f2560","Type":"ContainerDied","Data":"8990b947fab918c938f645ee4f3c5f7e81d7ffc5b0c2cc83fc20eaae550d1009"} Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.864217 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wld6x_351d65a8-44a7-4f84-be48-1b5204422e8b/console/0.log" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.864575 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.985793 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2jfd\" (UniqueName: \"kubernetes.io/projected/351d65a8-44a7-4f84-be48-1b5204422e8b-kube-api-access-n2jfd\") pod \"351d65a8-44a7-4f84-be48-1b5204422e8b\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.985897 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-console-config\") pod \"351d65a8-44a7-4f84-be48-1b5204422e8b\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.985955 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-oauth-config\") pod \"351d65a8-44a7-4f84-be48-1b5204422e8b\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.985995 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-trusted-ca-bundle\") pod \"351d65a8-44a7-4f84-be48-1b5204422e8b\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.986030 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-serving-cert\") pod \"351d65a8-44a7-4f84-be48-1b5204422e8b\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.987035 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-console-config" (OuterVolumeSpecName: "console-config") pod "351d65a8-44a7-4f84-be48-1b5204422e8b" (UID: "351d65a8-44a7-4f84-be48-1b5204422e8b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.987120 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "351d65a8-44a7-4f84-be48-1b5204422e8b" (UID: "351d65a8-44a7-4f84-be48-1b5204422e8b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.987343 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-oauth-serving-cert\") pod \"351d65a8-44a7-4f84-be48-1b5204422e8b\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.987382 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-service-ca\") pod \"351d65a8-44a7-4f84-be48-1b5204422e8b\" (UID: \"351d65a8-44a7-4f84-be48-1b5204422e8b\") " Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.987938 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-service-ca" (OuterVolumeSpecName: "service-ca") pod "351d65a8-44a7-4f84-be48-1b5204422e8b" (UID: "351d65a8-44a7-4f84-be48-1b5204422e8b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.987945 4756 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.987997 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.988110 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "351d65a8-44a7-4f84-be48-1b5204422e8b" (UID: "351d65a8-44a7-4f84-be48-1b5204422e8b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.991316 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "351d65a8-44a7-4f84-be48-1b5204422e8b" (UID: "351d65a8-44a7-4f84-be48-1b5204422e8b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.991773 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351d65a8-44a7-4f84-be48-1b5204422e8b-kube-api-access-n2jfd" (OuterVolumeSpecName: "kube-api-access-n2jfd") pod "351d65a8-44a7-4f84-be48-1b5204422e8b" (UID: "351d65a8-44a7-4f84-be48-1b5204422e8b"). InnerVolumeSpecName "kube-api-access-n2jfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:29:35 crc kubenswrapper[4756]: I1205 16:29:35.991854 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "351d65a8-44a7-4f84-be48-1b5204422e8b" (UID: "351d65a8-44a7-4f84-be48-1b5204422e8b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.089164 4756 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.089205 4756 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/351d65a8-44a7-4f84-be48-1b5204422e8b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.089222 4756 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.089235 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/351d65a8-44a7-4f84-be48-1b5204422e8b-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.089245 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2jfd\" (UniqueName: \"kubernetes.io/projected/351d65a8-44a7-4f84-be48-1b5204422e8b-kube-api-access-n2jfd\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.531254 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerID="5e81eeb39cd296e73f67f552a6b50fdeb1b77e8e765dd36a547f2897ee754dfc" exitCode=0 Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.531330 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" event={"ID":"c1e72e52-3661-4e92-93a2-a23afb4f2560","Type":"ContainerDied","Data":"5e81eeb39cd296e73f67f552a6b50fdeb1b77e8e765dd36a547f2897ee754dfc"} Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.533344 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wld6x_351d65a8-44a7-4f84-be48-1b5204422e8b/console/0.log" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.533393 4756 generic.go:334] "Generic (PLEG): container finished" podID="351d65a8-44a7-4f84-be48-1b5204422e8b" containerID="807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369" exitCode=2 Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.533445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wld6x" event={"ID":"351d65a8-44a7-4f84-be48-1b5204422e8b","Type":"ContainerDied","Data":"807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369"} Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.533468 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wld6x" event={"ID":"351d65a8-44a7-4f84-be48-1b5204422e8b","Type":"ContainerDied","Data":"95347d207c5ac003e2d08bf80cb965dcf4854e2ed9a69c2dc911bfc09d7f3da4"} Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.533486 4756 scope.go:117] "RemoveContainer" containerID="807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.533583 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wld6x" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.553257 4756 scope.go:117] "RemoveContainer" containerID="807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369" Dec 05 16:29:36 crc kubenswrapper[4756]: E1205 16:29:36.553688 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369\": container with ID starting with 807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369 not found: ID does not exist" containerID="807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.553750 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369"} err="failed to get container status \"807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369\": rpc error: code = NotFound desc = could not find container \"807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369\": container with ID starting with 807e4f05e2b3df029033160fdc685755c6dc28c21e03e6ae81167eb101f6d369 not found: ID does not exist" Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.565504 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wld6x"] Dec 05 16:29:36 crc kubenswrapper[4756]: I1205 16:29:36.567946 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wld6x"] Dec 05 16:29:37 crc kubenswrapper[4756]: I1205 16:29:37.756566 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:37 crc kubenswrapper[4756]: I1205 16:29:37.911168 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-bundle\") pod \"c1e72e52-3661-4e92-93a2-a23afb4f2560\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " Dec 05 16:29:37 crc kubenswrapper[4756]: I1205 16:29:37.911239 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-util\") pod \"c1e72e52-3661-4e92-93a2-a23afb4f2560\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " Dec 05 16:29:37 crc kubenswrapper[4756]: I1205 16:29:37.911296 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jwtv\" (UniqueName: \"kubernetes.io/projected/c1e72e52-3661-4e92-93a2-a23afb4f2560-kube-api-access-7jwtv\") pod \"c1e72e52-3661-4e92-93a2-a23afb4f2560\" (UID: \"c1e72e52-3661-4e92-93a2-a23afb4f2560\") " Dec 05 16:29:37 crc kubenswrapper[4756]: I1205 16:29:37.912407 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-bundle" (OuterVolumeSpecName: "bundle") pod "c1e72e52-3661-4e92-93a2-a23afb4f2560" (UID: "c1e72e52-3661-4e92-93a2-a23afb4f2560"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:29:37 crc kubenswrapper[4756]: I1205 16:29:37.917791 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e72e52-3661-4e92-93a2-a23afb4f2560-kube-api-access-7jwtv" (OuterVolumeSpecName: "kube-api-access-7jwtv") pod "c1e72e52-3661-4e92-93a2-a23afb4f2560" (UID: "c1e72e52-3661-4e92-93a2-a23afb4f2560"). InnerVolumeSpecName "kube-api-access-7jwtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:29:37 crc kubenswrapper[4756]: I1205 16:29:37.925399 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-util" (OuterVolumeSpecName: "util") pod "c1e72e52-3661-4e92-93a2-a23afb4f2560" (UID: "c1e72e52-3661-4e92-93a2-a23afb4f2560"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:29:38 crc kubenswrapper[4756]: I1205 16:29:38.012477 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:38 crc kubenswrapper[4756]: I1205 16:29:38.012529 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1e72e52-3661-4e92-93a2-a23afb4f2560-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:38 crc kubenswrapper[4756]: I1205 16:29:38.012542 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jwtv\" (UniqueName: \"kubernetes.io/projected/c1e72e52-3661-4e92-93a2-a23afb4f2560-kube-api-access-7jwtv\") on node \"crc\" DevicePath \"\"" Dec 05 16:29:38 crc kubenswrapper[4756]: I1205 16:29:38.059743 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="351d65a8-44a7-4f84-be48-1b5204422e8b" path="/var/lib/kubelet/pods/351d65a8-44a7-4f84-be48-1b5204422e8b/volumes" Dec 05 16:29:38 crc kubenswrapper[4756]: I1205 16:29:38.549250 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" event={"ID":"c1e72e52-3661-4e92-93a2-a23afb4f2560","Type":"ContainerDied","Data":"8fc3e6d6caff79caf34ebcdcf3d344abf6568ce7a7192f7863078367bdec988f"} Dec 05 16:29:38 crc kubenswrapper[4756]: I1205 16:29:38.549312 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk" Dec 05 16:29:38 crc kubenswrapper[4756]: I1205 16:29:38.549311 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fc3e6d6caff79caf34ebcdcf3d344abf6568ce7a7192f7863078367bdec988f" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.282183 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h"] Dec 05 16:29:47 crc kubenswrapper[4756]: E1205 16:29:47.282685 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351d65a8-44a7-4f84-be48-1b5204422e8b" containerName="console" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.282696 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="351d65a8-44a7-4f84-be48-1b5204422e8b" containerName="console" Dec 05 16:29:47 crc kubenswrapper[4756]: E1205 16:29:47.282708 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerName="extract" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.282714 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerName="extract" Dec 05 16:29:47 crc kubenswrapper[4756]: E1205 16:29:47.282725 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerName="util" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.282731 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerName="util" Dec 05 16:29:47 crc kubenswrapper[4756]: E1205 16:29:47.282738 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerName="pull" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.282743 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerName="pull" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.282835 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e72e52-3661-4e92-93a2-a23afb4f2560" containerName="extract" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.282849 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="351d65a8-44a7-4f84-be48-1b5204422e8b" containerName="console" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.283215 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.286407 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.288166 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.290679 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.290864 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-skcv8" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.290983 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.297914 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h"] Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.497568 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd7e4822-2466-41c9-9544-7748e0f123c5-webhook-cert\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.497623 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pbc\" (UniqueName: \"kubernetes.io/projected/bd7e4822-2466-41c9-9544-7748e0f123c5-kube-api-access-l7pbc\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.497731 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd7e4822-2466-41c9-9544-7748e0f123c5-apiservice-cert\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.689904 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd7e4822-2466-41c9-9544-7748e0f123c5-apiservice-cert\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.689970 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd7e4822-2466-41c9-9544-7748e0f123c5-webhook-cert\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.689988 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pbc\" (UniqueName: \"kubernetes.io/projected/bd7e4822-2466-41c9-9544-7748e0f123c5-kube-api-access-l7pbc\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.695131 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd7e4822-2466-41c9-9544-7748e0f123c5-webhook-cert\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.701168 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd7e4822-2466-41c9-9544-7748e0f123c5-apiservice-cert\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.712427 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pbc\" (UniqueName: \"kubernetes.io/projected/bd7e4822-2466-41c9-9544-7748e0f123c5-kube-api-access-l7pbc\") pod \"metallb-operator-controller-manager-55964f8cbb-lhn6h\" (UID: \"bd7e4822-2466-41c9-9544-7748e0f123c5\") " pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.768986 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw"] Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.769732 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.775758 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.775946 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.776085 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-s4f8g" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.789611 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw"] Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.892442 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a4d60c6-7374-4425-aa47-cfe2b7265c65-apiservice-cert\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.892704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj496\" (UniqueName: \"kubernetes.io/projected/9a4d60c6-7374-4425-aa47-cfe2b7265c65-kube-api-access-mj496\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.892768 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a4d60c6-7374-4425-aa47-cfe2b7265c65-webhook-cert\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.898252 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.993770 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a4d60c6-7374-4425-aa47-cfe2b7265c65-apiservice-cert\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.993873 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj496\" (UniqueName: \"kubernetes.io/projected/9a4d60c6-7374-4425-aa47-cfe2b7265c65-kube-api-access-mj496\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:47 crc kubenswrapper[4756]: I1205 16:29:47.993915 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a4d60c6-7374-4425-aa47-cfe2b7265c65-webhook-cert\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:48 crc kubenswrapper[4756]: I1205 16:29:48.000106 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a4d60c6-7374-4425-aa47-cfe2b7265c65-apiservice-cert\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:48 crc kubenswrapper[4756]: I1205 16:29:48.015467 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a4d60c6-7374-4425-aa47-cfe2b7265c65-webhook-cert\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:48 crc kubenswrapper[4756]: I1205 16:29:48.015475 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj496\" (UniqueName: \"kubernetes.io/projected/9a4d60c6-7374-4425-aa47-cfe2b7265c65-kube-api-access-mj496\") pod \"metallb-operator-webhook-server-77b74c6747-tsvjw\" (UID: \"9a4d60c6-7374-4425-aa47-cfe2b7265c65\") " pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:48 crc kubenswrapper[4756]: I1205 16:29:48.091134 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:48 crc kubenswrapper[4756]: I1205 16:29:48.349941 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h"] Dec 05 16:29:48 crc kubenswrapper[4756]: W1205 16:29:48.360176 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd7e4822_2466_41c9_9544_7748e0f123c5.slice/crio-b90aba93500958823addc3ece6c39d17c252e7f90b9b61db4599fe70e50e5985 WatchSource:0}: Error finding container b90aba93500958823addc3ece6c39d17c252e7f90b9b61db4599fe70e50e5985: Status 404 returned error can't find the container with id b90aba93500958823addc3ece6c39d17c252e7f90b9b61db4599fe70e50e5985 Dec 05 16:29:48 crc kubenswrapper[4756]: I1205 16:29:48.470020 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw"] Dec 05 16:29:48 crc kubenswrapper[4756]: W1205 16:29:48.473550 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a4d60c6_7374_4425_aa47_cfe2b7265c65.slice/crio-4df8082ce11b4ea61da19a7da68325e23d7145419eab5096d4e7f1520a5bd7df WatchSource:0}: Error finding container 4df8082ce11b4ea61da19a7da68325e23d7145419eab5096d4e7f1520a5bd7df: Status 404 returned error can't find the container with id 4df8082ce11b4ea61da19a7da68325e23d7145419eab5096d4e7f1520a5bd7df Dec 05 16:29:48 crc kubenswrapper[4756]: I1205 16:29:48.709949 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" event={"ID":"9a4d60c6-7374-4425-aa47-cfe2b7265c65","Type":"ContainerStarted","Data":"4df8082ce11b4ea61da19a7da68325e23d7145419eab5096d4e7f1520a5bd7df"} Dec 05 16:29:48 crc kubenswrapper[4756]: I1205 16:29:48.711020 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" event={"ID":"bd7e4822-2466-41c9-9544-7748e0f123c5","Type":"ContainerStarted","Data":"b90aba93500958823addc3ece6c39d17c252e7f90b9b61db4599fe70e50e5985"} Dec 05 16:29:52 crc kubenswrapper[4756]: I1205 16:29:52.297773 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:29:52 crc kubenswrapper[4756]: I1205 16:29:52.298164 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:29:52 crc kubenswrapper[4756]: I1205 16:29:52.298225 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:29:52 crc kubenswrapper[4756]: I1205 16:29:52.298947 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af109c606f5763151a202cd55133291df026de6e76c3ec3a043528aa8e422ca0"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:29:52 crc kubenswrapper[4756]: I1205 16:29:52.298995 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://af109c606f5763151a202cd55133291df026de6e76c3ec3a043528aa8e422ca0" gracePeriod=600 Dec 05 16:29:52 crc kubenswrapper[4756]: I1205 16:29:52.760037 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="af109c606f5763151a202cd55133291df026de6e76c3ec3a043528aa8e422ca0" exitCode=0 Dec 05 16:29:52 crc kubenswrapper[4756]: I1205 16:29:52.760099 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"af109c606f5763151a202cd55133291df026de6e76c3ec3a043528aa8e422ca0"} Dec 05 16:29:52 crc kubenswrapper[4756]: I1205 16:29:52.760169 4756 scope.go:117] "RemoveContainer" containerID="7338622c696eeea43693c9484face505b7d076a37d939ddfdf63dc83f5acf1ba" Dec 05 16:29:53 crc kubenswrapper[4756]: I1205 16:29:53.766979 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" event={"ID":"bd7e4822-2466-41c9-9544-7748e0f123c5","Type":"ContainerStarted","Data":"938416b4554131ea2ff96b172f4e3ba76d5e8df2f9e3e84ff633064cdc039ec5"} Dec 05 16:29:53 crc kubenswrapper[4756]: I1205 16:29:53.767540 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:29:53 crc kubenswrapper[4756]: I1205 16:29:53.770069 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"e8a4fad4de22e09f52b10e225d79410b313d20ac3252ed14a260ce12e986a83e"} Dec 05 16:29:53 crc kubenswrapper[4756]: I1205 16:29:53.772573 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" event={"ID":"9a4d60c6-7374-4425-aa47-cfe2b7265c65","Type":"ContainerStarted","Data":"fee17bcac355f04e61ae4ca0c1d40a5cece1b0928499bfc9e4e126d884001d09"} Dec 05 16:29:53 crc kubenswrapper[4756]: I1205 16:29:53.772752 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:29:53 crc kubenswrapper[4756]: I1205 16:29:53.788168 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" podStartSLOduration=1.753676553 podStartE2EDuration="6.78815325s" podCreationTimestamp="2025-12-05 16:29:47 +0000 UTC" firstStartedPulling="2025-12-05 16:29:48.363430704 +0000 UTC m=+848.602104216" lastFinishedPulling="2025-12-05 16:29:53.397907401 +0000 UTC m=+853.636580913" observedRunningTime="2025-12-05 16:29:53.78665074 +0000 UTC m=+854.025324272" watchObservedRunningTime="2025-12-05 16:29:53.78815325 +0000 UTC m=+854.026826762" Dec 05 16:29:53 crc kubenswrapper[4756]: I1205 16:29:53.807283 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" 
podStartSLOduration=1.874688379 podStartE2EDuration="6.807265334s" podCreationTimestamp="2025-12-05 16:29:47 +0000 UTC" firstStartedPulling="2025-12-05 16:29:48.47663845 +0000 UTC m=+848.715311962" lastFinishedPulling="2025-12-05 16:29:53.409215405 +0000 UTC m=+853.647888917" observedRunningTime="2025-12-05 16:29:53.805710322 +0000 UTC m=+854.044383844" watchObservedRunningTime="2025-12-05 16:29:53.807265334 +0000 UTC m=+854.045938846" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.155291 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j"] Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.156484 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.158902 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.161286 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.167717 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j"] Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.293438 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/575bbbf2-c59a-42c7-8404-4e907139734d-config-volume\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.293499 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc997\" (UniqueName: \"kubernetes.io/projected/575bbbf2-c59a-42c7-8404-4e907139734d-kube-api-access-xc997\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.293639 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/575bbbf2-c59a-42c7-8404-4e907139734d-secret-volume\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.395109 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/575bbbf2-c59a-42c7-8404-4e907139734d-secret-volume\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.395199 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/575bbbf2-c59a-42c7-8404-4e907139734d-config-volume\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.395228 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc997\" (UniqueName: \"kubernetes.io/projected/575bbbf2-c59a-42c7-8404-4e907139734d-kube-api-access-xc997\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.396130 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/575bbbf2-c59a-42c7-8404-4e907139734d-config-volume\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.400958 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/575bbbf2-c59a-42c7-8404-4e907139734d-secret-volume\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.408991 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc997\" (UniqueName: \"kubernetes.io/projected/575bbbf2-c59a-42c7-8404-4e907139734d-kube-api-access-xc997\") pod \"collect-profiles-29415870-t5g2j\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.520376 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.749000 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j"] Dec 05 16:30:00 crc kubenswrapper[4756]: I1205 16:30:00.825633 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" event={"ID":"575bbbf2-c59a-42c7-8404-4e907139734d","Type":"ContainerStarted","Data":"71d44a58b936555b2e452eb96bcb0e42362b5c1225d00b1ea91bc47e07cc25d3"} Dec 05 16:30:01 crc kubenswrapper[4756]: I1205 16:30:01.833213 4756 generic.go:334] "Generic (PLEG): container finished" podID="575bbbf2-c59a-42c7-8404-4e907139734d" containerID="4aaee59e97e1eb51ba43c6a0b78c62b9c980ef955404c0c0b81ac0064035dff5" exitCode=0 Dec 05 16:30:01 crc kubenswrapper[4756]: I1205 16:30:01.833260 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" event={"ID":"575bbbf2-c59a-42c7-8404-4e907139734d","Type":"ContainerDied","Data":"4aaee59e97e1eb51ba43c6a0b78c62b9c980ef955404c0c0b81ac0064035dff5"} Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.213166 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.337293 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/575bbbf2-c59a-42c7-8404-4e907139734d-config-volume\") pod \"575bbbf2-c59a-42c7-8404-4e907139734d\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.337429 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/575bbbf2-c59a-42c7-8404-4e907139734d-secret-volume\") pod \"575bbbf2-c59a-42c7-8404-4e907139734d\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.337491 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc997\" (UniqueName: \"kubernetes.io/projected/575bbbf2-c59a-42c7-8404-4e907139734d-kube-api-access-xc997\") pod \"575bbbf2-c59a-42c7-8404-4e907139734d\" (UID: \"575bbbf2-c59a-42c7-8404-4e907139734d\") " Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.338528 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/575bbbf2-c59a-42c7-8404-4e907139734d-config-volume" (OuterVolumeSpecName: "config-volume") pod "575bbbf2-c59a-42c7-8404-4e907139734d" (UID: "575bbbf2-c59a-42c7-8404-4e907139734d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.343127 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575bbbf2-c59a-42c7-8404-4e907139734d-kube-api-access-xc997" (OuterVolumeSpecName: "kube-api-access-xc997") pod "575bbbf2-c59a-42c7-8404-4e907139734d" (UID: "575bbbf2-c59a-42c7-8404-4e907139734d"). InnerVolumeSpecName "kube-api-access-xc997". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.343134 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575bbbf2-c59a-42c7-8404-4e907139734d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "575bbbf2-c59a-42c7-8404-4e907139734d" (UID: "575bbbf2-c59a-42c7-8404-4e907139734d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.439200 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/575bbbf2-c59a-42c7-8404-4e907139734d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.439516 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc997\" (UniqueName: \"kubernetes.io/projected/575bbbf2-c59a-42c7-8404-4e907139734d-kube-api-access-xc997\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.439592 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/575bbbf2-c59a-42c7-8404-4e907139734d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.844187 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" event={"ID":"575bbbf2-c59a-42c7-8404-4e907139734d","Type":"ContainerDied","Data":"71d44a58b936555b2e452eb96bcb0e42362b5c1225d00b1ea91bc47e07cc25d3"} Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.844230 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71d44a58b936555b2e452eb96bcb0e42362b5c1225d00b1ea91bc47e07cc25d3" Dec 05 16:30:03 crc kubenswrapper[4756]: I1205 16:30:03.844254 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j" Dec 05 16:30:08 crc kubenswrapper[4756]: I1205 16:30:08.095367 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-77b74c6747-tsvjw" Dec 05 16:30:27 crc kubenswrapper[4756]: I1205 16:30:27.900899 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-55964f8cbb-lhn6h" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.602264 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-nnqf2"] Dec 05 16:30:28 crc kubenswrapper[4756]: E1205 16:30:28.602561 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575bbbf2-c59a-42c7-8404-4e907139734d" containerName="collect-profiles" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.602576 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="575bbbf2-c59a-42c7-8404-4e907139734d" containerName="collect-profiles" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.602709 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="575bbbf2-c59a-42c7-8404-4e907139734d" containerName="collect-profiles" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.604485 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.617654 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.617794 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-pzscb" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.618124 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.632620 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8"] Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.633571 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.636125 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.643468 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.643619 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-reloader\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.643740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-startup\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.643779 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-conf\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.643835 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxpvp\" (UniqueName: \"kubernetes.io/projected/03cce283-77b4-4dbb-995a-a28cc39ae376-kube-api-access-gxpvp\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.644000 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-sockets\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.644162 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics-certs\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.667620 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8"] Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.745731 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-startup\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.745787 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-conf\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.745824 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c64b6bd4-0f30-4b95-8bac-8c302e466540-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kzhn8\" (UID: \"c64b6bd4-0f30-4b95-8bac-8c302e466540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.745856 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxpvp\" (UniqueName: \"kubernetes.io/projected/03cce283-77b4-4dbb-995a-a28cc39ae376-kube-api-access-gxpvp\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.745873 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsfmb\" (UniqueName: \"kubernetes.io/projected/c64b6bd4-0f30-4b95-8bac-8c302e466540-kube-api-access-rsfmb\") pod \"frr-k8s-webhook-server-7fcb986d4-kzhn8\" (UID: \"c64b6bd4-0f30-4b95-8bac-8c302e466540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.746033 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-sockets\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.746063 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics-certs\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.746101 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.746127 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"reloader\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-reloader\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: E1205 16:30:28.746175 4756 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 05 16:30:28 crc kubenswrapper[4756]: E1205 16:30:28.746239 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics-certs podName:03cce283-77b4-4dbb-995a-a28cc39ae376 nodeName:}" failed. No retries permitted until 2025-12-05 16:30:29.246220336 +0000 UTC m=+889.484893848 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics-certs") pod "frr-k8s-nnqf2" (UID: "03cce283-77b4-4dbb-995a-a28cc39ae376") : secret "frr-k8s-certs-secret" not found Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.746811 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-conf\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.746894 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.747004 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-reloader\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.747017 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-sockets\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.747388 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/03cce283-77b4-4dbb-995a-a28cc39ae376-frr-startup\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.777887 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mwwdp"] Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.779076 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.783484 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.783728 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.783986 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pvhvq" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.784136 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.797983 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxpvp\" (UniqueName: \"kubernetes.io/projected/03cce283-77b4-4dbb-995a-a28cc39ae376-kube-api-access-gxpvp\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.838751 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-65sww"] Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.840554 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.849013 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c64b6bd4-0f30-4b95-8bac-8c302e466540-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kzhn8\" (UID: \"c64b6bd4-0f30-4b95-8bac-8c302e466540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.849092 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsfmb\" (UniqueName: \"kubernetes.io/projected/c64b6bd4-0f30-4b95-8bac-8c302e466540-kube-api-access-rsfmb\") pod \"frr-k8s-webhook-server-7fcb986d4-kzhn8\" (UID: \"c64b6bd4-0f30-4b95-8bac-8c302e466540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.849157 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.849185 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metrics-certs\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.849213 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metallb-excludel2\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.849276 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nlkq\" (UniqueName: \"kubernetes.io/projected/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-kube-api-access-6nlkq\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.854457 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c64b6bd4-0f30-4b95-8bac-8c302e466540-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kzhn8\" (UID: \"c64b6bd4-0f30-4b95-8bac-8c302e466540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.863124 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-65sww"] Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.874771 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.902859 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsfmb\" (UniqueName: \"kubernetes.io/projected/c64b6bd4-0f30-4b95-8bac-8c302e466540-kube-api-access-rsfmb\") pod \"frr-k8s-webhook-server-7fcb986d4-kzhn8\" (UID: \"c64b6bd4-0f30-4b95-8bac-8c302e466540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.948108 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.951532 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metallb-excludel2\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.951575 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metrics-certs\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.951643 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nlkq\" (UniqueName: \"kubernetes.io/projected/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-kube-api-access-6nlkq\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.951698 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-cert\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:28 crc kubenswrapper[4756]: E1205 16:30:28.951738 4756 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.951744 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-metrics-certs\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.951881 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhdb\" (UniqueName: \"kubernetes.io/projected/7ed26790-e827-4c04-a6c1-3a6befe3f875-kube-api-access-dlhdb\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:28 crc kubenswrapper[4756]: E1205 16:30:28.951901 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metrics-certs podName:9a2f231e-11fb-4b75-ba6b-35f59554bcbe nodeName:}" failed. No retries permitted until 2025-12-05 16:30:29.451876599 +0000 UTC m=+889.690550191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metrics-certs") pod "speaker-mwwdp" (UID: "9a2f231e-11fb-4b75-ba6b-35f59554bcbe") : secret "speaker-certs-secret" not found Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.951936 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: E1205 16:30:28.952242 4756 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 16:30:28 crc kubenswrapper[4756]: E1205 16:30:28.952308 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist podName:9a2f231e-11fb-4b75-ba6b-35f59554bcbe nodeName:}" failed. No retries permitted until 2025-12-05 16:30:29.45229053 +0000 UTC m=+889.690964042 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist") pod "speaker-mwwdp" (UID: "9a2f231e-11fb-4b75-ba6b-35f59554bcbe") : secret "metallb-memberlist" not found Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.952503 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metallb-excludel2\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:28 crc kubenswrapper[4756]: I1205 16:30:28.978345 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nlkq\" (UniqueName: \"kubernetes.io/projected/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-kube-api-access-6nlkq\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.053269 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-cert\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.053355 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-metrics-certs\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.053389 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhdb\" (UniqueName: \"kubernetes.io/projected/7ed26790-e827-4c04-a6c1-3a6befe3f875-kube-api-access-dlhdb\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:29 crc kubenswrapper[4756]: E1205 16:30:29.054608 4756 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 05 16:30:29 crc kubenswrapper[4756]: E1205 16:30:29.054689 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-metrics-certs podName:7ed26790-e827-4c04-a6c1-3a6befe3f875 nodeName:}" failed. No retries permitted until 2025-12-05 16:30:29.554663775 +0000 UTC m=+889.793337357 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-metrics-certs") pod "controller-f8648f98b-65sww" (UID: "7ed26790-e827-4c04-a6c1-3a6befe3f875") : secret "controller-certs-secret" not found Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.065167 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-cert\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.079863 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhdb\" (UniqueName: \"kubernetes.io/projected/7ed26790-e827-4c04-a6c1-3a6befe3f875-kube-api-access-dlhdb\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.090405 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pnx9t"] Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.091589 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.115671 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnx9t"] Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.154291 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dz9r\" (UniqueName: \"kubernetes.io/projected/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-kube-api-access-4dz9r\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.154373 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-utilities\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.154455 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-catalog-content\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.255633 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics-certs\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.255697 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-catalog-content\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" 
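The MountVolume.SetUp failures in this stretch look like transient ordering noise rather than real breakage: the frr-k8s, speaker, and controller pods are scheduled before the metallb operator has published their cert secrets ("frr-k8s-certs-secret", "speaker-certs-secret", "metallb-memberlist", "controller-certs-secret" all report not found), so each mount attempt fails and the kubelet schedules a retry, with the delay doubling between attempts (durationBeforeRetry 500ms here, 1s for the memberlist volume further down) until the secret appears and the mount succeeds. A minimal sketch for pulling that backoff sequence out of a journal dump like this one; the script and its names are illustrative, not part of the log, it assumes the nestedpendingoperations error text shown above (tolerating wrapped lines), and grouping by volume name alone conflates same-named volumes from different pods, so a stricter key could include the pod UID from the same message:

```python
import re
import sys
from collections import defaultdict

# Hypothetical helper, not part of the log above: scan a kubelet journal
# dump for nestedpendingoperations retry entries and list the backoff
# delays seen per volume, making the 500ms -> 1s doubling visible.
RETRY = re.compile(
    r'\(durationBeforeRetry (?P<delay>\d+(?:\.\d+)?m?s)\)\.\s+'
    r'Error: MountVolume\.SetUp failed for volume "(?P<volume>[^"]+)"'
)

def backoff_per_volume(journal_text: str) -> dict:
    """Map volume name -> ordered list of retry delays found in the text."""
    delays = defaultdict(list)
    for match in RETRY.finditer(journal_text):
        delays[match.group("volume")].append(match.group("delay"))
    return dict(delays)

if __name__ == "__main__":
    # Usage: journalctl -u kubelet | python backoff_per_volume.py
    for volume, seen in backoff_per_volume(sys.stdin.read()).items():
        print(f'{volume}: {" -> ".join(seen)}')
```

On the entries above this would report, for example, memberlist: 500ms -> 1s, matching the retries at 16:30:28 and 16:30:29 before the MountVolume.SetUp success once "metallb-memberlist" exists.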
Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.255766 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dz9r\" (UniqueName: \"kubernetes.io/projected/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-kube-api-access-4dz9r\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.255808 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-utilities\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.256189 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-catalog-content\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.256256 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-utilities\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.259377 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03cce283-77b4-4dbb-995a-a28cc39ae376-metrics-certs\") pod \"frr-k8s-nnqf2\" (UID: \"03cce283-77b4-4dbb-995a-a28cc39ae376\") " pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.274247 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dz9r\" (UniqueName: \"kubernetes.io/projected/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-kube-api-access-4dz9r\") pod \"redhat-marketplace-pnx9t\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") " pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.403400 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8"] Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.417843 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.459313 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.459363 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metrics-certs\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:29 crc kubenswrapper[4756]: E1205 16:30:29.459486 4756 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 16:30:29 crc kubenswrapper[4756]: E1205 16:30:29.459552 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist podName:9a2f231e-11fb-4b75-ba6b-35f59554bcbe nodeName:}" failed. No retries permitted until 2025-12-05 16:30:30.459534267 +0000 UTC m=+890.698207779 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist") pod "speaker-mwwdp" (UID: "9a2f231e-11fb-4b75-ba6b-35f59554bcbe") : secret "metallb-memberlist" not found Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.462753 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-metrics-certs\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.520189 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.560538 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-metrics-certs\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.568845 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ed26790-e827-4c04-a6c1-3a6befe3f875-metrics-certs\") pod \"controller-f8648f98b-65sww\" (UID: \"7ed26790-e827-4c04-a6c1-3a6befe3f875\") " pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.606001 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnx9t"] Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.783969 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.985500 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerStarted","Data":"a3f8e2415e6b9093e11482563fbcc97244f630fa3637652575954bf867e8c3a4"} Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.986652 4756 generic.go:334] "Generic (PLEG): container finished" podID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerID="645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00" exitCode=0 Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.986694 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnx9t" event={"ID":"5fb359a8-6d4a-4b15-bcd0-50e857d681f1","Type":"ContainerDied","Data":"645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00"} Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.986710 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnx9t" event={"ID":"5fb359a8-6d4a-4b15-bcd0-50e857d681f1","Type":"ContainerStarted","Data":"bce0fa9e28898e679b8673aad2aa3593a553119000b651090b2ba35e3ea20262"} Dec 05 16:30:29 crc kubenswrapper[4756]: I1205 16:30:29.988851 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" event={"ID":"c64b6bd4-0f30-4b95-8bac-8c302e466540","Type":"ContainerStarted","Data":"bcc078aed28055256bcdec07f8e7f9352fc6a817bb659ccc183355f25dd92e3b"} Dec 05 16:30:30 crc kubenswrapper[4756]: I1205 16:30:30.010476 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-65sww"] Dec 05 16:30:30 crc kubenswrapper[4756]: W1205 16:30:30.014088 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed26790_e827_4c04_a6c1_3a6befe3f875.slice/crio-21c2b0297536c3ccca35bd7b98b1dbde227e098e3ced2a8378526f3b2f62a1eb WatchSource:0}: Error finding container 21c2b0297536c3ccca35bd7b98b1dbde227e098e3ced2a8378526f3b2f62a1eb: Status 404 returned error can't find the container with id 21c2b0297536c3ccca35bd7b98b1dbde227e098e3ced2a8378526f3b2f62a1eb Dec 05 16:30:30 crc kubenswrapper[4756]: I1205 16:30:30.473088 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:30 crc kubenswrapper[4756]: I1205 16:30:30.477987 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9a2f231e-11fb-4b75-ba6b-35f59554bcbe-memberlist\") pod \"speaker-mwwdp\" (UID: \"9a2f231e-11fb-4b75-ba6b-35f59554bcbe\") " pod="metallb-system/speaker-mwwdp" Dec 05 16:30:30 crc kubenswrapper[4756]: I1205 16:30:30.615823 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mwwdp" Dec 05 16:30:30 crc kubenswrapper[4756]: W1205 16:30:30.673554 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a2f231e_11fb_4b75_ba6b_35f59554bcbe.slice/crio-087f7a2354727c7bf4441db4824372a2faeb4928cbfa184d2e5a4c9ef343235f WatchSource:0}: Error finding container 087f7a2354727c7bf4441db4824372a2faeb4928cbfa184d2e5a4c9ef343235f: Status 404 returned error can't find the container with id 087f7a2354727c7bf4441db4824372a2faeb4928cbfa184d2e5a4c9ef343235f Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.011743 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mwwdp" event={"ID":"9a2f231e-11fb-4b75-ba6b-35f59554bcbe","Type":"ContainerStarted","Data":"087f7a2354727c7bf4441db4824372a2faeb4928cbfa184d2e5a4c9ef343235f"} Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.021793 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-65sww" event={"ID":"7ed26790-e827-4c04-a6c1-3a6befe3f875","Type":"ContainerStarted","Data":"e534a0538d925d2980709e76e2d8ae7cbcd7e596589adb3f5ca60bc4eaa0d0c0"} Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.021850 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-65sww" event={"ID":"7ed26790-e827-4c04-a6c1-3a6befe3f875","Type":"ContainerStarted","Data":"8ac4af90bc3fb5f3dd35a9e379627d49b41013e1545ae4cdc7f2a82a3d48e845"} Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.021861 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-65sww" event={"ID":"7ed26790-e827-4c04-a6c1-3a6befe3f875","Type":"ContainerStarted","Data":"21c2b0297536c3ccca35bd7b98b1dbde227e098e3ced2a8378526f3b2f62a1eb"} Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.022192 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-65sww" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.031006 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnx9t" event={"ID":"5fb359a8-6d4a-4b15-bcd0-50e857d681f1","Type":"ContainerStarted","Data":"49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320"} Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.048429 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-65sww" podStartSLOduration=3.048404073 podStartE2EDuration="3.048404073s" podCreationTimestamp="2025-12-05 16:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:30:31.045012352 +0000 UTC m=+891.283685884" watchObservedRunningTime="2025-12-05 16:30:31.048404073 +0000 UTC m=+891.287077585" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.650978 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-phkpd"] Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.652131 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.667372 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phkpd"] Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.694920 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-catalog-content\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.695002 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-utilities\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.695026 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnmll\" (UniqueName: \"kubernetes.io/projected/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-kube-api-access-rnmll\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.796342 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-catalog-content\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.796822 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-catalog-content\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.797751 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-utilities\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.797521 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-utilities\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.797812 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnmll\" (UniqueName: \"kubernetes.io/projected/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-kube-api-access-rnmll\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.818156 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rnmll\" (UniqueName: \"kubernetes.io/projected/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-kube-api-access-rnmll\") pod \"community-operators-phkpd\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:31 crc kubenswrapper[4756]: I1205 16:30:31.973242 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:32 crc kubenswrapper[4756]: I1205 16:30:32.067007 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mwwdp" event={"ID":"9a2f231e-11fb-4b75-ba6b-35f59554bcbe","Type":"ContainerStarted","Data":"1b2fb9c0b3e81720634e71cd16b3f561ee5a3c27e81910cbd4d4f3d82cdd72f0"} Dec 05 16:30:32 crc kubenswrapper[4756]: I1205 16:30:32.067057 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mwwdp" event={"ID":"9a2f231e-11fb-4b75-ba6b-35f59554bcbe","Type":"ContainerStarted","Data":"b752b8e79ee07368a2432acc9362def0857631c5a8dc972ef6008159e90b51eb"} Dec 05 16:30:32 crc kubenswrapper[4756]: I1205 16:30:32.068046 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mwwdp" Dec 05 16:30:32 crc kubenswrapper[4756]: I1205 16:30:32.073519 4756 generic.go:334] "Generic (PLEG): container finished" podID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerID="49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320" exitCode=0 Dec 05 16:30:32 crc kubenswrapper[4756]: I1205 16:30:32.074131 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnx9t" event={"ID":"5fb359a8-6d4a-4b15-bcd0-50e857d681f1","Type":"ContainerDied","Data":"49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320"} Dec 05 16:30:32 crc kubenswrapper[4756]: I1205 16:30:32.103600 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mwwdp" podStartSLOduration=4.103586262 podStartE2EDuration="4.103586262s" podCreationTimestamp="2025-12-05 16:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:30:32.100307404 +0000 UTC m=+892.338980916" watchObservedRunningTime="2025-12-05 16:30:32.103586262 +0000 UTC m=+892.342259774" Dec 05 16:30:32 crc kubenswrapper[4756]: I1205 16:30:32.400883 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phkpd"] Dec 05 16:30:32 crc kubenswrapper[4756]: W1205 16:30:32.411150 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d94dfbd_e25a_4edf_8fd6_9e3121623da7.slice/crio-29e80ada54aaced00f298196695df14a6becba3dd190047c7d3db90f81fd16b6 WatchSource:0}: Error finding container 29e80ada54aaced00f298196695df14a6becba3dd190047c7d3db90f81fd16b6: Status 404 returned error can't find the container with id 29e80ada54aaced00f298196695df14a6becba3dd190047c7d3db90f81fd16b6 Dec 05 16:30:33 crc kubenswrapper[4756]: I1205 16:30:33.080962 4756 generic.go:334] "Generic (PLEG): container finished" podID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerID="62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a" exitCode=0 Dec 05 16:30:33 crc kubenswrapper[4756]: I1205 16:30:33.081037 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phkpd" 
event={"ID":"4d94dfbd-e25a-4edf-8fd6-9e3121623da7","Type":"ContainerDied","Data":"62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a"} Dec 05 16:30:33 crc kubenswrapper[4756]: I1205 16:30:33.081066 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phkpd" event={"ID":"4d94dfbd-e25a-4edf-8fd6-9e3121623da7","Type":"ContainerStarted","Data":"29e80ada54aaced00f298196695df14a6becba3dd190047c7d3db90f81fd16b6"} Dec 05 16:30:33 crc kubenswrapper[4756]: I1205 16:30:33.105816 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnx9t" event={"ID":"5fb359a8-6d4a-4b15-bcd0-50e857d681f1","Type":"ContainerStarted","Data":"fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70"} Dec 05 16:30:33 crc kubenswrapper[4756]: I1205 16:30:33.130864 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pnx9t" podStartSLOduration=1.5478049249999999 podStartE2EDuration="4.130841239s" podCreationTimestamp="2025-12-05 16:30:29 +0000 UTC" firstStartedPulling="2025-12-05 16:30:29.987956473 +0000 UTC m=+890.226629995" lastFinishedPulling="2025-12-05 16:30:32.570992797 +0000 UTC m=+892.809666309" observedRunningTime="2025-12-05 16:30:33.12344014 +0000 UTC m=+893.362113652" watchObservedRunningTime="2025-12-05 16:30:33.130841239 +0000 UTC m=+893.369514751" Dec 05 16:30:34 crc kubenswrapper[4756]: I1205 16:30:34.112267 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phkpd" event={"ID":"4d94dfbd-e25a-4edf-8fd6-9e3121623da7","Type":"ContainerStarted","Data":"6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de"} Dec 05 16:30:35 crc kubenswrapper[4756]: I1205 16:30:35.119174 4756 generic.go:334] "Generic (PLEG): container finished" podID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerID="6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de" exitCode=0 Dec 05 16:30:35 crc kubenswrapper[4756]: I1205 16:30:35.119221 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phkpd" event={"ID":"4d94dfbd-e25a-4edf-8fd6-9e3121623da7","Type":"ContainerDied","Data":"6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de"} Dec 05 16:30:38 crc kubenswrapper[4756]: I1205 16:30:38.138309 4756 generic.go:334] "Generic (PLEG): container finished" podID="03cce283-77b4-4dbb-995a-a28cc39ae376" containerID="695d6b3f1ec131138196d3ba5509766faf875eea8bc2b2f6443d55c05d920e38" exitCode=0 Dec 05 16:30:38 crc kubenswrapper[4756]: I1205 16:30:38.138357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerDied","Data":"695d6b3f1ec131138196d3ba5509766faf875eea8bc2b2f6443d55c05d920e38"} Dec 05 16:30:38 crc kubenswrapper[4756]: I1205 16:30:38.140417 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" event={"ID":"c64b6bd4-0f30-4b95-8bac-8c302e466540","Type":"ContainerStarted","Data":"f7c862e1e6e2b5396417461868e906879d86a0ba6deccce93162caee1e662430"} Dec 05 16:30:38 crc kubenswrapper[4756]: I1205 16:30:38.141147 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" Dec 05 16:30:38 crc kubenswrapper[4756]: I1205 16:30:38.143232 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-phkpd" event={"ID":"4d94dfbd-e25a-4edf-8fd6-9e3121623da7","Type":"ContainerStarted","Data":"d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43"} Dec 05 16:30:38 crc kubenswrapper[4756]: I1205 16:30:38.186814 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" podStartSLOduration=2.364825863 podStartE2EDuration="10.186789392s" podCreationTimestamp="2025-12-05 16:30:28 +0000 UTC" firstStartedPulling="2025-12-05 16:30:29.411795903 +0000 UTC m=+889.650469415" lastFinishedPulling="2025-12-05 16:30:37.233759422 +0000 UTC m=+897.472432944" observedRunningTime="2025-12-05 16:30:38.18335214 +0000 UTC m=+898.422025652" watchObservedRunningTime="2025-12-05 16:30:38.186789392 +0000 UTC m=+898.425462924" Dec 05 16:30:38 crc kubenswrapper[4756]: I1205 16:30:38.202304 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-phkpd" podStartSLOduration=2.778097223 podStartE2EDuration="7.202284859s" podCreationTimestamp="2025-12-05 16:30:31 +0000 UTC" firstStartedPulling="2025-12-05 16:30:33.086702202 +0000 UTC m=+893.325375704" lastFinishedPulling="2025-12-05 16:30:37.510889828 +0000 UTC m=+897.749563340" observedRunningTime="2025-12-05 16:30:38.201316973 +0000 UTC m=+898.439990495" watchObservedRunningTime="2025-12-05 16:30:38.202284859 +0000 UTC m=+898.440958391" Dec 05 16:30:39 crc kubenswrapper[4756]: I1205 16:30:39.150861 4756 generic.go:334] "Generic (PLEG): container finished" podID="03cce283-77b4-4dbb-995a-a28cc39ae376" containerID="9922cbb021ecc79b9fbe90189d2599a4513ff67f3efdab9cc04890ef904780b3" exitCode=0 Dec 05 16:30:39 crc kubenswrapper[4756]: I1205 16:30:39.150996 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerDied","Data":"9922cbb021ecc79b9fbe90189d2599a4513ff67f3efdab9cc04890ef904780b3"} Dec 05 16:30:39 crc kubenswrapper[4756]: I1205 16:30:39.418935 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:39 crc kubenswrapper[4756]: I1205 16:30:39.419003 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:39 crc kubenswrapper[4756]: I1205 16:30:39.460017 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:40 crc kubenswrapper[4756]: I1205 16:30:40.158297 4756 generic.go:334] "Generic (PLEG): container finished" podID="03cce283-77b4-4dbb-995a-a28cc39ae376" containerID="b145d97d0e88f084948ded48b936e133101cb2a4dffdc977f8493ee23bce45fc" exitCode=0 Dec 05 16:30:40 crc kubenswrapper[4756]: I1205 16:30:40.158339 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerDied","Data":"b145d97d0e88f084948ded48b936e133101cb2a4dffdc977f8493ee23bce45fc"} Dec 05 16:30:40 crc kubenswrapper[4756]: I1205 16:30:40.211517 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:40 crc kubenswrapper[4756]: I1205 16:30:40.640577 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnx9t"] Dec 05 16:30:41 crc 
Dec 05 16:30:41 crc kubenswrapper[4756]: I1205 16:30:41.974126 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-phkpd"
Dec 05 16:30:41 crc kubenswrapper[4756]: I1205 16:30:41.974167 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-phkpd"
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.036131 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-phkpd"
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.173676 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pnx9t" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerName="registry-server" containerID="cri-o://fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70" gracePeriod=2
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.174809 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerStarted","Data":"fde2e2ad6abfdbe02a2b635cb0c4432c14dd7887fac445069ebdae8df73e47fd"}
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.174846 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerStarted","Data":"18f15da8df5bfe55cc42ae103e1348f62997d7ed610b6d49ad4fbcc0e79c51b4"}
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.174856 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerStarted","Data":"84d699dac9e8d33ee801711fca9b3144c45f9b1d713f0401d2c7d704e3cec320"}
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.174867 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-nnqf2"
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.174876 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerStarted","Data":"a59efd25a43682d696d4c5b667389a3d8b65ea9f724f2eab8910ac019a05d307"}
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.174886 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nnqf2" event={"ID":"03cce283-77b4-4dbb-995a-a28cc39ae376","Type":"ContainerStarted","Data":"48679e4d188dd306f7e7987d5228e12a64e1355f746ed33362953b0735bb618c"}
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.206922 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-nnqf2" podStartSLOduration=6.609377577 podStartE2EDuration="14.206904588s" podCreationTimestamp="2025-12-05 16:30:28 +0000 UTC" firstStartedPulling="2025-12-05 16:30:29.655057898 +0000 UTC m=+889.893731410" lastFinishedPulling="2025-12-05 16:30:37.252584909 +0000 UTC m=+897.491258421" observedRunningTime="2025-12-05 16:30:42.202638583 +0000 UTC m=+902.441312115" watchObservedRunningTime="2025-12-05 16:30:42.206904588 +0000 UTC m=+902.445578120"
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.219719 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-phkpd"
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.527952 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pnx9t"
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.557484 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-catalog-content\") pod \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") "
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.557525 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dz9r\" (UniqueName: \"kubernetes.io/projected/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-kube-api-access-4dz9r\") pod \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") "
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.557605 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-utilities\") pod \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\" (UID: \"5fb359a8-6d4a-4b15-bcd0-50e857d681f1\") "
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.558652 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-utilities" (OuterVolumeSpecName: "utilities") pod "5fb359a8-6d4a-4b15-bcd0-50e857d681f1" (UID: "5fb359a8-6d4a-4b15-bcd0-50e857d681f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.565121 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-kube-api-access-4dz9r" (OuterVolumeSpecName: "kube-api-access-4dz9r") pod "5fb359a8-6d4a-4b15-bcd0-50e857d681f1" (UID: "5fb359a8-6d4a-4b15-bcd0-50e857d681f1"). InnerVolumeSpecName "kube-api-access-4dz9r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.580735 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fb359a8-6d4a-4b15-bcd0-50e857d681f1" (UID: "5fb359a8-6d4a-4b15-bcd0-50e857d681f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.658409 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.658452 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:42 crc kubenswrapper[4756]: I1205 16:30:42.658462 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dz9r\" (UniqueName: \"kubernetes.io/projected/5fb359a8-6d4a-4b15-bcd0-50e857d681f1-kube-api-access-4dz9r\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.182299 4756 generic.go:334] "Generic (PLEG): container finished" podID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerID="fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70" exitCode=0 Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.182366 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnx9t" event={"ID":"5fb359a8-6d4a-4b15-bcd0-50e857d681f1","Type":"ContainerDied","Data":"fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70"} Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.182391 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pnx9t" Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.182791 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pnx9t" event={"ID":"5fb359a8-6d4a-4b15-bcd0-50e857d681f1","Type":"ContainerDied","Data":"bce0fa9e28898e679b8673aad2aa3593a553119000b651090b2ba35e3ea20262"} Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.182874 4756 scope.go:117] "RemoveContainer" containerID="fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70" Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.206583 4756 scope.go:117] "RemoveContainer" containerID="49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320" Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.217409 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnx9t"] Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.221448 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pnx9t"] Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.242649 4756 scope.go:117] "RemoveContainer" containerID="645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00" Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.256880 4756 scope.go:117] "RemoveContainer" containerID="fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70" Dec 05 16:30:43 crc kubenswrapper[4756]: E1205 16:30:43.257722 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70\": container with ID starting with fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70 not found: ID does not exist" containerID="fe263e1e5fe76a0764bad55b68f4e660b4237b909666a107de6d967aa9f6fb70" Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.257771 4756 
Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.257806 4756 scope.go:117] "RemoveContainer" containerID="49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320"
Dec 05 16:30:43 crc kubenswrapper[4756]: E1205 16:30:43.258101 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320\": container with ID starting with 49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320 not found: ID does not exist" containerID="49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320"
Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.258135 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320"} err="failed to get container status \"49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320\": rpc error: code = NotFound desc = could not find container \"49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320\": container with ID starting with 49ab1c9361450d863d2cb40fd9458e5731e044abaf8b1980c6232d795df39320 not found: ID does not exist"
Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.258157 4756 scope.go:117] "RemoveContainer" containerID="645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00"
Dec 05 16:30:43 crc kubenswrapper[4756]: E1205 16:30:43.258455 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00\": container with ID starting with 645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00 not found: ID does not exist" containerID="645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00"
Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.258481 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00"} err="failed to get container status \"645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00\": rpc error: code = NotFound desc = could not find container \"645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00\": container with ID starting with 645b79e38560e7c047922ff773b309e7d16315ea17f87c8cc3462560635bce00 not found: ID does not exist"
Dec 05 16:30:43 crc kubenswrapper[4756]: I1205 16:30:43.438028 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phkpd"]
Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.058512 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" path="/var/lib/kubelet/pods/5fb359a8-6d4a-4b15-bcd0-50e857d681f1/volumes"
Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.192535 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-phkpd" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="registry-server" containerID="cri-o://d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43" gracePeriod=2
period" pod="openshift-marketplace/community-operators-phkpd" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="registry-server" containerID="cri-o://d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43" gracePeriod=2 Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.521192 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.573351 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.578023 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.616919 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-catalog-content\") pod \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.616991 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-utilities\") pod \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.617057 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnmll\" (UniqueName: \"kubernetes.io/projected/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-kube-api-access-rnmll\") pod \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\" (UID: \"4d94dfbd-e25a-4edf-8fd6-9e3121623da7\") " Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.618838 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-utilities" (OuterVolumeSpecName: "utilities") pod "4d94dfbd-e25a-4edf-8fd6-9e3121623da7" (UID: "4d94dfbd-e25a-4edf-8fd6-9e3121623da7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.622378 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-kube-api-access-rnmll" (OuterVolumeSpecName: "kube-api-access-rnmll") pod "4d94dfbd-e25a-4edf-8fd6-9e3121623da7" (UID: "4d94dfbd-e25a-4edf-8fd6-9e3121623da7"). InnerVolumeSpecName "kube-api-access-rnmll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.718876 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnmll\" (UniqueName: \"kubernetes.io/projected/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-kube-api-access-rnmll\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:44 crc kubenswrapper[4756]: I1205 16:30:44.719215 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.200385 4756 generic.go:334] "Generic (PLEG): container finished" podID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerID="d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43" exitCode=0 Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.200502 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phkpd" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.200554 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phkpd" event={"ID":"4d94dfbd-e25a-4edf-8fd6-9e3121623da7","Type":"ContainerDied","Data":"d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43"} Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.200587 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phkpd" event={"ID":"4d94dfbd-e25a-4edf-8fd6-9e3121623da7","Type":"ContainerDied","Data":"29e80ada54aaced00f298196695df14a6becba3dd190047c7d3db90f81fd16b6"} Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.200609 4756 scope.go:117] "RemoveContainer" containerID="d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.214059 4756 scope.go:117] "RemoveContainer" containerID="6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.243314 4756 scope.go:117] "RemoveContainer" containerID="62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.258636 4756 scope.go:117] "RemoveContainer" containerID="d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43" Dec 05 16:30:45 crc kubenswrapper[4756]: E1205 16:30:45.259096 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43\": container with ID starting with d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43 not found: ID does not exist" containerID="d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.259144 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43"} err="failed to get container status \"d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43\": rpc error: code = NotFound desc = could not find container \"d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43\": container with ID starting with d556f72326aba78b8e27d596f137771d9a5708d9f70604e6e2af8589aa830a43 not found: ID does not exist" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.259168 4756 scope.go:117] 
"RemoveContainer" containerID="6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de" Dec 05 16:30:45 crc kubenswrapper[4756]: E1205 16:30:45.259662 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de\": container with ID starting with 6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de not found: ID does not exist" containerID="6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.259681 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de"} err="failed to get container status \"6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de\": rpc error: code = NotFound desc = could not find container \"6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de\": container with ID starting with 6c9821cd44fcb41c024c0a9e54b9c7ff4e3f4f47d86c2125f387700cc040a6de not found: ID does not exist" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.259695 4756 scope.go:117] "RemoveContainer" containerID="62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a" Dec 05 16:30:45 crc kubenswrapper[4756]: E1205 16:30:45.260014 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a\": container with ID starting with 62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a not found: ID does not exist" containerID="62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a" Dec 05 16:30:45 crc kubenswrapper[4756]: I1205 16:30:45.260053 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a"} err="failed to get container status \"62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a\": rpc error: code = NotFound desc = could not find container \"62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a\": container with ID starting with 62aad97d540b5eb5170a6feaaae5a6739b715f16910b97591ff427c44c9eaf0a not found: ID does not exist" Dec 05 16:30:46 crc kubenswrapper[4756]: I1205 16:30:46.553741 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d94dfbd-e25a-4edf-8fd6-9e3121623da7" (UID: "4d94dfbd-e25a-4edf-8fd6-9e3121623da7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:30:46 crc kubenswrapper[4756]: I1205 16:30:46.648270 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d94dfbd-e25a-4edf-8fd6-9e3121623da7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:46 crc kubenswrapper[4756]: I1205 16:30:46.727395 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phkpd"] Dec 05 16:30:46 crc kubenswrapper[4756]: I1205 16:30:46.731300 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-phkpd"] Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842337 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6cz4x"] Dec 05 16:30:47 crc kubenswrapper[4756]: E1205 16:30:47.842604 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerName="extract-content" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842616 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerName="extract-content" Dec 05 16:30:47 crc kubenswrapper[4756]: E1205 16:30:47.842626 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="registry-server" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842632 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="registry-server" Dec 05 16:30:47 crc kubenswrapper[4756]: E1205 16:30:47.842643 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="extract-content" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842649 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="extract-content" Dec 05 16:30:47 crc kubenswrapper[4756]: E1205 16:30:47.842658 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerName="registry-server" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842664 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerName="registry-server" Dec 05 16:30:47 crc kubenswrapper[4756]: E1205 16:30:47.842673 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="extract-utilities" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842679 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="extract-utilities" Dec 05 16:30:47 crc kubenswrapper[4756]: E1205 16:30:47.842693 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerName="extract-utilities" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842699 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerName="extract-utilities" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842796 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb359a8-6d4a-4b15-bcd0-50e857d681f1" containerName="registry-server" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.842811 4756 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" containerName="registry-server" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.843574 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.854524 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6cz4x"] Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.982528 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-utilities\") pod \"certified-operators-6cz4x\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.982603 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-catalog-content\") pod \"certified-operators-6cz4x\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:47 crc kubenswrapper[4756]: I1205 16:30:47.982630 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68t6p\" (UniqueName: \"kubernetes.io/projected/1b45737c-141c-4bc1-98d3-efd41cca10d2-kube-api-access-68t6p\") pod \"certified-operators-6cz4x\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.056207 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d94dfbd-e25a-4edf-8fd6-9e3121623da7" path="/var/lib/kubelet/pods/4d94dfbd-e25a-4edf-8fd6-9e3121623da7/volumes" Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.083947 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-catalog-content\") pod \"certified-operators-6cz4x\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.083998 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68t6p\" (UniqueName: \"kubernetes.io/projected/1b45737c-141c-4bc1-98d3-efd41cca10d2-kube-api-access-68t6p\") pod \"certified-operators-6cz4x\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.084052 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-utilities\") pod \"certified-operators-6cz4x\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.084407 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-catalog-content\") pod \"certified-operators-6cz4x\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:48 crc 
Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.101964 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68t6p\" (UniqueName: \"kubernetes.io/projected/1b45737c-141c-4bc1-98d3-efd41cca10d2-kube-api-access-68t6p\") pod \"certified-operators-6cz4x\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " pod="openshift-marketplace/certified-operators-6cz4x"
Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.198586 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cz4x"
Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.662140 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6cz4x"]
Dec 05 16:30:48 crc kubenswrapper[4756]: W1205 16:30:48.669667 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b45737c_141c_4bc1_98d3_efd41cca10d2.slice/crio-db666a61bbd26cc0d3af0d67fab1b2a6220008d300cb6d92521ee2d37b41af4d WatchSource:0}: Error finding container db666a61bbd26cc0d3af0d67fab1b2a6220008d300cb6d92521ee2d37b41af4d: Status 404 returned error can't find the container with id db666a61bbd26cc0d3af0d67fab1b2a6220008d300cb6d92521ee2d37b41af4d
Dec 05 16:30:48 crc kubenswrapper[4756]: I1205 16:30:48.955894 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8"
Dec 05 16:30:49 crc kubenswrapper[4756]: I1205 16:30:49.245909 4756 generic.go:334] "Generic (PLEG): container finished" podID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerID="16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce" exitCode=0
Dec 05 16:30:49 crc kubenswrapper[4756]: I1205 16:30:49.246006 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cz4x" event={"ID":"1b45737c-141c-4bc1-98d3-efd41cca10d2","Type":"ContainerDied","Data":"16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce"}
Dec 05 16:30:49 crc kubenswrapper[4756]: I1205 16:30:49.247543 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cz4x" event={"ID":"1b45737c-141c-4bc1-98d3-efd41cca10d2","Type":"ContainerStarted","Data":"db666a61bbd26cc0d3af0d67fab1b2a6220008d300cb6d92521ee2d37b41af4d"}
Dec 05 16:30:49 crc kubenswrapper[4756]: I1205 16:30:49.789138 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-65sww"
Dec 05 16:30:50 crc kubenswrapper[4756]: I1205 16:30:50.256612 4756 generic.go:334] "Generic (PLEG): container finished" podID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerID="9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4" exitCode=0
Dec 05 16:30:50 crc kubenswrapper[4756]: I1205 16:30:50.256683 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cz4x" event={"ID":"1b45737c-141c-4bc1-98d3-efd41cca10d2","Type":"ContainerDied","Data":"9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4"}
Dec 05 16:30:50 crc kubenswrapper[4756]: I1205 16:30:50.623075 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mwwdp"
Dec 05 16:30:51 crc kubenswrapper[4756]: I1205 16:30:51.266750 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cz4x" event={"ID":"1b45737c-141c-4bc1-98d3-efd41cca10d2","Type":"ContainerStarted","Data":"77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa"}
Dec 05 16:30:51 crc kubenswrapper[4756]: I1205 16:30:51.291655 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6cz4x" podStartSLOduration=2.824921301 podStartE2EDuration="4.291637591s" podCreationTimestamp="2025-12-05 16:30:47 +0000 UTC" firstStartedPulling="2025-12-05 16:30:49.247768364 +0000 UTC m=+909.486441876" lastFinishedPulling="2025-12-05 16:30:50.714484654 +0000 UTC m=+910.953158166" observedRunningTime="2025-12-05 16:30:51.287081548 +0000 UTC m=+911.525755070" watchObservedRunningTime="2025-12-05 16:30:51.291637591 +0000 UTC m=+911.530311103"
Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.288569 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67"]
Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.289759 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67"
Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.291704 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.298875 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67"]
Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.450918 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7pxx\" (UniqueName: \"kubernetes.io/projected/31e5042c-37e3-444e-adfb-3c7f50096886-kube-api-access-r7pxx\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67"
Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.451012 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67"
Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.451055 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67"
Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.552912 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7pxx\" (UniqueName: \"kubernetes.io/projected/31e5042c-37e3-444e-adfb-3c7f50096886-kube-api-access-r7pxx\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67"
(UniqueName: \"kubernetes.io/projected/31e5042c-37e3-444e-adfb-3c7f50096886-kube-api-access-r7pxx\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.552984 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.553006 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.553463 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.553553 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.571829 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7pxx\" (UniqueName: \"kubernetes.io/projected/31e5042c-37e3-444e-adfb-3c7f50096886-kube-api-access-r7pxx\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.606870 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:30:52 crc kubenswrapper[4756]: I1205 16:30:52.880392 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67"] Dec 05 16:30:52 crc kubenswrapper[4756]: W1205 16:30:52.886958 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31e5042c_37e3_444e_adfb_3c7f50096886.slice/crio-7e0f749069b7dbd69c8d1b263ccfdb46ae5d035b18e51af3849b236df7c02ec7 WatchSource:0}: Error finding container 7e0f749069b7dbd69c8d1b263ccfdb46ae5d035b18e51af3849b236df7c02ec7: Status 404 returned error can't find the container with id 7e0f749069b7dbd69c8d1b263ccfdb46ae5d035b18e51af3849b236df7c02ec7 Dec 05 16:30:53 crc kubenswrapper[4756]: I1205 16:30:53.279300 4756 generic.go:334] "Generic (PLEG): container finished" podID="31e5042c-37e3-444e-adfb-3c7f50096886" containerID="fea01f8066df154da7e5e37a7a52f9c0fb14428db7b7e230ca488fc3ccab2dc9" exitCode=0 Dec 05 16:30:53 crc kubenswrapper[4756]: I1205 16:30:53.279637 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" event={"ID":"31e5042c-37e3-444e-adfb-3c7f50096886","Type":"ContainerDied","Data":"fea01f8066df154da7e5e37a7a52f9c0fb14428db7b7e230ca488fc3ccab2dc9"} Dec 05 16:30:53 crc kubenswrapper[4756]: I1205 16:30:53.279671 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" event={"ID":"31e5042c-37e3-444e-adfb-3c7f50096886","Type":"ContainerStarted","Data":"7e0f749069b7dbd69c8d1b263ccfdb46ae5d035b18e51af3849b236df7c02ec7"} Dec 05 16:30:57 crc kubenswrapper[4756]: I1205 16:30:57.311680 4756 generic.go:334] "Generic (PLEG): container finished" podID="31e5042c-37e3-444e-adfb-3c7f50096886" containerID="07023d8108b876dfe5fa2d8351cdd6425d0d7d795d4fa83a549cff46dd5c8c7c" exitCode=0 Dec 05 16:30:57 crc kubenswrapper[4756]: I1205 16:30:57.312294 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" event={"ID":"31e5042c-37e3-444e-adfb-3c7f50096886","Type":"ContainerDied","Data":"07023d8108b876dfe5fa2d8351cdd6425d0d7d795d4fa83a549cff46dd5c8c7c"} Dec 05 16:30:58 crc kubenswrapper[4756]: I1205 16:30:58.198761 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:58 crc kubenswrapper[4756]: I1205 16:30:58.198825 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:58 crc kubenswrapper[4756]: I1205 16:30:58.241605 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:58 crc kubenswrapper[4756]: I1205 16:30:58.320835 4756 generic.go:334] "Generic (PLEG): container finished" podID="31e5042c-37e3-444e-adfb-3c7f50096886" containerID="3a85a3ca76e0b44092d617bb0b341f75e1227dde7492049d30b01b6bfe9c224c" exitCode=0 Dec 05 16:30:58 crc kubenswrapper[4756]: I1205 16:30:58.320907 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" 
event={"ID":"31e5042c-37e3-444e-adfb-3c7f50096886","Type":"ContainerDied","Data":"3a85a3ca76e0b44092d617bb0b341f75e1227dde7492049d30b01b6bfe9c224c"} Dec 05 16:30:58 crc kubenswrapper[4756]: I1205 16:30:58.364716 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.526109 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-nnqf2" Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.597600 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.766595 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7pxx\" (UniqueName: \"kubernetes.io/projected/31e5042c-37e3-444e-adfb-3c7f50096886-kube-api-access-r7pxx\") pod \"31e5042c-37e3-444e-adfb-3c7f50096886\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.766680 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-bundle\") pod \"31e5042c-37e3-444e-adfb-3c7f50096886\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.766721 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-util\") pod \"31e5042c-37e3-444e-adfb-3c7f50096886\" (UID: \"31e5042c-37e3-444e-adfb-3c7f50096886\") " Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.768282 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-bundle" (OuterVolumeSpecName: "bundle") pod "31e5042c-37e3-444e-adfb-3c7f50096886" (UID: "31e5042c-37e3-444e-adfb-3c7f50096886"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.775618 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e5042c-37e3-444e-adfb-3c7f50096886-kube-api-access-r7pxx" (OuterVolumeSpecName: "kube-api-access-r7pxx") pod "31e5042c-37e3-444e-adfb-3c7f50096886" (UID: "31e5042c-37e3-444e-adfb-3c7f50096886"). InnerVolumeSpecName "kube-api-access-r7pxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.776133 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-util" (OuterVolumeSpecName: "util") pod "31e5042c-37e3-444e-adfb-3c7f50096886" (UID: "31e5042c-37e3-444e-adfb-3c7f50096886"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.876845 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7pxx\" (UniqueName: \"kubernetes.io/projected/31e5042c-37e3-444e-adfb-3c7f50096886-kube-api-access-r7pxx\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.876889 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:30:59 crc kubenswrapper[4756]: I1205 16:30:59.876905 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31e5042c-37e3-444e-adfb-3c7f50096886-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:00 crc kubenswrapper[4756]: I1205 16:31:00.334654 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" event={"ID":"31e5042c-37e3-444e-adfb-3c7f50096886","Type":"ContainerDied","Data":"7e0f749069b7dbd69c8d1b263ccfdb46ae5d035b18e51af3849b236df7c02ec7"} Dec 05 16:31:00 crc kubenswrapper[4756]: I1205 16:31:00.334964 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e0f749069b7dbd69c8d1b263ccfdb46ae5d035b18e51af3849b236df7c02ec7" Dec 05 16:31:00 crc kubenswrapper[4756]: I1205 16:31:00.334732 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67" Dec 05 16:31:01 crc kubenswrapper[4756]: I1205 16:31:01.842387 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6cz4x"] Dec 05 16:31:01 crc kubenswrapper[4756]: I1205 16:31:01.843852 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6cz4x" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerName="registry-server" containerID="cri-o://77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa" gracePeriod=2 Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.365021 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.365130 4756 generic.go:334] "Generic (PLEG): container finished" podID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerID="77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa" exitCode=0 Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.365150 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cz4x" event={"ID":"1b45737c-141c-4bc1-98d3-efd41cca10d2","Type":"ContainerDied","Data":"77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa"} Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.365378 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cz4x" event={"ID":"1b45737c-141c-4bc1-98d3-efd41cca10d2","Type":"ContainerDied","Data":"db666a61bbd26cc0d3af0d67fab1b2a6220008d300cb6d92521ee2d37b41af4d"} Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.365428 4756 scope.go:117] "RemoveContainer" containerID="77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.387223 4756 scope.go:117] "RemoveContainer" containerID="9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.420848 4756 scope.go:117] "RemoveContainer" containerID="16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.435691 4756 scope.go:117] "RemoveContainer" containerID="77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa" Dec 05 16:31:03 crc kubenswrapper[4756]: E1205 16:31:03.436495 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa\": container with ID starting with 77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa not found: ID does not exist" containerID="77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.436638 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa"} err="failed to get container status \"77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa\": rpc error: code = NotFound desc = could not find container \"77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa\": container with ID starting with 77a39955f037695ce86f1065acff84723b5baff9173e6abf1fb1b64f1ecf89aa not found: ID does not exist" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.437552 4756 scope.go:117] "RemoveContainer" containerID="9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4" Dec 05 16:31:03 crc kubenswrapper[4756]: E1205 16:31:03.438034 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4\": container with ID starting with 9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4 not found: ID does not exist" containerID="9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.438080 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4"} err="failed to get container status \"9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4\": rpc error: code = NotFound desc = could not find container \"9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4\": container with ID starting with 9fc09c0b09b07c80bdebdb660fbd86211673e234def7419cf1bd70bb0a3fb2f4 not found: ID does not exist" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.438106 4756 scope.go:117] "RemoveContainer" containerID="16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce" Dec 05 16:31:03 crc kubenswrapper[4756]: E1205 16:31:03.438339 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce\": container with ID starting with 16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce not found: ID does not exist" containerID="16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.438367 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce"} err="failed to get container status \"16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce\": rpc error: code = NotFound desc = could not find container \"16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce\": container with ID starting with 16542062a6640326e0ff425e8d6e62e92ad1ce0ff36ebb96db366a56479673ce not found: ID does not exist" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.527524 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-utilities\") pod \"1b45737c-141c-4bc1-98d3-efd41cca10d2\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.527582 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-catalog-content\") pod \"1b45737c-141c-4bc1-98d3-efd41cca10d2\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.527618 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68t6p\" (UniqueName: \"kubernetes.io/projected/1b45737c-141c-4bc1-98d3-efd41cca10d2-kube-api-access-68t6p\") pod \"1b45737c-141c-4bc1-98d3-efd41cca10d2\" (UID: \"1b45737c-141c-4bc1-98d3-efd41cca10d2\") " Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.528655 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-utilities" (OuterVolumeSpecName: "utilities") pod "1b45737c-141c-4bc1-98d3-efd41cca10d2" (UID: "1b45737c-141c-4bc1-98d3-efd41cca10d2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.542893 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b45737c-141c-4bc1-98d3-efd41cca10d2-kube-api-access-68t6p" (OuterVolumeSpecName: "kube-api-access-68t6p") pod "1b45737c-141c-4bc1-98d3-efd41cca10d2" (UID: "1b45737c-141c-4bc1-98d3-efd41cca10d2"). InnerVolumeSpecName "kube-api-access-68t6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.579242 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b45737c-141c-4bc1-98d3-efd41cca10d2" (UID: "1b45737c-141c-4bc1-98d3-efd41cca10d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.629125 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.629159 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b45737c-141c-4bc1-98d3-efd41cca10d2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:03 crc kubenswrapper[4756]: I1205 16:31:03.629171 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68t6p\" (UniqueName: \"kubernetes.io/projected/1b45737c-141c-4bc1-98d3-efd41cca10d2-kube-api-access-68t6p\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.383047 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6cz4x" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.401298 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6cz4x"] Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.405246 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6cz4x"] Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783162 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn"] Dec 05 16:31:04 crc kubenswrapper[4756]: E1205 16:31:04.783380 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerName="registry-server" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783392 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerName="registry-server" Dec 05 16:31:04 crc kubenswrapper[4756]: E1205 16:31:04.783404 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e5042c-37e3-444e-adfb-3c7f50096886" containerName="extract" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783409 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e5042c-37e3-444e-adfb-3c7f50096886" containerName="extract" Dec 05 16:31:04 crc kubenswrapper[4756]: E1205 16:31:04.783436 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerName="extract-content" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783442 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerName="extract-content" Dec 05 16:31:04 crc kubenswrapper[4756]: E1205 16:31:04.783452 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e5042c-37e3-444e-adfb-3c7f50096886" containerName="pull" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783458 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e5042c-37e3-444e-adfb-3c7f50096886" containerName="pull" Dec 05 16:31:04 crc kubenswrapper[4756]: E1205 16:31:04.783472 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e5042c-37e3-444e-adfb-3c7f50096886" containerName="util" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783478 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e5042c-37e3-444e-adfb-3c7f50096886" containerName="util" Dec 05 16:31:04 crc kubenswrapper[4756]: E1205 16:31:04.783488 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerName="extract-utilities" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783494 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerName="extract-utilities" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783594 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e5042c-37e3-444e-adfb-3c7f50096886" containerName="extract" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.783606 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" containerName="registry-server" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.784000 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.792664 4756 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-8mtq8" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.794016 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.794741 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.812201 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn"] Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.946810 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65061bd9-6735-41d7-9314-870bd47be8c7-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fdvqn\" (UID: \"65061bd9-6735-41d7-9314-870bd47be8c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" Dec 05 16:31:04 crc kubenswrapper[4756]: I1205 16:31:04.946905 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkhc\" (UniqueName: \"kubernetes.io/projected/65061bd9-6735-41d7-9314-870bd47be8c7-kube-api-access-mrkhc\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fdvqn\" (UID: \"65061bd9-6735-41d7-9314-870bd47be8c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" Dec 05 16:31:05 crc kubenswrapper[4756]: I1205 16:31:05.047991 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65061bd9-6735-41d7-9314-870bd47be8c7-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fdvqn\" (UID: \"65061bd9-6735-41d7-9314-870bd47be8c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" Dec 05 16:31:05 crc kubenswrapper[4756]: I1205 16:31:05.048070 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkhc\" (UniqueName: \"kubernetes.io/projected/65061bd9-6735-41d7-9314-870bd47be8c7-kube-api-access-mrkhc\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fdvqn\" (UID: \"65061bd9-6735-41d7-9314-870bd47be8c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" Dec 05 16:31:05 crc kubenswrapper[4756]: I1205 16:31:05.048548 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65061bd9-6735-41d7-9314-870bd47be8c7-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fdvqn\" (UID: \"65061bd9-6735-41d7-9314-870bd47be8c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" Dec 05 16:31:05 crc kubenswrapper[4756]: I1205 16:31:05.080449 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkhc\" (UniqueName: \"kubernetes.io/projected/65061bd9-6735-41d7-9314-870bd47be8c7-kube-api-access-mrkhc\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fdvqn\" (UID: \"65061bd9-6735-41d7-9314-870bd47be8c7\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" Dec 05 16:31:05 crc kubenswrapper[4756]: I1205 16:31:05.096703 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" Dec 05 16:31:05 crc kubenswrapper[4756]: I1205 16:31:05.631653 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn"] Dec 05 16:31:06 crc kubenswrapper[4756]: I1205 16:31:06.064091 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b45737c-141c-4bc1-98d3-efd41cca10d2" path="/var/lib/kubelet/pods/1b45737c-141c-4bc1-98d3-efd41cca10d2/volumes" Dec 05 16:31:06 crc kubenswrapper[4756]: I1205 16:31:06.397292 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" event={"ID":"65061bd9-6735-41d7-9314-870bd47be8c7","Type":"ContainerStarted","Data":"cbd13c62823d71cd732ef911c96218c20c40089fefa422507f8202bc3b9470b6"} Dec 05 16:31:13 crc kubenswrapper[4756]: I1205 16:31:13.453222 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" event={"ID":"65061bd9-6735-41d7-9314-870bd47be8c7","Type":"ContainerStarted","Data":"98f29549e7df1c15a35792b850f09f73247aa92dab1b80475e9cfef22443bcb3"} Dec 05 16:31:13 crc kubenswrapper[4756]: I1205 16:31:13.479190 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fdvqn" podStartSLOduration=2.436577345 podStartE2EDuration="9.479164647s" podCreationTimestamp="2025-12-05 16:31:04 +0000 UTC" firstStartedPulling="2025-12-05 16:31:05.64766123 +0000 UTC m=+925.886334742" lastFinishedPulling="2025-12-05 16:31:12.690248512 +0000 UTC m=+932.928922044" observedRunningTime="2025-12-05 16:31:13.473024802 +0000 UTC m=+933.711698324" watchObservedRunningTime="2025-12-05 16:31:13.479164647 +0000 UTC m=+933.717838159" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.532397 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-f455g"] Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.534079 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.536184 4756 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-wgbmj" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.536325 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.536877 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.550857 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-f455g"] Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.601405 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd258cbd-151e-4165-b19d-9c5cc22e9f97-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-f455g\" (UID: \"bd258cbd-151e-4165-b19d-9c5cc22e9f97\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.601511 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8jt9\" (UniqueName: \"kubernetes.io/projected/bd258cbd-151e-4165-b19d-9c5cc22e9f97-kube-api-access-c8jt9\") pod \"cert-manager-webhook-f4fb5df64-f455g\" (UID: \"bd258cbd-151e-4165-b19d-9c5cc22e9f97\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.702774 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8jt9\" (UniqueName: \"kubernetes.io/projected/bd258cbd-151e-4165-b19d-9c5cc22e9f97-kube-api-access-c8jt9\") pod \"cert-manager-webhook-f4fb5df64-f455g\" (UID: \"bd258cbd-151e-4165-b19d-9c5cc22e9f97\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.702934 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd258cbd-151e-4165-b19d-9c5cc22e9f97-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-f455g\" (UID: \"bd258cbd-151e-4165-b19d-9c5cc22e9f97\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.722335 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd258cbd-151e-4165-b19d-9c5cc22e9f97-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-f455g\" (UID: \"bd258cbd-151e-4165-b19d-9c5cc22e9f97\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.722498 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8jt9\" (UniqueName: \"kubernetes.io/projected/bd258cbd-151e-4165-b19d-9c5cc22e9f97-kube-api-access-c8jt9\") pod \"cert-manager-webhook-f4fb5df64-f455g\" (UID: \"bd258cbd-151e-4165-b19d-9c5cc22e9f97\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:16 crc kubenswrapper[4756]: I1205 16:31:16.850744 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:17 crc kubenswrapper[4756]: I1205 16:31:17.264075 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-f455g"] Dec 05 16:31:17 crc kubenswrapper[4756]: I1205 16:31:17.477534 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" event={"ID":"bd258cbd-151e-4165-b19d-9c5cc22e9f97","Type":"ContainerStarted","Data":"0c9356700e64291b2beb10282d284ddc9e8ac005ed43fd813596bd0d7249eaf3"} Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.114616 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx"] Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.115805 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.119780 4756 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4tqmt" Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.123226 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx"] Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.264177 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dab2fb4f-8a5a-473b-b211-4459f623785f-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vv2kx\" (UID: \"dab2fb4f-8a5a-473b-b211-4459f623785f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.264541 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lhfb\" (UniqueName: \"kubernetes.io/projected/dab2fb4f-8a5a-473b-b211-4459f623785f-kube-api-access-2lhfb\") pod \"cert-manager-cainjector-855d9ccff4-vv2kx\" (UID: \"dab2fb4f-8a5a-473b-b211-4459f623785f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.365233 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lhfb\" (UniqueName: \"kubernetes.io/projected/dab2fb4f-8a5a-473b-b211-4459f623785f-kube-api-access-2lhfb\") pod \"cert-manager-cainjector-855d9ccff4-vv2kx\" (UID: \"dab2fb4f-8a5a-473b-b211-4459f623785f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.365371 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dab2fb4f-8a5a-473b-b211-4459f623785f-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vv2kx\" (UID: \"dab2fb4f-8a5a-473b-b211-4459f623785f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.385195 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lhfb\" (UniqueName: \"kubernetes.io/projected/dab2fb4f-8a5a-473b-b211-4459f623785f-kube-api-access-2lhfb\") pod \"cert-manager-cainjector-855d9ccff4-vv2kx\" (UID: \"dab2fb4f-8a5a-473b-b211-4459f623785f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.397638 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dab2fb4f-8a5a-473b-b211-4459f623785f-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vv2kx\" (UID: \"dab2fb4f-8a5a-473b-b211-4459f623785f\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" Dec 05 16:31:20 crc kubenswrapper[4756]: I1205 16:31:20.437866 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" Dec 05 16:31:24 crc kubenswrapper[4756]: I1205 16:31:24.817130 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx"] Dec 05 16:31:25 crc kubenswrapper[4756]: I1205 16:31:25.534722 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" event={"ID":"bd258cbd-151e-4165-b19d-9c5cc22e9f97","Type":"ContainerStarted","Data":"d91802ad7f60f2c2ee083c6557826c7e604c15a9403e507244712a9d046800d2"} Dec 05 16:31:25 crc kubenswrapper[4756]: I1205 16:31:25.534860 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:25 crc kubenswrapper[4756]: I1205 16:31:25.536268 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" event={"ID":"dab2fb4f-8a5a-473b-b211-4459f623785f","Type":"ContainerStarted","Data":"95c484ee4ca000f37034ef03ee3c8cd7ac2b22130d36665685db8af2d437f636"} Dec 05 16:31:25 crc kubenswrapper[4756]: I1205 16:31:25.536319 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" event={"ID":"dab2fb4f-8a5a-473b-b211-4459f623785f","Type":"ContainerStarted","Data":"f30a99486b7f6e8f4e9278f36b39b023ad17b50e383e58d3cb351f50ab9e3341"} Dec 05 16:31:25 crc kubenswrapper[4756]: I1205 16:31:25.555467 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" podStartSLOduration=2.278633261 podStartE2EDuration="9.555446055s" podCreationTimestamp="2025-12-05 16:31:16 +0000 UTC" firstStartedPulling="2025-12-05 16:31:17.279025047 +0000 UTC m=+937.517698559" lastFinishedPulling="2025-12-05 16:31:24.555837841 +0000 UTC m=+944.794511353" observedRunningTime="2025-12-05 16:31:25.549853074 +0000 UTC m=+945.788526586" watchObservedRunningTime="2025-12-05 16:31:25.555446055 +0000 UTC m=+945.794119597" Dec 05 16:31:25 crc kubenswrapper[4756]: I1205 16:31:25.564533 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vv2kx" podStartSLOduration=5.564525709 podStartE2EDuration="5.564525709s" podCreationTimestamp="2025-12-05 16:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:31:25.561654912 +0000 UTC m=+945.800328424" watchObservedRunningTime="2025-12-05 16:31:25.564525709 +0000 UTC m=+945.803199221" Dec 05 16:31:29 crc kubenswrapper[4756]: I1205 16:31:29.947454 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-674nf"] Dec 05 16:31:29 crc kubenswrapper[4756]: I1205 16:31:29.948944 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-674nf" Dec 05 16:31:29 crc kubenswrapper[4756]: I1205 16:31:29.950732 4756 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8887t" Dec 05 16:31:29 crc kubenswrapper[4756]: I1205 16:31:29.960706 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-674nf"] Dec 05 16:31:29 crc kubenswrapper[4756]: I1205 16:31:29.995912 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6-bound-sa-token\") pod \"cert-manager-86cb77c54b-674nf\" (UID: \"9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6\") " pod="cert-manager/cert-manager-86cb77c54b-674nf" Dec 05 16:31:29 crc kubenswrapper[4756]: I1205 16:31:29.996025 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpjp5\" (UniqueName: \"kubernetes.io/projected/9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6-kube-api-access-lpjp5\") pod \"cert-manager-86cb77c54b-674nf\" (UID: \"9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6\") " pod="cert-manager/cert-manager-86cb77c54b-674nf" Dec 05 16:31:30 crc kubenswrapper[4756]: I1205 16:31:30.097149 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6-bound-sa-token\") pod \"cert-manager-86cb77c54b-674nf\" (UID: \"9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6\") " pod="cert-manager/cert-manager-86cb77c54b-674nf" Dec 05 16:31:30 crc kubenswrapper[4756]: I1205 16:31:30.097219 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpjp5\" (UniqueName: \"kubernetes.io/projected/9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6-kube-api-access-lpjp5\") pod \"cert-manager-86cb77c54b-674nf\" (UID: \"9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6\") " pod="cert-manager/cert-manager-86cb77c54b-674nf" Dec 05 16:31:30 crc kubenswrapper[4756]: I1205 16:31:30.116578 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6-bound-sa-token\") pod \"cert-manager-86cb77c54b-674nf\" (UID: \"9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6\") " pod="cert-manager/cert-manager-86cb77c54b-674nf" Dec 05 16:31:30 crc kubenswrapper[4756]: I1205 16:31:30.116713 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpjp5\" (UniqueName: \"kubernetes.io/projected/9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6-kube-api-access-lpjp5\") pod \"cert-manager-86cb77c54b-674nf\" (UID: \"9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6\") " pod="cert-manager/cert-manager-86cb77c54b-674nf" Dec 05 16:31:30 crc kubenswrapper[4756]: I1205 16:31:30.266728 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-674nf" Dec 05 16:31:30 crc kubenswrapper[4756]: I1205 16:31:30.717399 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-674nf"] Dec 05 16:31:31 crc kubenswrapper[4756]: I1205 16:31:31.570199 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-674nf" event={"ID":"9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6","Type":"ContainerStarted","Data":"b8d209f0466ac741fb48edbd5c59df21341a22dc6decd4f08cf90ff725145635"} Dec 05 16:31:31 crc kubenswrapper[4756]: I1205 16:31:31.854160 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-f455g" Dec 05 16:31:35 crc kubenswrapper[4756]: I1205 16:31:35.593005 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-674nf" event={"ID":"9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6","Type":"ContainerStarted","Data":"ae9307c047b62c940973ab4f1f6736227926e0ed394475614d44f4b4436bf2e8"} Dec 05 16:31:36 crc kubenswrapper[4756]: I1205 16:31:36.618216 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-674nf" podStartSLOduration=7.618194151 podStartE2EDuration="7.618194151s" podCreationTimestamp="2025-12-05 16:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:31:36.612457986 +0000 UTC m=+956.851131518" watchObservedRunningTime="2025-12-05 16:31:36.618194151 +0000 UTC m=+956.856867663" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.060842 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v9pl4"] Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.062100 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v9pl4" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.064572 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.064593 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fjzwd" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.072000 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.081689 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v9pl4"] Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.264601 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjsn6\" (UniqueName: \"kubernetes.io/projected/666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7-kube-api-access-fjsn6\") pod \"openstack-operator-index-v9pl4\" (UID: \"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7\") " pod="openstack-operators/openstack-operator-index-v9pl4" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.366812 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjsn6\" (UniqueName: \"kubernetes.io/projected/666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7-kube-api-access-fjsn6\") pod \"openstack-operator-index-v9pl4\" (UID: \"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7\") " pod="openstack-operators/openstack-operator-index-v9pl4" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.400425 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjsn6\" (UniqueName: \"kubernetes.io/projected/666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7-kube-api-access-fjsn6\") pod \"openstack-operator-index-v9pl4\" (UID: \"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7\") " pod="openstack-operators/openstack-operator-index-v9pl4" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.681106 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v9pl4" Dec 05 16:31:42 crc kubenswrapper[4756]: I1205 16:31:42.865541 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v9pl4"] Dec 05 16:31:43 crc kubenswrapper[4756]: I1205 16:31:43.644063 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v9pl4" event={"ID":"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7","Type":"ContainerStarted","Data":"4524f6cf0c74823dfd79626e6269afef8b073521ad78a5f063a004b2c499aa91"} Dec 05 16:31:44 crc kubenswrapper[4756]: I1205 16:31:44.655247 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v9pl4" event={"ID":"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7","Type":"ContainerStarted","Data":"790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25"} Dec 05 16:31:44 crc kubenswrapper[4756]: I1205 16:31:44.684102 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v9pl4" podStartSLOduration=1.897767156 podStartE2EDuration="2.684072051s" podCreationTimestamp="2025-12-05 16:31:42 +0000 UTC" firstStartedPulling="2025-12-05 16:31:42.867869998 +0000 UTC m=+963.106543530" lastFinishedPulling="2025-12-05 16:31:43.654174913 +0000 UTC m=+963.892848425" observedRunningTime="2025-12-05 16:31:44.67622607 +0000 UTC m=+964.914899652" watchObservedRunningTime="2025-12-05 16:31:44.684072051 +0000 UTC m=+964.922745603" Dec 05 16:31:45 crc kubenswrapper[4756]: I1205 16:31:45.428048 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v9pl4"] Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.035065 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mq7cd"] Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.035967 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.047569 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mq7cd"] Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.219539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh5zq\" (UniqueName: \"kubernetes.io/projected/7402cf0a-a645-47dc-8f6f-6d9c833a9f8b-kube-api-access-hh5zq\") pod \"openstack-operator-index-mq7cd\" (UID: \"7402cf0a-a645-47dc-8f6f-6d9c833a9f8b\") " pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.321583 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh5zq\" (UniqueName: \"kubernetes.io/projected/7402cf0a-a645-47dc-8f6f-6d9c833a9f8b-kube-api-access-hh5zq\") pod \"openstack-operator-index-mq7cd\" (UID: \"7402cf0a-a645-47dc-8f6f-6d9c833a9f8b\") " pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.339393 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh5zq\" (UniqueName: \"kubernetes.io/projected/7402cf0a-a645-47dc-8f6f-6d9c833a9f8b-kube-api-access-hh5zq\") pod \"openstack-operator-index-mq7cd\" (UID: \"7402cf0a-a645-47dc-8f6f-6d9c833a9f8b\") " pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.358340 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.560315 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mq7cd"] Dec 05 16:31:46 crc kubenswrapper[4756]: W1205 16:31:46.567659 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7402cf0a_a645_47dc_8f6f_6d9c833a9f8b.slice/crio-d9b4f06692bb02bf3494baf76ec77dfbca632487bd3d1f934909cc8074d7d75e WatchSource:0}: Error finding container d9b4f06692bb02bf3494baf76ec77dfbca632487bd3d1f934909cc8074d7d75e: Status 404 returned error can't find the container with id d9b4f06692bb02bf3494baf76ec77dfbca632487bd3d1f934909cc8074d7d75e Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.671067 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-v9pl4" podUID="666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7" containerName="registry-server" containerID="cri-o://790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25" gracePeriod=2 Dec 05 16:31:46 crc kubenswrapper[4756]: I1205 16:31:46.671613 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mq7cd" event={"ID":"7402cf0a-a645-47dc-8f6f-6d9c833a9f8b","Type":"ContainerStarted","Data":"d9b4f06692bb02bf3494baf76ec77dfbca632487bd3d1f934909cc8074d7d75e"} Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.003777 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v9pl4" Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.133771 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjsn6\" (UniqueName: \"kubernetes.io/projected/666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7-kube-api-access-fjsn6\") pod \"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7\" (UID: \"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7\") " Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.138884 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7-kube-api-access-fjsn6" (OuterVolumeSpecName: "kube-api-access-fjsn6") pod "666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7" (UID: "666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7"). InnerVolumeSpecName "kube-api-access-fjsn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.236062 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjsn6\" (UniqueName: \"kubernetes.io/projected/666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7-kube-api-access-fjsn6\") on node \"crc\" DevicePath \"\"" Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.677240 4756 generic.go:334] "Generic (PLEG): container finished" podID="666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7" containerID="790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25" exitCode=0 Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.677390 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v9pl4" event={"ID":"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7","Type":"ContainerDied","Data":"790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25"} Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.677482 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v9pl4" Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.677666 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v9pl4" event={"ID":"666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7","Type":"ContainerDied","Data":"4524f6cf0c74823dfd79626e6269afef8b073521ad78a5f063a004b2c499aa91"} Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.677695 4756 scope.go:117] "RemoveContainer" containerID="790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25" Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.683252 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mq7cd" event={"ID":"7402cf0a-a645-47dc-8f6f-6d9c833a9f8b","Type":"ContainerStarted","Data":"8fb4c55450340b452dc2c058a68d061fcdd1f767486348506ba400cc9a0f85f9"} Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.703289 4756 scope.go:117] "RemoveContainer" containerID="790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25" Dec 05 16:31:47 crc kubenswrapper[4756]: E1205 16:31:47.704261 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25\": container with ID starting with 790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25 not found: ID does not exist" containerID="790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25" Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.704505 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25"} err="failed to get container status \"790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25\": rpc error: code = NotFound desc = could not find container \"790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25\": container with ID starting with 790d8bd644509e011767ec0551cb1216e795e0c500d39170972f5da1252bde25 not found: ID does not exist" Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.712891 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mq7cd" podStartSLOduration=1.245972315 podStartE2EDuration="1.712846436s" podCreationTimestamp="2025-12-05 16:31:46 +0000 UTC" firstStartedPulling="2025-12-05 16:31:46.571124679 +0000 UTC m=+966.809798191" lastFinishedPulling="2025-12-05 16:31:47.0379988 +0000 UTC m=+967.276672312" observedRunningTime="2025-12-05 16:31:47.699032164 +0000 UTC m=+967.937705706" watchObservedRunningTime="2025-12-05 16:31:47.712846436 +0000 UTC m=+967.951519968" Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.724309 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v9pl4"] Dec 05 16:31:47 crc kubenswrapper[4756]: I1205 16:31:47.729263 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-v9pl4"] Dec 05 16:31:48 crc kubenswrapper[4756]: I1205 16:31:48.067705 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7" path="/var/lib/kubelet/pods/666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7/volumes" Dec 05 16:31:56 crc kubenswrapper[4756]: I1205 16:31:56.359166 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:31:56 crc kubenswrapper[4756]: I1205 16:31:56.359221 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:31:56 crc kubenswrapper[4756]: I1205 16:31:56.400934 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:31:56 crc kubenswrapper[4756]: I1205 16:31:56.768708 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mq7cd" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.146478 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws"] Dec 05 16:32:03 crc kubenswrapper[4756]: E1205 16:32:03.147001 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7" containerName="registry-server" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.147015 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7" containerName="registry-server" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.147161 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="666fbe1b-1b4e-4f59-a5a7-e4e8f6cb51a7" containerName="registry-server" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.148238 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.150820 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tx7fb" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.159148 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws"] Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.253462 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.253517 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.253629 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvmpp\" (UniqueName: \"kubernetes.io/projected/3356d953-464c-40f6-bc07-8534d8a8976e-kube-api-access-bvmpp\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 
16:32:03.355352 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.355413 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.355512 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvmpp\" (UniqueName: \"kubernetes.io/projected/3356d953-464c-40f6-bc07-8534d8a8976e-kube-api-access-bvmpp\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.355956 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.356132 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.376209 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvmpp\" (UniqueName: \"kubernetes.io/projected/3356d953-464c-40f6-bc07-8534d8a8976e-kube-api-access-bvmpp\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.502823 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:03 crc kubenswrapper[4756]: I1205 16:32:03.924488 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws"] Dec 05 16:32:04 crc kubenswrapper[4756]: I1205 16:32:04.811154 4756 generic.go:334] "Generic (PLEG): container finished" podID="3356d953-464c-40f6-bc07-8534d8a8976e" containerID="9686e0e4c17c3f3af7e435b476c61718ba62bd190315e1c9b7edc8b2af646359" exitCode=0 Dec 05 16:32:04 crc kubenswrapper[4756]: I1205 16:32:04.811205 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" event={"ID":"3356d953-464c-40f6-bc07-8534d8a8976e","Type":"ContainerDied","Data":"9686e0e4c17c3f3af7e435b476c61718ba62bd190315e1c9b7edc8b2af646359"} Dec 05 16:32:04 crc kubenswrapper[4756]: I1205 16:32:04.811252 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" event={"ID":"3356d953-464c-40f6-bc07-8534d8a8976e","Type":"ContainerStarted","Data":"1d4d3618dfc8b08fbbee164c3531cb3e0b7326e98a14a345306a633580d14d7e"} Dec 05 16:32:05 crc kubenswrapper[4756]: I1205 16:32:05.820463 4756 generic.go:334] "Generic (PLEG): container finished" podID="3356d953-464c-40f6-bc07-8534d8a8976e" containerID="df632f0512c2732a1eb665a315e7441c047ee0cee53e103e554d9eabc788a9fc" exitCode=0 Dec 05 16:32:05 crc kubenswrapper[4756]: I1205 16:32:05.820506 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" event={"ID":"3356d953-464c-40f6-bc07-8534d8a8976e","Type":"ContainerDied","Data":"df632f0512c2732a1eb665a315e7441c047ee0cee53e103e554d9eabc788a9fc"} Dec 05 16:32:06 crc kubenswrapper[4756]: I1205 16:32:06.831534 4756 generic.go:334] "Generic (PLEG): container finished" podID="3356d953-464c-40f6-bc07-8534d8a8976e" containerID="3b715513de4c73b0dea689d366c57dcf9ceaefc3cdc2b9b3a9b77c69a543714e" exitCode=0 Dec 05 16:32:06 crc kubenswrapper[4756]: I1205 16:32:06.831691 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" event={"ID":"3356d953-464c-40f6-bc07-8534d8a8976e","Type":"ContainerDied","Data":"3b715513de4c73b0dea689d366c57dcf9ceaefc3cdc2b9b3a9b77c69a543714e"} Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.073335 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.218385 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-bundle\") pod \"3356d953-464c-40f6-bc07-8534d8a8976e\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.218489 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvmpp\" (UniqueName: \"kubernetes.io/projected/3356d953-464c-40f6-bc07-8534d8a8976e-kube-api-access-bvmpp\") pod \"3356d953-464c-40f6-bc07-8534d8a8976e\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.218595 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-util\") pod \"3356d953-464c-40f6-bc07-8534d8a8976e\" (UID: \"3356d953-464c-40f6-bc07-8534d8a8976e\") " Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.219295 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-bundle" (OuterVolumeSpecName: "bundle") pod "3356d953-464c-40f6-bc07-8534d8a8976e" (UID: "3356d953-464c-40f6-bc07-8534d8a8976e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.227350 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3356d953-464c-40f6-bc07-8534d8a8976e-kube-api-access-bvmpp" (OuterVolumeSpecName: "kube-api-access-bvmpp") pod "3356d953-464c-40f6-bc07-8534d8a8976e" (UID: "3356d953-464c-40f6-bc07-8534d8a8976e"). InnerVolumeSpecName "kube-api-access-bvmpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.231887 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-util" (OuterVolumeSpecName: "util") pod "3356d953-464c-40f6-bc07-8534d8a8976e" (UID: "3356d953-464c-40f6-bc07-8534d8a8976e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.320547 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvmpp\" (UniqueName: \"kubernetes.io/projected/3356d953-464c-40f6-bc07-8534d8a8976e-kube-api-access-bvmpp\") on node \"crc\" DevicePath \"\"" Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.320596 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-util\") on node \"crc\" DevicePath \"\"" Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.320613 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3356d953-464c-40f6-bc07-8534d8a8976e-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.852030 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" event={"ID":"3356d953-464c-40f6-bc07-8534d8a8976e","Type":"ContainerDied","Data":"1d4d3618dfc8b08fbbee164c3531cb3e0b7326e98a14a345306a633580d14d7e"} Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.852064 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d4d3618dfc8b08fbbee164c3531cb3e0b7326e98a14a345306a633580d14d7e" Dec 05 16:32:08 crc kubenswrapper[4756]: I1205 16:32:08.852123 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.469363 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs"] Dec 05 16:32:15 crc kubenswrapper[4756]: E1205 16:32:15.470299 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3356d953-464c-40f6-bc07-8534d8a8976e" containerName="extract" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.470314 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3356d953-464c-40f6-bc07-8534d8a8976e" containerName="extract" Dec 05 16:32:15 crc kubenswrapper[4756]: E1205 16:32:15.470330 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3356d953-464c-40f6-bc07-8534d8a8976e" containerName="util" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.470336 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3356d953-464c-40f6-bc07-8534d8a8976e" containerName="util" Dec 05 16:32:15 crc kubenswrapper[4756]: E1205 16:32:15.470345 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3356d953-464c-40f6-bc07-8534d8a8976e" containerName="pull" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.470350 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3356d953-464c-40f6-bc07-8534d8a8976e" containerName="pull" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.470469 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3356d953-464c-40f6-bc07-8534d8a8976e" containerName="extract" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.471077 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.474788 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-dqk4g" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.497199 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs"] Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.507297 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2dgv\" (UniqueName: \"kubernetes.io/projected/fa646207-9ada-44e6-a096-f15872c59f7c-kube-api-access-j2dgv\") pod \"openstack-operator-controller-operator-55b6fb9447-djrcs\" (UID: \"fa646207-9ada-44e6-a096-f15872c59f7c\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.608097 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2dgv\" (UniqueName: \"kubernetes.io/projected/fa646207-9ada-44e6-a096-f15872c59f7c-kube-api-access-j2dgv\") pod \"openstack-operator-controller-operator-55b6fb9447-djrcs\" (UID: \"fa646207-9ada-44e6-a096-f15872c59f7c\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.631172 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2dgv\" (UniqueName: \"kubernetes.io/projected/fa646207-9ada-44e6-a096-f15872c59f7c-kube-api-access-j2dgv\") pod \"openstack-operator-controller-operator-55b6fb9447-djrcs\" (UID: \"fa646207-9ada-44e6-a096-f15872c59f7c\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" Dec 05 16:32:15 crc kubenswrapper[4756]: I1205 16:32:15.790202 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" Dec 05 16:32:16 crc kubenswrapper[4756]: I1205 16:32:16.289163 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs"] Dec 05 16:32:16 crc kubenswrapper[4756]: I1205 16:32:16.899576 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" event={"ID":"fa646207-9ada-44e6-a096-f15872c59f7c","Type":"ContainerStarted","Data":"28cdd9160c4b3d2180024a67f350850d6cdd0a16225f428e342e45f435e50e6c"} Dec 05 16:32:21 crc kubenswrapper[4756]: I1205 16:32:21.940787 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" event={"ID":"fa646207-9ada-44e6-a096-f15872c59f7c","Type":"ContainerStarted","Data":"daf30e62fa2a8b0884c05eb390a81616324f9f0be6969885fdf63208fc8928c7"} Dec 05 16:32:21 crc kubenswrapper[4756]: I1205 16:32:21.941410 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" Dec 05 16:32:21 crc kubenswrapper[4756]: I1205 16:32:21.968767 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" podStartSLOduration=1.554436959 podStartE2EDuration="6.968745104s" podCreationTimestamp="2025-12-05 16:32:15 +0000 UTC" firstStartedPulling="2025-12-05 16:32:16.296902938 +0000 UTC m=+996.535576440" lastFinishedPulling="2025-12-05 16:32:21.711211083 +0000 UTC m=+1001.949884585" observedRunningTime="2025-12-05 16:32:21.964256052 +0000 UTC m=+1002.202929584" watchObservedRunningTime="2025-12-05 16:32:21.968745104 +0000 UTC m=+1002.207418626" Dec 05 16:32:22 crc kubenswrapper[4756]: I1205 16:32:22.295820 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:32:22 crc kubenswrapper[4756]: I1205 16:32:22.295883 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:32:35 crc kubenswrapper[4756]: I1205 16:32:35.793216 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-djrcs" Dec 05 16:32:52 crc kubenswrapper[4756]: I1205 16:32:52.296046 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:32:52 crc kubenswrapper[4756]: I1205 16:32:52.296712 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.550684 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.552688 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.554625 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qcxsq" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.554683 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.555657 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.557855 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-lk46j" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.562792 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.585662 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.598035 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.599186 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.600821 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xtjjn" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.607075 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.608449 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.616543 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpsjz\" (UniqueName: \"kubernetes.io/projected/626bcac2-4a19-4f34-83be-1f7261f44161-kube-api-access-tpsjz\") pod \"glance-operator-controller-manager-77987cd8cd-7qxl4\" (UID: \"626bcac2-4a19-4f34-83be-1f7261f44161\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.616606 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrj7\" (UniqueName: \"kubernetes.io/projected/9354b938-780a-4431-a060-48ad59f1bbec-kube-api-access-jtrj7\") pod \"cinder-operator-controller-manager-859b6ccc6-r7qh7\" (UID: \"9354b938-780a-4431-a060-48ad59f1bbec\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.616713 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxd6\" (UniqueName: \"kubernetes.io/projected/ced54d49-8a17-4bba-a783-c0f69ed32a7a-kube-api-access-bbxd6\") pod \"designate-operator-controller-manager-78b4bc895b-f65kp\" (UID: \"ced54d49-8a17-4bba-a783-c0f69ed32a7a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.622702 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-454x8" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.625052 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.629806 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.641589 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.642901 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.648772 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-57rzv" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.669479 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.671761 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.672692 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.674808 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ftfw5" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.697157 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.698081 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.702002 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2pnm4" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.705008 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.711967 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.719039 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.719106 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxd6\" (UniqueName: \"kubernetes.io/projected/ced54d49-8a17-4bba-a783-c0f69ed32a7a-kube-api-access-bbxd6\") pod \"designate-operator-controller-manager-78b4bc895b-f65kp\" (UID: \"ced54d49-8a17-4bba-a783-c0f69ed32a7a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.719142 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8pzz\" (UniqueName: \"kubernetes.io/projected/bc274847-f26a-4043-86f6-60af2577a321-kube-api-access-c8pzz\") pod \"heat-operator-controller-manager-5f64f6f8bb-rm4xp\" (UID: \"bc274847-f26a-4043-86f6-60af2577a321\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.719176 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgcsp\" (UniqueName: \"kubernetes.io/projected/b7a067e2-8f42-4eec-9b62-623498c27f31-kube-api-access-jgcsp\") pod \"horizon-operator-controller-manager-68c6d99b8f-njt8n\" (UID: \"b7a067e2-8f42-4eec-9b62-623498c27f31\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.719205 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpsjz\" (UniqueName: \"kubernetes.io/projected/626bcac2-4a19-4f34-83be-1f7261f44161-kube-api-access-tpsjz\") pod \"glance-operator-controller-manager-77987cd8cd-7qxl4\" (UID: \"626bcac2-4a19-4f34-83be-1f7261f44161\") " 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.719239 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrj7\" (UniqueName: \"kubernetes.io/projected/9354b938-780a-4431-a060-48ad59f1bbec-kube-api-access-jtrj7\") pod \"cinder-operator-controller-manager-859b6ccc6-r7qh7\" (UID: \"9354b938-780a-4431-a060-48ad59f1bbec\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.719271 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crk46\" (UniqueName: \"kubernetes.io/projected/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-kube-api-access-crk46\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.719318 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw62h\" (UniqueName: \"kubernetes.io/projected/6013b2f1-3b04-47f0-bb2c-f741587c8efc-kube-api-access-cw62h\") pod \"barbican-operator-controller-manager-7d9dfd778-r6xbc\" (UID: \"6013b2f1-3b04-47f0-bb2c-f741587c8efc\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.720266 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.721552 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.726032 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xvqbc" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.763586 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrj7\" (UniqueName: \"kubernetes.io/projected/9354b938-780a-4431-a060-48ad59f1bbec-kube-api-access-jtrj7\") pod \"cinder-operator-controller-manager-859b6ccc6-r7qh7\" (UID: \"9354b938-780a-4431-a060-48ad59f1bbec\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.767485 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.773007 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpsjz\" (UniqueName: \"kubernetes.io/projected/626bcac2-4a19-4f34-83be-1f7261f44161-kube-api-access-tpsjz\") pod \"glance-operator-controller-manager-77987cd8cd-7qxl4\" (UID: \"626bcac2-4a19-4f34-83be-1f7261f44161\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.773034 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbxd6\" (UniqueName: \"kubernetes.io/projected/ced54d49-8a17-4bba-a783-c0f69ed32a7a-kube-api-access-bbxd6\") pod \"designate-operator-controller-manager-78b4bc895b-f65kp\" (UID: \"ced54d49-8a17-4bba-a783-c0f69ed32a7a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.776697 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.784178 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.784313 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.794304 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xx4cw" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.805743 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.820552 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.821830 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.822805 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8pzz\" (UniqueName: \"kubernetes.io/projected/bc274847-f26a-4043-86f6-60af2577a321-kube-api-access-c8pzz\") pod \"heat-operator-controller-manager-5f64f6f8bb-rm4xp\" (UID: \"bc274847-f26a-4043-86f6-60af2577a321\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.822853 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgcsp\" (UniqueName: \"kubernetes.io/projected/b7a067e2-8f42-4eec-9b62-623498c27f31-kube-api-access-jgcsp\") pod \"horizon-operator-controller-manager-68c6d99b8f-njt8n\" (UID: \"b7a067e2-8f42-4eec-9b62-623498c27f31\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.822912 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crk46\" (UniqueName: \"kubernetes.io/projected/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-kube-api-access-crk46\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.822964 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw62h\" (UniqueName: \"kubernetes.io/projected/6013b2f1-3b04-47f0-bb2c-f741587c8efc-kube-api-access-cw62h\") pod \"barbican-operator-controller-manager-7d9dfd778-r6xbc\" (UID: \"6013b2f1-3b04-47f0-bb2c-f741587c8efc\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.823011 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:03 crc kubenswrapper[4756]: E1205 16:33:03.823128 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:03 crc kubenswrapper[4756]: E1205 16:33:03.823180 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert podName:b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:04.323163263 +0000 UTC m=+1044.561836785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert") pod "infra-operator-controller-manager-57548d458d-6xbpf" (UID: "b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.831891 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-fqwh2" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.840479 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.868987 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8pzz\" (UniqueName: \"kubernetes.io/projected/bc274847-f26a-4043-86f6-60af2577a321-kube-api-access-c8pzz\") pod \"heat-operator-controller-manager-5f64f6f8bb-rm4xp\" (UID: \"bc274847-f26a-4043-86f6-60af2577a321\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.877211 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgcsp\" (UniqueName: \"kubernetes.io/projected/b7a067e2-8f42-4eec-9b62-623498c27f31-kube-api-access-jgcsp\") pod \"horizon-operator-controller-manager-68c6d99b8f-njt8n\" (UID: \"b7a067e2-8f42-4eec-9b62-623498c27f31\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.878947 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.879331 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.883234 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.886126 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw62h\" (UniqueName: \"kubernetes.io/projected/6013b2f1-3b04-47f0-bb2c-f741587c8efc-kube-api-access-cw62h\") pod \"barbican-operator-controller-manager-7d9dfd778-r6xbc\" (UID: \"6013b2f1-3b04-47f0-bb2c-f741587c8efc\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.886546 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-njzrt" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.889754 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crk46\" (UniqueName: \"kubernetes.io/projected/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-kube-api-access-crk46\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.918599 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.924251 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpvpk\" (UniqueName: \"kubernetes.io/projected/95cbe112-4bfb-463a-99d3-15283473d2c6-kube-api-access-kpvpk\") pod \"keystone-operator-controller-manager-7765d96ddf-fcwhr\" (UID: \"95cbe112-4bfb-463a-99d3-15283473d2c6\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.924292 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54fj9\" (UniqueName: \"kubernetes.io/projected/ccaea4ee-e7e9-451f-a1e3-fef7402f356b-kube-api-access-54fj9\") pod \"manila-operator-controller-manager-7c79b5df47-m8qwl\" (UID: \"ccaea4ee-e7e9-451f-a1e3-fef7402f356b\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.924362 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsvkh\" (UniqueName: \"kubernetes.io/projected/2afcdd94-bf46-446a-b9d1-ff49260594d9-kube-api-access-zsvkh\") pod \"ironic-operator-controller-manager-6c548fd776-kbg5m\" (UID: \"2afcdd94-bf46-446a-b9d1-ff49260594d9\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.945376 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.947126 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.948048 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.950051 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ll26m" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.975174 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.984913 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4"] Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.986458 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.988697 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-bf62k" Dec 05 16:33:03 crc kubenswrapper[4756]: I1205 16:33:03.999178 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.002441 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.009694 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.026832 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsvkh\" (UniqueName: \"kubernetes.io/projected/2afcdd94-bf46-446a-b9d1-ff49260594d9-kube-api-access-zsvkh\") pod \"ironic-operator-controller-manager-6c548fd776-kbg5m\" (UID: \"2afcdd94-bf46-446a-b9d1-ff49260594d9\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.026915 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxdjb\" (UniqueName: \"kubernetes.io/projected/482f7c65-7376-4794-9308-062ed6209299-kube-api-access-vxdjb\") pod \"mariadb-operator-controller-manager-56bbcc9d85-9lzlc\" (UID: \"482f7c65-7376-4794-9308-062ed6209299\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.026963 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j4b7\" (UniqueName: \"kubernetes.io/projected/156f45a8-a698-4f99-93c3-d8cc8a40d14f-kube-api-access-7j4b7\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-25dl4\" (UID: \"156f45a8-a698-4f99-93c3-d8cc8a40d14f\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.027006 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpvpk\" (UniqueName: \"kubernetes.io/projected/95cbe112-4bfb-463a-99d3-15283473d2c6-kube-api-access-kpvpk\") pod \"keystone-operator-controller-manager-7765d96ddf-fcwhr\" (UID: \"95cbe112-4bfb-463a-99d3-15283473d2c6\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.027037 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54fj9\" (UniqueName: \"kubernetes.io/projected/ccaea4ee-e7e9-451f-a1e3-fef7402f356b-kube-api-access-54fj9\") pod \"manila-operator-controller-manager-7c79b5df47-m8qwl\" (UID: \"ccaea4ee-e7e9-451f-a1e3-fef7402f356b\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.027064 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n62lj\" (UniqueName: \"kubernetes.io/projected/79ee86d5-861a-4a03-b57a-9b382be05361-kube-api-access-n62lj\") pod \"nova-operator-controller-manager-697bc559fc-pvn24\" (UID: \"79ee86d5-861a-4a03-b57a-9b382be05361\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.027792 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.029061 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.032844 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4brm6" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.036778 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.053044 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpvpk\" (UniqueName: \"kubernetes.io/projected/95cbe112-4bfb-463a-99d3-15283473d2c6-kube-api-access-kpvpk\") pod \"keystone-operator-controller-manager-7765d96ddf-fcwhr\" (UID: \"95cbe112-4bfb-463a-99d3-15283473d2c6\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.065304 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsvkh\" (UniqueName: \"kubernetes.io/projected/2afcdd94-bf46-446a-b9d1-ff49260594d9-kube-api-access-zsvkh\") pod \"ironic-operator-controller-manager-6c548fd776-kbg5m\" (UID: \"2afcdd94-bf46-446a-b9d1-ff49260594d9\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.072291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54fj9\" (UniqueName: \"kubernetes.io/projected/ccaea4ee-e7e9-451f-a1e3-fef7402f356b-kube-api-access-54fj9\") pod \"manila-operator-controller-manager-7c79b5df47-m8qwl\" (UID: \"ccaea4ee-e7e9-451f-a1e3-fef7402f356b\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.084490 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.085764 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.085887 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.089031 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-wmpg2" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.104106 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.105727 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.110962 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.118503 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.119751 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.128212 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nhmh\" (UniqueName: \"kubernetes.io/projected/6953530f-9e26-4c39-b015-bdadcf81aca2-kube-api-access-9nhmh\") pod \"ovn-operator-controller-manager-b6456fdb6-dnrr5\" (UID: \"6953530f-9e26-4c39-b015-bdadcf81aca2\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.128265 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxdjb\" (UniqueName: \"kubernetes.io/projected/482f7c65-7376-4794-9308-062ed6209299-kube-api-access-vxdjb\") pod \"mariadb-operator-controller-manager-56bbcc9d85-9lzlc\" (UID: \"482f7c65-7376-4794-9308-062ed6209299\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.128375 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j4b7\" (UniqueName: \"kubernetes.io/projected/156f45a8-a698-4f99-93c3-d8cc8a40d14f-kube-api-access-7j4b7\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-25dl4\" (UID: \"156f45a8-a698-4f99-93c3-d8cc8a40d14f\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.128410 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n62lj\" (UniqueName: \"kubernetes.io/projected/79ee86d5-861a-4a03-b57a-9b382be05361-kube-api-access-n62lj\") pod \"nova-operator-controller-manager-697bc559fc-pvn24\" (UID: \"79ee86d5-861a-4a03-b57a-9b382be05361\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.129265 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pjlh9" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.129523 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-s74jn" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.136564 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.143164 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.155440 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.157249 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j4b7\" (UniqueName: \"kubernetes.io/projected/156f45a8-a698-4f99-93c3-d8cc8a40d14f-kube-api-access-7j4b7\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-25dl4\" (UID: \"156f45a8-a698-4f99-93c3-d8cc8a40d14f\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.162504 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n62lj\" (UniqueName: \"kubernetes.io/projected/79ee86d5-861a-4a03-b57a-9b382be05361-kube-api-access-n62lj\") pod \"nova-operator-controller-manager-697bc559fc-pvn24\" (UID: \"79ee86d5-861a-4a03-b57a-9b382be05361\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.168914 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.169321 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.169943 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.175859 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.183787 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-lkk6z" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.186995 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.200915 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxdjb\" (UniqueName: \"kubernetes.io/projected/482f7c65-7376-4794-9308-062ed6209299-kube-api-access-vxdjb\") pod \"mariadb-operator-controller-manager-56bbcc9d85-9lzlc\" (UID: \"482f7c65-7376-4794-9308-062ed6209299\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.216630 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.234006 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rgvg\" (UniqueName: \"kubernetes.io/projected/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-kube-api-access-5rgvg\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.234062 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7rj4\" (UniqueName: \"kubernetes.io/projected/fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d-kube-api-access-m7rj4\") pod \"swift-operator-controller-manager-5f8c65bbfc-c8h5n\" (UID: \"fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.234105 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flrqt\" (UniqueName: \"kubernetes.io/projected/a6494f1e-bf6a-469e-a9ab-fdc9211668c9-kube-api-access-flrqt\") pod \"octavia-operator-controller-manager-998648c74-qlbvt\" (UID: \"a6494f1e-bf6a-469e-a9ab-fdc9211668c9\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.234134 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nhmh\" (UniqueName: \"kubernetes.io/projected/6953530f-9e26-4c39-b015-bdadcf81aca2-kube-api-access-9nhmh\") pod \"ovn-operator-controller-manager-b6456fdb6-dnrr5\" (UID: \"6953530f-9e26-4c39-b015-bdadcf81aca2\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.234180 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8qqn\" (UniqueName: \"kubernetes.io/projected/e9fe3d7c-713c-4b55-9775-c557846cae41-kube-api-access-r8qqn\") pod \"placement-operator-controller-manager-78f8948974-qrkfp\" (UID: \"e9fe3d7c-713c-4b55-9775-c557846cae41\") " 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.234619 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.238490 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.239909 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.246890 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-txwmw" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.255005 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.264791 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nhmh\" (UniqueName: \"kubernetes.io/projected/6953530f-9e26-4c39-b015-bdadcf81aca2-kube-api-access-9nhmh\") pod \"ovn-operator-controller-manager-b6456fdb6-dnrr5\" (UID: \"6953530f-9e26-4c39-b015-bdadcf81aca2\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.266303 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.291578 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.292667 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-22zvw"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.293722 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.303733 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-6nfkd" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.311874 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-22zvw"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.316408 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.333786 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.368374 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.419108 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rgvg\" (UniqueName: \"kubernetes.io/projected/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-kube-api-access-5rgvg\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.419226 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7rj4\" (UniqueName: \"kubernetes.io/projected/fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d-kube-api-access-m7rj4\") pod \"swift-operator-controller-manager-5f8c65bbfc-c8h5n\" (UID: \"fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.419338 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flrqt\" (UniqueName: \"kubernetes.io/projected/a6494f1e-bf6a-469e-a9ab-fdc9211668c9-kube-api-access-flrqt\") pod \"octavia-operator-controller-manager-998648c74-qlbvt\" (UID: \"a6494f1e-bf6a-469e-a9ab-fdc9211668c9\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.419455 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8qqn\" (UniqueName: \"kubernetes.io/projected/e9fe3d7c-713c-4b55-9775-c557846cae41-kube-api-access-r8qqn\") pod \"placement-operator-controller-manager-78f8948974-qrkfp\" (UID: \"e9fe3d7c-713c-4b55-9775-c557846cae41\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.419557 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.419600 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.420097 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.420163 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.424363 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert podName:b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:05.424346229 +0000 UTC m=+1045.663019741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert") pod "infra-operator-controller-manager-57548d458d-6xbpf" (UID: "b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.424280 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.424704 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert podName:94e40b0c-5c93-4c26-b9c1-3c9240a8a44e nodeName:}" failed. No retries permitted until 2025-12-05 16:33:04.924695619 +0000 UTC m=+1045.163369131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" (UID: "94e40b0c-5c93-4c26-b9c1-3c9240a8a44e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.431058 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.444569 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lc6fw" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.446425 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.500157 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flrqt\" (UniqueName: \"kubernetes.io/projected/a6494f1e-bf6a-469e-a9ab-fdc9211668c9-kube-api-access-flrqt\") pod \"octavia-operator-controller-manager-998648c74-qlbvt\" (UID: \"a6494f1e-bf6a-469e-a9ab-fdc9211668c9\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.507686 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7rj4\" (UniqueName: \"kubernetes.io/projected/fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d-kube-api-access-m7rj4\") pod \"swift-operator-controller-manager-5f8c65bbfc-c8h5n\" (UID: \"fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.508706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8qqn\" (UniqueName: \"kubernetes.io/projected/e9fe3d7c-713c-4b55-9775-c557846cae41-kube-api-access-r8qqn\") pod \"placement-operator-controller-manager-78f8948974-qrkfp\" (UID: \"e9fe3d7c-713c-4b55-9775-c557846cae41\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.509179 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rgvg\" (UniqueName: \"kubernetes.io/projected/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-kube-api-access-5rgvg\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.523934 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn5qs\" (UniqueName: \"kubernetes.io/projected/76126169-0993-4057-9698-56bb61a8872f-kube-api-access-gn5qs\") pod \"test-operator-controller-manager-5854674fcc-22zvw\" (UID: \"76126169-0993-4057-9698-56bb61a8872f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.524353 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz592\" (UniqueName: \"kubernetes.io/projected/fd6cc312-becc-437b-bfd6-e461f3ae0912-kube-api-access-tz592\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2x8ln\" (UID: \"fd6cc312-becc-437b-bfd6-e461f3ae0912\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.529568 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.530457 
4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.532099 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.532434 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.532571 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kmhsn" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.533717 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.538446 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.546023 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.547322 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.549012 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-29vz5" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.553964 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd"] Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.630617 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtltr\" (UniqueName: \"kubernetes.io/projected/dd84c3d4-9a84-4418-8950-c550a63f018a-kube-api-access-wtltr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l5gdd\" (UID: \"dd84c3d4-9a84-4418-8950-c550a63f018a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.630920 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.630969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn5qs\" (UniqueName: \"kubernetes.io/projected/76126169-0993-4057-9698-56bb61a8872f-kube-api-access-gn5qs\") pod \"test-operator-controller-manager-5854674fcc-22zvw\" (UID: \"76126169-0993-4057-9698-56bb61a8872f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.630988 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz592\" (UniqueName: 
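The "Caches populated for *v1.Secret" lines come from reflectors the kubelet starts so that secrets referenced by the new pods (dockercfg pull secrets, cert secrets) are served from a local cache rather than fetched per use. A rough client-go equivalent is sketched below; this is not the kubelet's exact mechanism, and clientset construction is omitted:

    package sketch

    import (
    	"fmt"
    	"time"

    	"k8s.io/client-go/informers"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/cache"
    )

    // watchSecrets lists/watches Secrets in one namespace and reports when the
    // local cache is populated - the moment the kubelet logs above mark.
    func watchSecrets(clientset kubernetes.Interface) {
    	factory := informers.NewSharedInformerFactoryWithOptions(
    		clientset, 10*time.Minute, informers.WithNamespace("openstack-operators"))
    	secrets := factory.Core().V1().Secrets().Informer()
    	stop := make(chan struct{})
    	factory.Start(stop)
    	if !cache.WaitForCacheSync(stop, secrets.HasSynced) {
    		fmt.Println("cache never synced")
    		return
    	}
    	fmt.Println("Caches populated for *v1.Secret")
    }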
\"kubernetes.io/projected/fd6cc312-becc-437b-bfd6-e461f3ae0912-kube-api-access-tz592\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2x8ln\" (UID: \"fd6cc312-becc-437b-bfd6-e461f3ae0912\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.631007 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz7mb\" (UniqueName: \"kubernetes.io/projected/e92435ef-34fe-44e9-bf45-7900fed5f199-kube-api-access-gz7mb\") pod \"watcher-operator-controller-manager-769dc69bc-rdp64\" (UID: \"e92435ef-34fe-44e9-bf45-7900fed5f199\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.631026 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bfvb\" (UniqueName: \"kubernetes.io/projected/f5709380-2e15-4b5b-94a0-196fcdfc73d3-kube-api-access-5bfvb\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.631064 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.667499 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn5qs\" (UniqueName: \"kubernetes.io/projected/76126169-0993-4057-9698-56bb61a8872f-kube-api-access-gn5qs\") pod \"test-operator-controller-manager-5854674fcc-22zvw\" (UID: \"76126169-0993-4057-9698-56bb61a8872f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.672790 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.675021 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz592\" (UniqueName: \"kubernetes.io/projected/fd6cc312-becc-437b-bfd6-e461f3ae0912-kube-api-access-tz592\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2x8ln\" (UID: \"fd6cc312-becc-437b-bfd6-e461f3ae0912\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.702296 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.731763 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.731833 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtltr\" (UniqueName: \"kubernetes.io/projected/dd84c3d4-9a84-4418-8950-c550a63f018a-kube-api-access-wtltr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l5gdd\" (UID: \"dd84c3d4-9a84-4418-8950-c550a63f018a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.731873 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.731926 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz7mb\" (UniqueName: \"kubernetes.io/projected/e92435ef-34fe-44e9-bf45-7900fed5f199-kube-api-access-gz7mb\") pod \"watcher-operator-controller-manager-769dc69bc-rdp64\" (UID: \"e92435ef-34fe-44e9-bf45-7900fed5f199\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.731958 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bfvb\" (UniqueName: \"kubernetes.io/projected/f5709380-2e15-4b5b-94a0-196fcdfc73d3-kube-api-access-5bfvb\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.732459 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.732541 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.732553 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:05.232527593 +0000 UTC m=+1045.471201105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "metrics-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.732666 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:05.232644946 +0000 UTC m=+1045.471318508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "webhook-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.765615 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bfvb\" (UniqueName: \"kubernetes.io/projected/f5709380-2e15-4b5b-94a0-196fcdfc73d3-kube-api-access-5bfvb\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.765847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtltr\" (UniqueName: \"kubernetes.io/projected/dd84c3d4-9a84-4418-8950-c550a63f018a-kube-api-access-wtltr\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l5gdd\" (UID: \"dd84c3d4-9a84-4418-8950-c550a63f018a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.771042 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz7mb\" (UniqueName: \"kubernetes.io/projected/e92435ef-34fe-44e9-bf45-7900fed5f199-kube-api-access-gz7mb\") pod \"watcher-operator-controller-manager-769dc69bc-rdp64\" (UID: \"e92435ef-34fe-44e9-bf45-7900fed5f199\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.802387 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.886870 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.895890 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.924741 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" Dec 05 16:33:04 crc kubenswrapper[4756]: I1205 16:33:04.941101 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.941300 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:04 crc kubenswrapper[4756]: E1205 16:33:04.941363 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert podName:94e40b0c-5c93-4c26-b9c1-3c9240a8a44e nodeName:}" failed. No retries permitted until 2025-12-05 16:33:05.941338133 +0000 UTC m=+1046.180011635 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" (UID: "94e40b0c-5c93-4c26-b9c1-3c9240a8a44e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.024167 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7"] Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.068758 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp"] Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.101481 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n"] Dec 05 16:33:05 crc kubenswrapper[4756]: W1205 16:33:05.145347 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced54d49_8a17_4bba_a783_c0f69ed32a7a.slice/crio-af5df99cb46f3e26fb8bedc299993ee60bc1579d3a09d9699fa8993962727005 WatchSource:0}: Error finding container af5df99cb46f3e26fb8bedc299993ee60bc1579d3a09d9699fa8993962727005: Status 404 returned error can't find the container with id af5df99cb46f3e26fb8bedc299993ee60bc1579d3a09d9699fa8993962727005 Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.245300 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.245449 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 
16:33:05.245626 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.245692 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:06.245676433 +0000 UTC m=+1046.484349935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "webhook-server-cert" not found Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.246084 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.246131 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:06.246107365 +0000 UTC m=+1046.484780867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "metrics-server-cert" not found Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.335647 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" event={"ID":"9354b938-780a-4431-a060-48ad59f1bbec","Type":"ContainerStarted","Data":"2543de61f509b1fa3184e6ee7837965240f870757fa06cb6649c967ee56d3709"} Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.336789 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" event={"ID":"b7a067e2-8f42-4eec-9b62-623498c27f31","Type":"ContainerStarted","Data":"cbca92539132c21d3d7f4a742b8143f650a05046bf1071a21a138aa136946cbc"} Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.338405 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" event={"ID":"ced54d49-8a17-4bba-a783-c0f69ed32a7a","Type":"ContainerStarted","Data":"af5df99cb46f3e26fb8bedc299993ee60bc1579d3a09d9699fa8993962727005"} Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.448376 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.448575 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.448632 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert 
podName:b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:07.448616905 +0000 UTC m=+1047.687290417 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert") pod "infra-operator-controller-manager-57548d458d-6xbpf" (UID: "b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.490064 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4"] Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.557067 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp"] Dec 05 16:33:05 crc kubenswrapper[4756]: W1205 16:33:05.558089 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc274847_f26a_4043_86f6_60af2577a321.slice/crio-70752d20856079f8cf623af14b7446e9950aab64f8d8e78ee6aab0c5daa50fd4 WatchSource:0}: Error finding container 70752d20856079f8cf623af14b7446e9950aab64f8d8e78ee6aab0c5daa50fd4: Status 404 returned error can't find the container with id 70752d20856079f8cf623af14b7446e9950aab64f8d8e78ee6aab0c5daa50fd4 Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.722601 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc"] Dec 05 16:33:05 crc kubenswrapper[4756]: W1205 16:33:05.728909 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95cbe112_4bfb_463a_99d3_15283473d2c6.slice/crio-9d4bdecababc26588fbd99df4641ead01c0ec2de059c08ded0aa13ede0730398 WatchSource:0}: Error finding container 9d4bdecababc26588fbd99df4641ead01c0ec2de059c08ded0aa13ede0730398: Status 404 returned error can't find the container with id 9d4bdecababc26588fbd99df4641ead01c0ec2de059c08ded0aa13ede0730398 Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.737953 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr"] Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.741637 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl"] Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.758867 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc"] Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.771474 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24"] Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.786106 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m"] Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.793685 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4"] Dec 05 16:33:05 crc kubenswrapper[4756]: W1205 16:33:05.793813 4756 manager.go:1169] Failed to process watch event {EventType:0 
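The "Failed to process watch event ... Status 404" warnings from cAdvisor (manager.go) are a benign race: a cgroup watch event arrives for a container that the runtime has already removed, so the follow-up lookup 404s. The usual handling is to tolerate the missing container rather than fail the watch loop, roughly as in this sketch (all names invented):

    package sketch

    import (
    	"errors"
    	"log"
    )

    // errContainerGone stands in for the 404 the runtime returns once the
    // container named in a cgroup watch event has already been deleted.
    var errContainerGone = errors.New("can't find the container")

    // handleWatchEvent tolerates the lookup race instead of failing the watcher.
    func handleWatchEvent(id string, lookup func(string) error) {
    	if err := lookup(id); err != nil {
    		if errors.Is(err, errContainerGone) {
    			log.Printf("Failed to process watch event for %s: %v", id, err)
    			return // benign: the container exited before we could inspect it
    		}
    		log.Printf("watch event for %s: %v", id, err) // other errors would be retried
    	}
    }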
Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.800870 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n"]
Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.942320 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt"]
Dec 05 16:33:05 crc kubenswrapper[4756]: W1205 16:33:05.942537 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76126169_0993_4057_9698_56bb61a8872f.slice/crio-03a0729f948a35a24f629ce61f51a5dfb9e4d08976207395df2b75015d172cf8 WatchSource:0}: Error finding container 03a0729f948a35a24f629ce61f51a5dfb9e4d08976207395df2b75015d172cf8: Status 404 returned error can't find the container with id 03a0729f948a35a24f629ce61f51a5dfb9e4d08976207395df2b75015d172cf8
Dec 05 16:33:05 crc kubenswrapper[4756]: W1205 16:33:05.942948 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd84c3d4_9a84_4418_8950_c550a63f018a.slice/crio-fd649502c4bf3500154249dae71db91d1f292f53ef89301d7382b525b8d9a71d WatchSource:0}: Error finding container fd649502c4bf3500154249dae71db91d1f292f53ef89301d7382b525b8d9a71d: Status 404 returned error can't find the container with id fd649502c4bf3500154249dae71db91d1f292f53ef89301d7382b525b8d9a71d
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.944395 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gn5qs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-22zvw_openstack-operators(76126169-0993-4057-9698-56bb61a8872f): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.953257 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd"]
Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.956457 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8"
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.956613 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.956716 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert podName:94e40b0c-5c93-4c26-b9c1-3c9240a8a44e nodeName:}" failed. No retries permitted until 2025-12-05 16:33:07.956687647 +0000 UTC m=+1048.195361209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" (UID: "94e40b0c-5c93-4c26-b9c1-3c9240a8a44e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
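The burst of "ErrImagePull: pull QPS exceeded" errors that follows is the kubelet's own image-pull throttle, not a registry failure: pulls are gated by a token bucket sized by the KubeletConfiguration fields registryPullQPS and registryBurst (defaults 5 and 10), and starting roughly twenty operator pods at once drains it. A sketch of the same token-bucket behaviour with client-go's flowcontrol package (illustrative; not the kubelet's actual call site):

    package main

    import (
    	"fmt"

    	"k8s.io/client-go/util/flowcontrol"
    )

    func main() {
    	// qps=5, burst=10 mirror the KubeletConfiguration defaults for
    	// registryPullQPS and registryBurst.
    	limiter := flowcontrol.NewTokenBucketRateLimiter(5, 10)
    	for i := 1; i <= 15; i++ {
    		if limiter.TryAccept() {
    			fmt.Printf("pull %d: allowed\n", i)
    		} else {
    			fmt.Printf("pull %d: pull QPS exceeded\n", i) // the ErrImagePull branch
    		}
    	}
    }

Because the failure is transient, the affected pods are retried with image-pull backoff and normally start on a later attempt; raising registryPullQPS/registryBurst (or setting registryPullQPS to 0 to disable the limit) is the usual mitigation on nodes that cold-start many pods.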
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" (UID: "94e40b0c-5c93-4c26-b9c1-3c9240a8a44e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.957724 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9nhmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-dnrr5_openstack-operators(6953530f-9e26-4c39-b015-bdadcf81aca2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.960246 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9nhmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-dnrr5_openstack-operators(6953530f-9e26-4c39-b015-bdadcf81aca2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.960315 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-22zvw"] Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.961914 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" podUID="6953530f-9e26-4c39-b015-bdadcf81aca2" Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.968485 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flrqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-qlbvt_openstack-operators(a6494f1e-bf6a-469e-a9ab-fdc9211668c9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.969807 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln"] Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.970631 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flrqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-qlbvt_openstack-operators(a6494f1e-bf6a-469e-a9ab-fdc9211668c9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.972917 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" podUID="a6494f1e-bf6a-469e-a9ab-fdc9211668c9" Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.975118 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5"] Dec 05 16:33:05 crc 
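The liveness and readiness probes visible in these container dumps (HTTP GET /healthz and /readyz on port 8081) follow the controller-runtime convention signalled by the --health-probe-bind-address=:8081 flag. The operator managers register those endpoints roughly as in this sketch (illustrative; the real wiring in each operator may differ):

    package sketch

    import (
    	ctrl "sigs.k8s.io/controller-runtime"
    	"sigs.k8s.io/controller-runtime/pkg/healthz"
    )

    // run wires the /healthz and /readyz endpoints that the kubelet probes
    // in the container specs above target on port 8081.
    func run() error {
    	mgr, err := ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{
    		HealthProbeBindAddress: ":8081", // matches --health-probe-bind-address=:8081
    	})
    	if err != nil {
    		return err
    	}
    	if err := mgr.AddHealthzCheck("healthz", healthz.Ping); err != nil {
    		return err
    	}
    	if err := mgr.AddReadyzCheck("readyz", healthz.Ping); err != nil {
    		return err
    	}
    	return mgr.Start(ctrl.SetupSignalHandler())
    }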
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.986809 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtltr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-l5gdd_openstack-operators(dd84c3d4-9a84-4418-8950-c550a63f018a): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.987005 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tz592,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-2x8ln_openstack-operators(fd6cc312-becc-437b-bfd6-e461f3ae0912): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.988602 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" podUID="dd84c3d4-9a84-4418-8950-c550a63f018a"
Dec 05 16:33:05 crc kubenswrapper[4756]: I1205 16:33:05.993134 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64"]
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.995682 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tz592,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-2x8ln_openstack-operators(fd6cc312-becc-437b-bfd6-e461f3ae0912): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.996294 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gz7mb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-rdp64_openstack-operators(e92435ef-34fe-44e9-bf45-7900fed5f199): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.996813 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" podUID="fd6cc312-becc-437b-bfd6-e461f3ae0912"
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.998146 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gz7mb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-rdp64_openstack-operators(e92435ef-34fe-44e9-bf45-7900fed5f199): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.998225 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r8qqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-qrkfp_openstack-operators(e9fe3d7c-713c-4b55-9775-c557846cae41): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:33:05 crc kubenswrapper[4756]: E1205 16:33:05.999413 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" podUID="e92435ef-34fe-44e9-bf45-7900fed5f199" Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.001490 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r8qqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-qrkfp_openstack-operators(e9fe3d7c-713c-4b55-9775-c557846cae41): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.002689 4756 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" podUID="e9fe3d7c-713c-4b55-9775-c557846cae41" Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.272961 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.273184 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.273449 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.273518 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:08.273503135 +0000 UTC m=+1048.512176647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "webhook-server-cert" not found Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.273697 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.273854 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:08.273822783 +0000 UTC m=+1048.512496295 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "metrics-server-cert" not found Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.348922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" event={"ID":"ccaea4ee-e7e9-451f-a1e3-fef7402f356b","Type":"ContainerStarted","Data":"d072c53ebcd071d5e5e8438e61dbac5ce0fc25d930e6955fa16a96cffa4085e9"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.350595 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" event={"ID":"626bcac2-4a19-4f34-83be-1f7261f44161","Type":"ContainerStarted","Data":"1ba5d4483bc44a953d1b83c74c619f850ea6b8802f769ea8a8edcf925d414ee0"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.351781 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" event={"ID":"a6494f1e-bf6a-469e-a9ab-fdc9211668c9","Type":"ContainerStarted","Data":"a629b96f00d1c2342183e5698812700beb79c5efd2c7f8f2a406328ae2a4e16c"} Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.354539 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" podUID="a6494f1e-bf6a-469e-a9ab-fdc9211668c9" Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.355500 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" event={"ID":"dd84c3d4-9a84-4418-8950-c550a63f018a","Type":"ContainerStarted","Data":"fd649502c4bf3500154249dae71db91d1f292f53ef89301d7382b525b8d9a71d"} Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.357094 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" podUID="dd84c3d4-9a84-4418-8950-c550a63f018a" Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.362568 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" event={"ID":"2afcdd94-bf46-446a-b9d1-ff49260594d9","Type":"ContainerStarted","Data":"2292ebeab4bd4bc055c7eda57e04e38046a6cc64c0f75d71b475b5379875d225"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.372345 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" event={"ID":"bc274847-f26a-4043-86f6-60af2577a321","Type":"ContainerStarted","Data":"70752d20856079f8cf623af14b7446e9950aab64f8d8e78ee6aab0c5daa50fd4"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 
16:33:06.385114 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" event={"ID":"fd6cc312-becc-437b-bfd6-e461f3ae0912","Type":"ContainerStarted","Data":"dbcd3514505d90a45a5e4e91fe56325f4790faa0cf79b0fb05025c7fae2103c4"} Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.390463 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" podUID="fd6cc312-becc-437b-bfd6-e461f3ae0912" Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.391404 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" event={"ID":"fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d","Type":"ContainerStarted","Data":"0c54c748032f3a3b492cf83a0d7a581fe97e8eb5605c0052c1d7ba3b45cee2fa"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.392835 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" event={"ID":"6953530f-9e26-4c39-b015-bdadcf81aca2","Type":"ContainerStarted","Data":"37065d4a04f21d0c29d1df08f866f92b8f3b7af1b35a2bed462eb39d7945d3e3"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.396788 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" event={"ID":"6013b2f1-3b04-47f0-bb2c-f741587c8efc","Type":"ContainerStarted","Data":"6f9efdc44d8b8c66a3e2f8c78ca175cbe9014286028217cf7c33e40c2ad41e65"} Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.398827 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" podUID="6953530f-9e26-4c39-b015-bdadcf81aca2" Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.400164 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" event={"ID":"e9fe3d7c-713c-4b55-9775-c557846cae41","Type":"ContainerStarted","Data":"b3a313f032097a90c5f72de6dba5e819361401d12f0388d776a889693647de40"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.401552 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" event={"ID":"e92435ef-34fe-44e9-bf45-7900fed5f199","Type":"ContainerStarted","Data":"f8b4c0d3e5c659360f69ea7e57aaf3c87d36852cf008068fe7a2c85a40ddb15a"} Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.704655 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" podUID="e9fe3d7c-713c-4b55-9775-c557846cae41" Dec 05 16:33:06 crc kubenswrapper[4756]: E1205 16:33:06.705578 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" podUID="e92435ef-34fe-44e9-bf45-7900fed5f199" Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.706088 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" event={"ID":"76126169-0993-4057-9698-56bb61a8872f","Type":"ContainerStarted","Data":"03a0729f948a35a24f629ce61f51a5dfb9e4d08976207395df2b75015d172cf8"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.711332 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" event={"ID":"95cbe112-4bfb-463a-99d3-15283473d2c6","Type":"ContainerStarted","Data":"9d4bdecababc26588fbd99df4641ead01c0ec2de059c08ded0aa13ede0730398"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.714859 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" event={"ID":"156f45a8-a698-4f99-93c3-d8cc8a40d14f","Type":"ContainerStarted","Data":"e8bab76d2f6fc49706369efa6b371a48a1ceb1042c241ee2fb10a59b3ebd8ff2"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.729848 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" event={"ID":"79ee86d5-861a-4a03-b57a-9b382be05361","Type":"ContainerStarted","Data":"fa579864df7a388d155e441639857615aef5cf4fe800bc13a742f204e82d4f44"} Dec 05 16:33:06 crc kubenswrapper[4756]: I1205 16:33:06.740024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" event={"ID":"482f7c65-7376-4794-9308-062ed6209299","Type":"ContainerStarted","Data":"3ce3a15bdcd5139ac941de10159eed668316015e978a25f8678b11e2b31c492c"} Dec 05 16:33:07 crc kubenswrapper[4756]: I1205 16:33:07.513581 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:07 crc kubenswrapper[4756]: E1205 16:33:07.513776 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:07 crc kubenswrapper[4756]: E1205 16:33:07.513968 4756 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert podName:b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:11.513950739 +0000 UTC m=+1051.752624251 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert") pod "infra-operator-controller-manager-57548d458d-6xbpf" (UID: "b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:07 crc kubenswrapper[4756]: E1205 16:33:07.749145 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" podUID="e9fe3d7c-713c-4b55-9775-c557846cae41" Dec 05 16:33:07 crc kubenswrapper[4756]: E1205 16:33:07.749324 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" podUID="dd84c3d4-9a84-4418-8950-c550a63f018a" Dec 05 16:33:07 crc kubenswrapper[4756]: E1205 16:33:07.749481 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" podUID="fd6cc312-becc-437b-bfd6-e461f3ae0912" Dec 05 16:33:07 crc kubenswrapper[4756]: E1205 16:33:07.749815 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" podUID="6953530f-9e26-4c39-b015-bdadcf81aca2" Dec 05 16:33:07 crc kubenswrapper[4756]: E1205 16:33:07.749928 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" podUID="e92435ef-34fe-44e9-bf45-7900fed5f199" Dec 05 16:33:07 crc kubenswrapper[4756]: E1205 16:33:07.750010 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" podUID="a6494f1e-bf6a-469e-a9ab-fdc9211668c9" Dec 05 16:33:08 crc kubenswrapper[4756]: I1205 16:33:08.020499 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:08 crc kubenswrapper[4756]: E1205 16:33:08.020709 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:08 crc kubenswrapper[4756]: E1205 16:33:08.020791 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert podName:94e40b0c-5c93-4c26-b9c1-3c9240a8a44e nodeName:}" failed. No retries permitted until 2025-12-05 16:33:12.020772237 +0000 UTC m=+1052.259445759 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" (UID: "94e40b0c-5c93-4c26-b9c1-3c9240a8a44e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:08 crc kubenswrapper[4756]: I1205 16:33:08.323872 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:08 crc kubenswrapper[4756]: I1205 16:33:08.324049 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:08 crc kubenswrapper[4756]: E1205 16:33:08.324098 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:33:08 crc kubenswrapper[4756]: E1205 16:33:08.324165 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. 
No retries permitted until 2025-12-05 16:33:12.324144501 +0000 UTC m=+1052.562818013 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "metrics-server-cert" not found Dec 05 16:33:08 crc kubenswrapper[4756]: E1205 16:33:08.324410 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:33:08 crc kubenswrapper[4756]: E1205 16:33:08.324669 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:12.324628764 +0000 UTC m=+1052.563302446 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "webhook-server-cert" not found Dec 05 16:33:11 crc kubenswrapper[4756]: I1205 16:33:11.580164 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:11 crc kubenswrapper[4756]: E1205 16:33:11.580365 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:11 crc kubenswrapper[4756]: E1205 16:33:11.580966 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert podName:b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:19.580944363 +0000 UTC m=+1059.819617875 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert") pod "infra-operator-controller-manager-57548d458d-6xbpf" (UID: "b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38") : secret "infra-operator-webhook-server-cert" not found Dec 05 16:33:12 crc kubenswrapper[4756]: I1205 16:33:12.091191 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:12 crc kubenswrapper[4756]: E1205 16:33:12.091368 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:12 crc kubenswrapper[4756]: E1205 16:33:12.091526 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert podName:94e40b0c-5c93-4c26-b9c1-3c9240a8a44e nodeName:}" failed. 
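[annotation] The MountVolume.SetUp failures in this stretch all have the same shape: the pod spec references a certificate Secret that does not yet exist in openstack-operators, so the kubelet cannot set up the volume and the pod stays in ContainerCreating. A quick client-go probe for the missing names (namespace and secret names copied from the log; that a certificate issuer eventually creates them is an assumption based on the successful mounts later in the excerpt):

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the usual kubeconfig (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// The four secrets the kubelet reports as "not found" in this excerpt.
	for _, name := range []string{
		"webhook-server-cert",
		"metrics-server-cert",
		"infra-operator-webhook-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
	} {
		_, err := client.CoreV1().Secrets("openstack-operators").Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("%-50s missing (%v)\n", name, err)
			continue
		}
		fmt.Printf("%-50s present\n", name)
	}
}
```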
No retries permitted until 2025-12-05 16:33:20.091471962 +0000 UTC m=+1060.330145474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert") pod "openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" (UID: "94e40b0c-5c93-4c26-b9c1-3c9240a8a44e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 16:33:12 crc kubenswrapper[4756]: I1205 16:33:12.395986 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:12 crc kubenswrapper[4756]: I1205 16:33:12.396087 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:12 crc kubenswrapper[4756]: E1205 16:33:12.396272 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:33:12 crc kubenswrapper[4756]: E1205 16:33:12.396282 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:33:12 crc kubenswrapper[4756]: E1205 16:33:12.396332 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:20.396314695 +0000 UTC m=+1060.634988207 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "metrics-server-cert" not found Dec 05 16:33:12 crc kubenswrapper[4756]: E1205 16:33:12.396372 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:20.396351046 +0000 UTC m=+1060.635024608 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "webhook-server-cert" not found Dec 05 16:33:19 crc kubenswrapper[4756]: I1205 16:33:19.644688 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:19 crc kubenswrapper[4756]: I1205 16:33:19.653883 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38-cert\") pod \"infra-operator-controller-manager-57548d458d-6xbpf\" (UID: \"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:19 crc kubenswrapper[4756]: I1205 16:33:19.915334 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" Dec 05 16:33:20 crc kubenswrapper[4756]: I1205 16:33:20.152232 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:20 crc kubenswrapper[4756]: I1205 16:33:20.156170 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94e40b0c-5c93-4c26-b9c1-3c9240a8a44e-cert\") pod \"openstack-baremetal-operator-controller-manager-55c85496f5k5nb8\" (UID: \"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:20 crc kubenswrapper[4756]: I1205 16:33:20.358676 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" Dec 05 16:33:20 crc kubenswrapper[4756]: I1205 16:33:20.457337 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:20 crc kubenswrapper[4756]: I1205 16:33:20.457463 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:20 crc kubenswrapper[4756]: E1205 16:33:20.457519 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 16:33:20 crc kubenswrapper[4756]: E1205 16:33:20.457579 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:36.457562132 +0000 UTC m=+1076.696235644 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "metrics-server-cert" not found Dec 05 16:33:20 crc kubenswrapper[4756]: E1205 16:33:20.457630 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 16:33:20 crc kubenswrapper[4756]: E1205 16:33:20.457694 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs podName:f5709380-2e15-4b5b-94a0-196fcdfc73d3 nodeName:}" failed. No retries permitted until 2025-12-05 16:33:36.457674565 +0000 UTC m=+1076.696348157 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs") pod "openstack-operator-controller-manager-54bdf956c4-b8ghg" (UID: "f5709380-2e15-4b5b-94a0-196fcdfc73d3") : secret "webhook-server-cert" not found Dec 05 16:33:22 crc kubenswrapper[4756]: I1205 16:33:22.296097 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:33:22 crc kubenswrapper[4756]: I1205 16:33:22.296493 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:33:22 crc kubenswrapper[4756]: I1205 16:33:22.296556 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:33:22 crc kubenswrapper[4756]: I1205 16:33:22.297300 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e8a4fad4de22e09f52b10e225d79410b313d20ac3252ed14a260ce12e986a83e"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:33:22 crc kubenswrapper[4756]: I1205 16:33:22.297372 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://e8a4fad4de22e09f52b10e225d79410b313d20ac3252ed14a260ce12e986a83e" gracePeriod=600 Dec 05 16:33:22 crc kubenswrapper[4756]: I1205 16:33:22.989721 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="e8a4fad4de22e09f52b10e225d79410b313d20ac3252ed14a260ce12e986a83e" exitCode=0 Dec 05 16:33:22 crc kubenswrapper[4756]: I1205 16:33:22.989772 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"e8a4fad4de22e09f52b10e225d79410b313d20ac3252ed14a260ce12e986a83e"} Dec 05 16:33:22 crc kubenswrapper[4756]: I1205 16:33:22.989812 4756 scope.go:117] "RemoveContainer" containerID="af109c606f5763151a202cd55133291df026de6e76c3ec3a043528aa8e422ca0" Dec 05 16:33:23 crc kubenswrapper[4756]: E1205 16:33:23.339867 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 05 16:33:23 crc kubenswrapper[4756]: E1205 16:33:23.340338 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7j4b7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-25dl4_openstack-operators(156f45a8-a698-4f99-93c3-d8cc8a40d14f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:33:27 crc kubenswrapper[4756]: E1205 16:33:27.164635 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 05 16:33:27 crc kubenswrapper[4756]: E1205 16:33:27.165165 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n62lj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-pvn24_openstack-operators(79ee86d5-861a-4a03-b57a-9b382be05361): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:33:36 crc kubenswrapper[4756]: I1205 16:33:36.503311 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:36 crc kubenswrapper[4756]: I1205 16:33:36.504059 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:36 crc kubenswrapper[4756]: I1205 16:33:36.514404 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-webhook-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:36 crc kubenswrapper[4756]: I1205 16:33:36.514553 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5709380-2e15-4b5b-94a0-196fcdfc73d3-metrics-certs\") pod \"openstack-operator-controller-manager-54bdf956c4-b8ghg\" (UID: \"f5709380-2e15-4b5b-94a0-196fcdfc73d3\") " pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" Dec 05 16:33:36 crc kubenswrapper[4756]: I1205 16:33:36.704215 4756 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg"
Dec 05 16:33:38 crc kubenswrapper[4756]: E1205 16:33:38.247889 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59"
Dec 05 16:33:38 crc kubenswrapper[4756]: E1205 16:33:38.248368 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9nhmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-dnrr5_openstack-operators(6953530f-9e26-4c39-b015-bdadcf81aca2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:33:41 crc kubenswrapper[4756]: I1205 16:33:41.373503 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 16:33:41 crc kubenswrapper[4756]: E1205 16:33:41.964378 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168"
Dec 05 16:33:41 crc kubenswrapper[4756]: E1205 16:33:41.964606 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flrqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-qlbvt_openstack-operators(a6494f1e-bf6a-469e-a9ab-fdc9211668c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:33:43 crc kubenswrapper[4756]: E1205 16:33:43.678654 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621"
Dec 05 16:33:43 crc kubenswrapper[4756]: E1205 16:33:43.679096 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gz7mb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-rdp64_openstack-operators(e92435ef-34fe-44e9-bf45-7900fed5f199): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:33:44 crc kubenswrapper[4756]: E1205 16:33:44.185490 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385"
Dec 05 16:33:44 crc kubenswrapper[4756]: E1205 16:33:44.185765 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tz592,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-2x8ln_openstack-operators(fd6cc312-becc-437b-bfd6-e461f3ae0912): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:33:44 crc kubenswrapper[4756]: I1205 16:33:44.636504 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8"]
Dec 05 16:33:44 crc kubenswrapper[4756]: I1205 16:33:44.644623 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf"]
Dec 05 16:33:44 crc kubenswrapper[4756]: E1205 16:33:44.731722 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2"
Dec 05 16:33:44 crc kubenswrapper[4756]: E1205 16:33:44.731901 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtltr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-l5gdd_openstack-operators(dd84c3d4-9a84-4418-8950-c550a63f018a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 16:33:44 crc kubenswrapper[4756]: E1205 16:33:44.733785 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" podUID="dd84c3d4-9a84-4418-8950-c550a63f018a"
Dec 05 16:33:44 crc kubenswrapper[4756]: W1205 16:33:44.816055 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e40b0c_5c93_4c26_b9c1_3c9240a8a44e.slice/crio-151d2e8e08f929cd475d9a1f62ad64737965f9d5ffd855405bb0eef47cb873ae WatchSource:0}: Error finding container 151d2e8e08f929cd475d9a1f62ad64737965f9d5ffd855405bb0eef47cb873ae: Status 404 returned error can't find the container with id 151d2e8e08f929cd475d9a1f62ad64737965f9d5ffd855405bb0eef47cb873ae
Dec 05 16:33:44 crc kubenswrapper[4756]: W1205 16:33:44.820238 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8e9cd2a_e3bd_4fd4_b4d7_f84806772b38.slice/crio-09044b145fd0b04c7b90b944b93eefc5b986d0455176abc758f31bd00cc3c342 WatchSource:0}: Error finding container 09044b145fd0b04c7b90b944b93eefc5b986d0455176abc758f31bd00cc3c342: Status 404 returned error can't find the container with id 09044b145fd0b04c7b90b944b93eefc5b986d0455176abc758f31bd00cc3c342
Dec 05 16:33:45 crc kubenswrapper[4756]: I1205 16:33:45.146031 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"314160243b8e9a333f056208abaaefb43e7de34a3e41ec79e08a896638376f87"}
Dec 05 16:33:45 crc kubenswrapper[4756]: I1205 16:33:45.147246 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" event={"ID":"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38","Type":"ContainerStarted","Data":"09044b145fd0b04c7b90b944b93eefc5b986d0455176abc758f31bd00cc3c342"}
Dec 05 16:33:45 crc kubenswrapper[4756]: I1205 16:33:45.148700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" event={"ID":"b7a067e2-8f42-4eec-9b62-623498c27f31","Type":"ContainerStarted","Data":"ca17445b97cd7481e1b2739337d34fb299f9d8a20114a9a249756c0230c45366"}
Dec 05 16:33:45 crc kubenswrapper[4756]: I1205 16:33:45.153698 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" event={"ID":"ced54d49-8a17-4bba-a783-c0f69ed32a7a","Type":"ContainerStarted","Data":"f5935802dcdb59938bb2b01aa4ab29f01bb6dfcf4ed0556837d3d9d0c159cc5b"}
Dec 05 16:33:45 crc kubenswrapper[4756]: I1205 16:33:45.154551 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" event={"ID":"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e","Type":"ContainerStarted","Data":"151d2e8e08f929cd475d9a1f62ad64737965f9d5ffd855405bb0eef47cb873ae"}
Dec 05 16:33:45 crc kubenswrapper[4756]: I1205 16:33:45.536465 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg"]
Dec 05 16:33:45 crc kubenswrapper[4756]: W1205 16:33:45.589271 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5709380_2e15_4b5b_94a0_196fcdfc73d3.slice/crio-da6350957edd64c45c47eaf399dd36106492b852635b78af74aa7c6e0807886f WatchSource:0}: Error finding container da6350957edd64c45c47eaf399dd36106492b852635b78af74aa7c6e0807886f: Status 404 returned error can't find the container with id da6350957edd64c45c47eaf399dd36106492b852635b78af74aa7c6e0807886f
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.197623 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" event={"ID":"626bcac2-4a19-4f34-83be-1f7261f44161","Type":"ContainerStarted","Data":"a867088ecd2b9191bf488767a0e03985c419db273819c7b6ec6637cc6eea6616"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.213838 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" event={"ID":"2afcdd94-bf46-446a-b9d1-ff49260594d9","Type":"ContainerStarted","Data":"eac03275391043381e26412650c160d40cf16edb37daf4ad256d0d76e576b53d"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.218227 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" event={"ID":"f5709380-2e15-4b5b-94a0-196fcdfc73d3","Type":"ContainerStarted","Data":"da6350957edd64c45c47eaf399dd36106492b852635b78af74aa7c6e0807886f"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.246624 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" event={"ID":"482f7c65-7376-4794-9308-062ed6209299","Type":"ContainerStarted","Data":"38a080d9d32ae0a195a221cd371cd9b628e444c5f98e0b95f68fe6e3301c0d39"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.251378 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" event={"ID":"fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d","Type":"ContainerStarted","Data":"c7dfea5ccb31201d671fd0cc1482db30b3873c6240155a709e07e0e51d95fe30"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.263435 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" event={"ID":"bc274847-f26a-4043-86f6-60af2577a321","Type":"ContainerStarted","Data":"e352c1da9a83118a6be3846c54002269b6d9af9d9a4680f16b270f15771e3dce"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.284645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" event={"ID":"95cbe112-4bfb-463a-99d3-15283473d2c6","Type":"ContainerStarted","Data":"cc751b6c19becdc9406ed181a357278b571d32d8fb51cf2c60e90fcc55c43376"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.312910 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" event={"ID":"6013b2f1-3b04-47f0-bb2c-f741587c8efc","Type":"ContainerStarted","Data":"5dc19228bd8a1f7e16533bf8b35f21a085abccad16dbb92844cc6e5d152a6662"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.319199 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" event={"ID":"9354b938-780a-4431-a060-48ad59f1bbec","Type":"ContainerStarted","Data":"0c40e411e53ec9b3a111a551122ec36876268146cb09513d89e48ad61d611769"}
Dec 05 16:33:46 crc kubenswrapper[4756]: I1205 16:33:46.322825 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" event={"ID":"ccaea4ee-e7e9-451f-a1e3-fef7402f356b","Type":"ContainerStarted","Data":"36ae3bb6bc627e3e614d112c0164d9b0d1d15d0313c374a68f3e66af30dc0c8e"}
Dec 05 16:33:46 crc kubenswrapper[4756]: E1205 16:33:46.600849 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage867063748/10\": happened during read: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0"
Dec 05 16:33:46 crc kubenswrapper[4756]: E1205 16:33:46.601203 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gn5qs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-22zvw_openstack-operators(76126169-0993-4057-9698-56bb61a8872f): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage867063748/10\": happened during read: context canceled" logger="UnhandledError"
Dec 05 16:33:46 crc kubenswrapper[4756]: E1205 16:33:46.602324 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage867063748/10\\\": happened during read: context canceled\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" podUID="76126169-0993-4057-9698-56bb61a8872f"
Dec 05 16:33:47 crc kubenswrapper[4756]: I1205 16:33:47.333641 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" event={"ID":"e9fe3d7c-713c-4b55-9775-c557846cae41","Type":"ContainerStarted","Data":"39e39214480601c025c4129efab3315233a4545749eec757a2f706f5fe692b64"}
Dec 05 16:33:47 crc kubenswrapper[4756]: I1205 16:33:47.337163 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" event={"ID":"f5709380-2e15-4b5b-94a0-196fcdfc73d3","Type":"ContainerStarted","Data":"60b1649eca738f735ad0480643cbd7ea0bd468ca9c114ece8171e60bbe063c62"}
Dec 05 16:33:47 crc kubenswrapper[4756]: I1205 16:33:47.337567 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg"
Dec 05 16:33:47 crc kubenswrapper[4756]: I1205 16:33:47.366658 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg" podStartSLOduration=43.366640888 podStartE2EDuration="43.366640888s" podCreationTimestamp="2025-12-05 16:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:33:47.362879026 +0000 UTC m=+1087.601552568" watchObservedRunningTime="2025-12-05 16:33:47.366640888 +0000 UTC m=+1087.605314410"
Dec 05 16:33:51 crc kubenswrapper[4756]: I1205 16:33:51.393227 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" event={"ID":"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38","Type":"ContainerStarted","Data":"6e16e8e44ec5a07b9fded30c93659a4dec8cb2ebb9130d622d86ee6d60975602"}
Dec 05 16:33:51 crc kubenswrapper[4756]: E1205 16:33:51.580700 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" podUID="156f45a8-a698-4f99-93c3-d8cc8a40d14f"
Dec 05 16:33:51 crc kubenswrapper[4756]: E1205 16:33:51.857091 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" podUID="fd6cc312-becc-437b-bfd6-e461f3ae0912"
Dec 05 16:33:51 crc kubenswrapper[4756]: E1205 16:33:51.881056 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" podUID="79ee86d5-861a-4a03-b57a-9b382be05361"
Dec 05 16:33:51 crc kubenswrapper[4756]: E1205 16:33:51.883751 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" podUID="a6494f1e-bf6a-469e-a9ab-fdc9211668c9"
Dec 05 16:33:52 crc kubenswrapper[4756]: E1205 16:33:52.250224 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" podUID="e92435ef-34fe-44e9-bf45-7900fed5f199"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.400624 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" event={"ID":"ced54d49-8a17-4bba-a783-c0f69ed32a7a","Type":"ContainerStarted","Data":"b436fa630f0a7b22e45b27b9b2a40e5021c6d0f516922c0c17dbc0d852ba46ae"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.401595 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.402923 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" event={"ID":"2afcdd94-bf46-446a-b9d1-ff49260594d9","Type":"ContainerStarted","Data":"4c535acd16eece99781bd12c56e1c99a18aa25202880c7d8e7434b9a0ed507e7"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.403169 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.403314 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.405866 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.406539 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" event={"ID":"ccaea4ee-e7e9-451f-a1e3-fef7402f356b","Type":"ContainerStarted","Data":"76b10c7b1a0a511de8eeab8b88c378b40b79ae21c055ac55507daf1a3273248b"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.406664 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.407897 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" event={"ID":"fd6cc312-becc-437b-bfd6-e461f3ae0912","Type":"ContainerStarted","Data":"dc21c40abf26be5d46dae943c7e9b985385ec2a36ac231a7e51250ff6a7b2a93"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.409039 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl"
Dec 05 16:33:52 crc kubenswrapper[4756]: E1205 16:33:52.409726 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" podUID="fd6cc312-becc-437b-bfd6-e461f3ae0912"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.410229 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" event={"ID":"482f7c65-7376-4794-9308-062ed6209299","Type":"ContainerStarted","Data":"dda7066039f6726e7a211e299f2d6e5f6952532279b03a09175a3d6079f44b7b"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.410894 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.412326 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" event={"ID":"156f45a8-a698-4f99-93c3-d8cc8a40d14f","Type":"ContainerStarted","Data":"66416922ebb7ad886e8e7f6180afb212ad3a5df12cd61a3e9f1e87feb4019d66"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.414812 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" event={"ID":"b7a067e2-8f42-4eec-9b62-623498c27f31","Type":"ContainerStarted","Data":"81c6e98212ca132ab22965fc0e9f4b602c04cc62c7ddea39a6d39ce82f78801e"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.415528 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.416872 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" event={"ID":"e92435ef-34fe-44e9-bf45-7900fed5f199","Type":"ContainerStarted","Data":"bdd25532526faaf5b8fb49e1b2f0f280d83f3cffba3c70c59077ce8366096df0"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.417449 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n"
Dec 05 16:33:52 crc kubenswrapper[4756]: E1205 16:33:52.417702 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" podUID="e92435ef-34fe-44e9-bf45-7900fed5f199"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.418370 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.420019 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" event={"ID":"e9fe3d7c-713c-4b55-9775-c557846cae41","Type":"ContainerStarted","Data":"c3b6fba452884c17eab313a81704b07e9dbf87f1b7d440e29c4f036c612e159a"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.420290 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.422227 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" event={"ID":"fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d","Type":"ContainerStarted","Data":"49e2a4a6da778f18f265ad4b95e18f9d946eaa40603d8371aee10ffc0d99d222"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.423279 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.426216 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.427046 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.427404 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" event={"ID":"a6494f1e-bf6a-469e-a9ab-fdc9211668c9","Type":"ContainerStarted","Data":"cbce2791f8b5b9e8b3f6ffeb48a7f91adbbf6750e2880ace2c025970dede078b"}
Dec 05 16:33:52 crc kubenswrapper[4756]: E1205 16:33:52.429165 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" podUID="a6494f1e-bf6a-469e-a9ab-fdc9211668c9"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.430368 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" event={"ID":"79ee86d5-861a-4a03-b57a-9b382be05361","Type":"ContainerStarted","Data":"aec3fd491d0e7dc4e064040c1054bca7b65c5f990f4dc7696f110aacfb6ec2cc"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.430639 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" podStartSLOduration=3.502273026 podStartE2EDuration="49.430621597s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.158586222 +0000 UTC m=+1045.397259734" lastFinishedPulling="2025-12-05 16:33:51.086934793 +0000 UTC m=+1091.325608305" observedRunningTime="2025-12-05 16:33:52.424873601 +0000 UTC m=+1092.663547123" watchObservedRunningTime="2025-12-05 16:33:52.430621597 +0000 UTC m=+1092.669295099"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.433087 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" event={"ID":"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e","Type":"ContainerStarted","Data":"1f846ffc88de5dd36be1f68fa51f7c61244889e9fa56f538ae451d861c04cb92"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.435019 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" event={"ID":"9354b938-780a-4431-a060-48ad59f1bbec","Type":"ContainerStarted","Data":"a69de244d94c53eb88a54cc967cd16378d65bbb2b220e861f01cc683cbb253ba"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.435234 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.439442 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" event={"ID":"626bcac2-4a19-4f34-83be-1f7261f44161","Type":"ContainerStarted","Data":"74cc53ed82618a2b829a86a894bdbe98e16e709b6b57a60ca4aada56f9670c46"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.440047 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.441743 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.442739 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" event={"ID":"95cbe112-4bfb-463a-99d3-15283473d2c6","Type":"ContainerStarted","Data":"c7d15f6319ceaf2cf703641b387fd43ad2fb206f4e532b98321717d7b45d9519"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.442906 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.444507 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.444611 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" event={"ID":"6013b2f1-3b04-47f0-bb2c-f741587c8efc","Type":"ContainerStarted","Data":"110065539d6b302b4cd1d5adbaa85938a9b5ee27e25205983ce6f8a84ab74d6b"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.444879 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.449749 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.455027 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.458631 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" event={"ID":"76126169-0993-4057-9698-56bb61a8872f","Type":"ContainerStarted","Data":"2896c91df2c481d6ea0bcb62e1fd64b4a800d37af6faac9fb063eb7d70f5f2af"}
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.460982 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-kbg5m" podStartSLOduration=4.329293235 podStartE2EDuration="49.46097077s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.791607862 +0000 UTC m=+1046.030281374" lastFinishedPulling="2025-12-05 16:33:50.923285387 +0000 UTC m=+1091.161958909" observedRunningTime="2025-12-05 16:33:52.458357939 +0000 UTC m=+1092.697031451" watchObservedRunningTime="2025-12-05 16:33:52.46097077 +0000 UTC m=+1092.699644282"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.654429 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-9lzlc" podStartSLOduration=4.466662597 podStartE2EDuration="49.654401203s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.74064353 +0000 UTC m=+1045.979317042" lastFinishedPulling="2025-12-05 16:33:50.928382136 +0000 UTC m=+1091.167055648" observedRunningTime="2025-12-05 16:33:52.621401549 +0000 UTC m=+1092.860075061" watchObservedRunningTime="2025-12-05 16:33:52.654401203 +0000 UTC m=+1092.893074715"
Dec 05 16:33:52 crc kubenswrapper[4756]: E1205 16:33:52.654725 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" podUID="6953530f-9e26-4c39-b015-bdadcf81aca2"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.710954 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qrkfp" podStartSLOduration=4.630448078 podStartE2EDuration="49.710936326s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.998080349 +0000 UTC m=+1046.236753861" lastFinishedPulling="2025-12-05 16:33:51.078568587 +0000 UTC m=+1091.317242109" observedRunningTime="2025-12-05 16:33:52.69263624 +0000 UTC m=+1092.931309762" watchObservedRunningTime="2025-12-05 16:33:52.710936326 +0000 UTC m=+1092.949609878"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.740740 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-njt8n" podStartSLOduration=3.753438545 podStartE2EDuration="49.740718623s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.16402466 +0000 UTC m=+1045.402698172" lastFinishedPulling="2025-12-05 16:33:51.151304728 +0000 UTC m=+1091.389978250" observedRunningTime="2025-12-05 16:33:52.663994243 +0000 UTC m=+1092.902667775" watchObservedRunningTime="2025-12-05 16:33:52.740718623 +0000 UTC m=+1092.979392135"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.767306 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c8h5n" podStartSLOduration=4.480512954 podStartE2EDuration="49.767266113s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.801594343 +0000 UTC m=+1046.040267855" lastFinishedPulling="2025-12-05 16:33:51.088347492 +0000 UTC m=+1091.327021014" observedRunningTime="2025-12-05 16:33:52.730957878 +0000 UTC m=+1092.969631410" watchObservedRunningTime="2025-12-05 16:33:52.767266113 +0000 UTC m=+1093.005939625"
Dec 05 16:33:52 crc kubenswrapper[4756]: I1205 16:33:52.796373 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-m8qwl" podStartSLOduration=4.6407869779999995 podStartE2EDuration="49.796358271s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.748064552 +0000 UTC m=+1045.986738064" lastFinishedPulling="2025-12-05 16:33:50.903635835 +0000 UTC m=+1091.142309357" observedRunningTime="2025-12-05 16:33:52.793808552 +0000 UTC m=+1093.032482064" watchObservedRunningTime="2025-12-05 16:33:52.796358271 +0000 UTC m=+1093.035031783"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.029441 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r6xbc" podStartSLOduration=4.832223898 podStartE2EDuration="50.029424149s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.772892455 +0000 UTC m=+1046.011565967" lastFinishedPulling="2025-12-05 16:33:50.970092666 +0000 UTC m=+1091.208766218" observedRunningTime="2025-12-05 16:33:53.024250199 +0000 UTC m=+1093.262923731" watchObservedRunningTime="2025-12-05 16:33:53.029424149 +0000 UTC m=+1093.268097651"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.061051 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-r7qh7" podStartSLOduration=4.131292858 podStartE2EDuration="50.061034486s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.145021965 +0000 UTC m=+1045.383695467" lastFinishedPulling="2025-12-05 16:33:51.074763563 +0000 UTC m=+1091.313437095" observedRunningTime="2025-12-05 16:33:53.058714863 +0000 UTC m=+1093.297388375" watchObservedRunningTime="2025-12-05 16:33:53.061034486 +0000 UTC m=+1093.299707998"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.127201 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fcwhr" podStartSLOduration=4.787631588 podStartE2EDuration="50.127185899s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.738133472 +0000 UTC m=+1045.976806984" lastFinishedPulling="2025-12-05 16:33:51.077687773 +0000 UTC m=+1091.316361295" observedRunningTime="2025-12-05 16:33:53.09071641 +0000 UTC m=+1093.329389922" watchObservedRunningTime="2025-12-05 16:33:53.127185899 +0000 UTC m=+1093.365859411"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.127854 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-7qxl4" podStartSLOduration=4.536677476 podStartE2EDuration="50.127847687s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.500505791 +0000 UTC m=+1045.739179303" lastFinishedPulling="2025-12-05 16:33:51.091676002 +0000 UTC m=+1091.330349514" observedRunningTime="2025-12-05 16:33:53.122734888 +0000 UTC m=+1093.361408400" watchObservedRunningTime="2025-12-05 16:33:53.127847687 +0000 UTC m=+1093.366521199"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.466809 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" event={"ID":"156f45a8-a698-4f99-93c3-d8cc8a40d14f","Type":"ContainerStarted","Data":"dc6857953a82e298c53bc2f837f49eac1c38c374b068e6b232be116610211eff"}
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.467004 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.472117 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" event={"ID":"79ee86d5-861a-4a03-b57a-9b382be05361","Type":"ContainerStarted","Data":"d843c3cdc26120769b2e152febc153e735910b10c07cd9a81699186845a18731"}
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.472300 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.474239 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" event={"ID":"76126169-0993-4057-9698-56bb61a8872f","Type":"ContainerStarted","Data":"747e1f06826ee6b469f6ff7d246f497546809d32d1bf1f53616b30363d765148"}
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.474647 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.476676 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" event={"ID":"94e40b0c-5c93-4c26-b9c1-3c9240a8a44e","Type":"ContainerStarted","Data":"067ed6bb26a5424d1445e1285ffbac4ff323737bffac9ce7105b7042107df393"}
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.477334 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.480182 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" event={"ID":"bc274847-f26a-4043-86f6-60af2577a321","Type":"ContainerStarted","Data":"560149dc0431fb9630a05e684b06fc0bdc79703296bccf99c9cc1908ee2bf4a0"}
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.481041 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.483758 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.485121 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" event={"ID":"b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38","Type":"ContainerStarted","Data":"a672502b47fd3ecabd8c68e47540dce27c7e319276d2cd394830b7f666cf20f6"}
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.485381 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.487448 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" event={"ID":"6953530f-9e26-4c39-b015-bdadcf81aca2","Type":"ContainerStarted","Data":"e48142ed944f5b3c4d01302cac6b9fc0c2b82715a7ac3796cfd715b9ef839c26"}
Dec 05 16:33:53 crc kubenswrapper[4756]: E1205 16:33:53.488575 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" podUID="6953530f-9e26-4c39-b015-bdadcf81aca2"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.492722 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4" podStartSLOduration=3.363480513 podStartE2EDuration="50.492706167s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.789290209 +0000 UTC m=+1046.027963721" lastFinishedPulling="2025-12-05 16:33:52.918515863 +0000 UTC m=+1093.157189375" observedRunningTime="2025-12-05 16:33:53.488046761 +0000 UTC m=+1093.726720273" watchObservedRunningTime="2025-12-05 16:33:53.492706167 +0000 UTC m=+1093.731379679"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.530902 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8" podStartSLOduration=44.417284162 podStartE2EDuration="50.530886022s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:44.818504656 +0000 UTC m=+1085.057178168" lastFinishedPulling="2025-12-05 16:33:50.932106496 +0000 UTC m=+1091.170780028" observedRunningTime="2025-12-05 16:33:53.522943817 +0000 UTC m=+1093.761617329" watchObservedRunningTime="2025-12-05 16:33:53.530886022 +0000 UTC m=+1093.769559534"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.576213 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw" podStartSLOduration=4.550250221 podStartE2EDuration="49.57619602s" podCreationTimestamp="2025-12-05 16:33:04 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.944283671 +0000 UTC m=+1046.182957183" lastFinishedPulling="2025-12-05 16:33:50.97022946 +0000 UTC m=+1091.208902982" observedRunningTime="2025-12-05 16:33:53.573817936 +0000 UTC m=+1093.812491458" watchObservedRunningTime="2025-12-05 16:33:53.57619602 +0000 UTC m=+1093.814869542"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.607200 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rm4xp" podStartSLOduration=5.014584851 podStartE2EDuration="50.60718435s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.560289152 +0000 UTC m=+1045.798962664" lastFinishedPulling="2025-12-05 16:33:51.152888641 +0000 UTC m=+1091.391562163" observedRunningTime="2025-12-05 16:33:53.604519248 +0000 UTC m=+1093.843192760" watchObservedRunningTime="2025-12-05 16:33:53.60718435 +0000 UTC m=+1093.845857862"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.633844 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24" podStartSLOduration=3.491761521 podStartE2EDuration="50.633816602s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.777463699 +0000 UTC m=+1046.016137211" lastFinishedPulling="2025-12-05 16:33:52.91951878 +0000 UTC m=+1093.158192292" observedRunningTime="2025-12-05 16:33:53.633035561 +0000 UTC m=+1093.871709073" watchObservedRunningTime="2025-12-05 16:33:53.633816602 +0000 UTC m=+1093.872490114"
Dec 05 16:33:53 crc kubenswrapper[4756]: I1205 16:33:53.658151 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf" podStartSLOduration=44.584434252 podStartE2EDuration="50.658122871s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:44.829955196 +0000 UTC m=+1085.068628728" lastFinishedPulling="2025-12-05 16:33:50.903643795 +0000 UTC m=+1091.142317347" observedRunningTime="2025-12-05 16:33:53.656085636 +0000 UTC m=+1093.894759168" watchObservedRunningTime="2025-12-05 16:33:53.658122871 +0000 UTC m=+1093.896796383"
Dec 05 16:33:55 crc kubenswrapper[4756]: E1205 16:33:55.051600 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" podUID="dd84c3d4-9a84-4418-8950-c550a63f018a"
Dec 05 16:33:56 crc kubenswrapper[4756]: I1205 16:33:56.715642 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54bdf956c4-b8ghg"
Dec 05 16:33:59 crc kubenswrapper[4756]: I1205 16:33:59.925338 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6xbpf"
Dec 05 16:34:00 crc kubenswrapper[4756]: I1205 16:34:00.366450 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55c85496f5k5nb8"
Dec 05 16:34:03 crc kubenswrapper[4756]: E1205 16:34:03.053192 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" podUID="e92435ef-34fe-44e9-bf45-7900fed5f199"
Dec 05 16:34:03 crc kubenswrapper[4756]: E1205 16:34:03.053374 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" podUID="fd6cc312-becc-437b-bfd6-e461f3ae0912"
Dec 05 16:34:04 crc kubenswrapper[4756]: I1205 16:34:04.294347 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pvn24"
Dec 05 16:34:04 crc kubenswrapper[4756]: I1205 16:34:04.321714 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-25dl4"
Dec 05 16:34:04 crc kubenswrapper[4756]: I1205 16:34:04.890770 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-22zvw"
Dec 05 16:34:06 crc kubenswrapper[4756]: I1205 16:34:06.594630 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" event={"ID":"a6494f1e-bf6a-469e-a9ab-fdc9211668c9","Type":"ContainerStarted","Data":"655c525392c008ef052c3c01176e3e8ec1ec9d0fbd0061f20e5be93935f924b9"}
Dec 05 16:34:06 crc kubenswrapper[4756]: I1205 16:34:06.595482 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt"
Dec 05 16:34:06 crc kubenswrapper[4756]: I1205 16:34:06.632183 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt" podStartSLOduration=4.106127143 podStartE2EDuration="1m3.632157187s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.968330742 +0000 UTC m=+1046.207004254" lastFinishedPulling="2025-12-05 16:34:05.494360776 +0000 UTC m=+1105.733034298" observedRunningTime="2025-12-05 16:34:06.627208793 +0000 UTC m=+1106.865882355" watchObservedRunningTime="2025-12-05 16:34:06.632157187 +0000 UTC m=+1106.870830709"
Dec 05 16:34:07 crc kubenswrapper[4756]: I1205 16:34:07.603758 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" event={"ID":"dd84c3d4-9a84-4418-8950-c550a63f018a","Type":"ContainerStarted","Data":"6f7590db33b013507fe2aa2d4e0af95c04d600dcd574dbb4618ec6f7f4e8dfc8"}
Dec 05 16:34:07 crc kubenswrapper[4756]: I1205 16:34:07.621388 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l5gdd" podStartSLOduration=3.109487827 podStartE2EDuration="1m3.62137155s" podCreationTimestamp="2025-12-05 16:33:04 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.986637149 +0000 UTC m=+1046.225310661" lastFinishedPulling="2025-12-05 16:34:06.498520832 +0000 UTC m=+1106.737194384" observedRunningTime="2025-12-05 16:34:07.619621613 +0000 UTC m=+1107.858295135" watchObservedRunningTime="2025-12-05 16:34:07.62137155 +0000 UTC m=+1107.860045062"
Dec 05 16:34:10 crc kubenswrapper[4756]: I1205 16:34:10.626451 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" event={"ID":"6953530f-9e26-4c39-b015-bdadcf81aca2","Type":"ContainerStarted","Data":"0e74915da9614262975f47dba0eff9a8ec143a682d8cf0c526ad4f9c8f318dc0"}
Dec 05 16:34:10 crc kubenswrapper[4756]: I1205 16:34:10.627256 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5"
Dec 05 16:34:10 crc kubenswrapper[4756]: I1205 16:34:10.651078 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5" podStartSLOduration=4.057433448 podStartE2EDuration="1m7.651058681s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.957591291 +0000 UTC m=+1046.196264813" lastFinishedPulling="2025-12-05 16:34:09.551216534 +0000 UTC m=+1109.789890046" observedRunningTime="2025-12-05 16:34:10.649437717 +0000 UTC m=+1110.888111259" watchObservedRunningTime="2025-12-05 16:34:10.651058681 +0000 UTC m=+1110.889732203"
Dec 05 16:34:14 crc kubenswrapper[4756]: I1205 16:34:14.438576 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnrr5"
Dec 05 16:34:14 crc kubenswrapper[4756]: I1205 16:34:14.675852 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qlbvt"
Dec 05 16:34:21 crc kubenswrapper[4756]: I1205 16:34:21.732246 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" event={"ID":"fd6cc312-becc-437b-bfd6-e461f3ae0912","Type":"ContainerStarted","Data":"e3e21eec9feb9528c40abd811d90f9cf7954c9ec033f923e4c7487ff01cde73d"}
Dec 05 16:34:21 crc kubenswrapper[4756]: I1205 16:34:21.732942 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln"
Dec 05 16:34:21 crc kubenswrapper[4756]: I1205 16:34:21.733741 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" event={"ID":"e92435ef-34fe-44e9-bf45-7900fed5f199","Type":"ContainerStarted","Data":"d11b1a99182d517ee57bc2aad677daa4414582bbc336f65b81e75929db9e2c4b"}
Dec 05 16:34:21 crc kubenswrapper[4756]: I1205 16:34:21.734074 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64"
Dec 05 16:34:21 crc kubenswrapper[4756]: I1205 16:34:21.757521 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln" podStartSLOduration=3.447171211 podStartE2EDuration="1m18.757494753s" podCreationTimestamp="2025-12-05 16:33:03 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.986913716 +0000 UTC m=+1046.225587228" lastFinishedPulling="2025-12-05 16:34:21.297237258 +0000 UTC m=+1121.535910770" observedRunningTime="2025-12-05 16:34:21.754239135 +0000 UTC m=+1121.992912697" watchObservedRunningTime="2025-12-05 16:34:21.757494753 +0000 UTC m=+1121.996168305"
Dec 05 16:34:21 crc kubenswrapper[4756]: I1205 16:34:21.776238 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64" podStartSLOduration=2.776998169 podStartE2EDuration="1m17.776222508s" podCreationTimestamp="2025-12-05 16:33:04 +0000 UTC" firstStartedPulling="2025-12-05 16:33:05.996129906 +0000 UTC m=+1046.234803408" lastFinishedPulling="2025-12-05 16:34:20.995354185 +0000 UTC m=+1121.234027747" observedRunningTime="2025-12-05 16:34:21.774278176 +0000 UTC m=+1122.012951688" watchObservedRunningTime="2025-12-05 16:34:21.776222508 +0000 UTC m=+1122.014896020"
Dec 05 16:34:34 crc kubenswrapper[4756]: I1205 16:34:34.705499 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2x8ln"
Dec 05 16:34:34 crc kubenswrapper[4756]: I1205 16:34:34.900196 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rdp64"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.150974 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-x6rb8"]
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.153857 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.159931 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.159970 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.160597 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.160658 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tbc9d"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.161861 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-x6rb8"]
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.184493 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t8xx\" (UniqueName: \"kubernetes.io/projected/f0f8a490-c56f-4a46-85c1-b609e9be9676-kube-api-access-5t8xx\") pod \"dnsmasq-dns-5cd484bb89-x6rb8\" (UID: \"f0f8a490-c56f-4a46-85c1-b609e9be9676\") " pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.184548 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f8a490-c56f-4a46-85c1-b609e9be9676-config\") pod \"dnsmasq-dns-5cd484bb89-x6rb8\" (UID: \"f0f8a490-c56f-4a46-85c1-b609e9be9676\") " pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.218645 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567c455747-crzvw"]
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.221396 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-crzvw"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.226195 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.238636 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-crzvw"]
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.286224 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-dns-svc\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.286438 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t8xx\" (UniqueName: \"kubernetes.io/projected/f0f8a490-c56f-4a46-85c1-b609e9be9676-kube-api-access-5t8xx\") pod \"dnsmasq-dns-5cd484bb89-x6rb8\" (UID: \"f0f8a490-c56f-4a46-85c1-b609e9be9676\") " pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.286530 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f8a490-c56f-4a46-85c1-b609e9be9676-config\") pod \"dnsmasq-dns-5cd484bb89-x6rb8\" (UID: \"f0f8a490-c56f-4a46-85c1-b609e9be9676\") " pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.286624 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2xqt\" (UniqueName: \"kubernetes.io/projected/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-kube-api-access-z2xqt\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.286686 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-config\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.287746 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f8a490-c56f-4a46-85c1-b609e9be9676-config\") pod \"dnsmasq-dns-5cd484bb89-x6rb8\" (UID: \"f0f8a490-c56f-4a46-85c1-b609e9be9676\") " pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.306506 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t8xx\" (UniqueName: \"kubernetes.io/projected/f0f8a490-c56f-4a46-85c1-b609e9be9676-kube-api-access-5t8xx\") pod \"dnsmasq-dns-5cd484bb89-x6rb8\" (UID: \"f0f8a490-c56f-4a46-85c1-b609e9be9676\") " pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8"
Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.387623 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2xqt\" (UniqueName: \"kubernetes.io/projected/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-kube-api-access-z2xqt\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw"
Dec 05 16:34:56
crc kubenswrapper[4756]: I1205 16:34:56.388865 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-config\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw" Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.388898 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-dns-svc\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw" Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.390010 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-dns-svc\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw" Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.391023 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-config\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw" Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.417397 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2xqt\" (UniqueName: \"kubernetes.io/projected/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-kube-api-access-z2xqt\") pod \"dnsmasq-dns-567c455747-crzvw\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " pod="openstack/dnsmasq-dns-567c455747-crzvw" Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.474796 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8" Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.539126 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-crzvw" Dec 05 16:34:56 crc kubenswrapper[4756]: I1205 16:34:56.927135 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-x6rb8"] Dec 05 16:34:56 crc kubenswrapper[4756]: W1205 16:34:56.935104 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f8a490_c56f_4a46_85c1_b609e9be9676.slice/crio-425c6dab33f79daeebbdd7199c9736b1fcdc858cca75c620cc5b57d156921d99 WatchSource:0}: Error finding container 425c6dab33f79daeebbdd7199c9736b1fcdc858cca75c620cc5b57d156921d99: Status 404 returned error can't find the container with id 425c6dab33f79daeebbdd7199c9736b1fcdc858cca75c620cc5b57d156921d99 Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.012291 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-crzvw"] Dec 05 16:34:57 crc kubenswrapper[4756]: W1205 16:34:57.026512 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26532a0f_2abc_4635_b4ef_b4e59fdefcbf.slice/crio-d862f9294c0495e2c482ce8b4457b34974716acde9bb94d06c19d95d99ffe8e8 WatchSource:0}: Error finding container d862f9294c0495e2c482ce8b4457b34974716acde9bb94d06c19d95d99ffe8e8: Status 404 returned error can't find the container with id d862f9294c0495e2c482ce8b4457b34974716acde9bb94d06c19d95d99ffe8e8 Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.032010 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8" event={"ID":"f0f8a490-c56f-4a46-85c1-b609e9be9676","Type":"ContainerStarted","Data":"425c6dab33f79daeebbdd7199c9736b1fcdc858cca75c620cc5b57d156921d99"} Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.178073 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-crzvw"] Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.213210 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-nxpgh"] Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.222019 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.233154 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-nxpgh"] Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.409184 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-config\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.409263 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.409344 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsxfz\" (UniqueName: \"kubernetes.io/projected/76a2778a-54e4-480c-b234-50c6f65c5d6c-kube-api-access-dsxfz\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.510813 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-config\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.510884 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.510933 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsxfz\" (UniqueName: \"kubernetes.io/projected/76a2778a-54e4-480c-b234-50c6f65c5d6c-kube-api-access-dsxfz\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.511989 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-config\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.512236 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.537262 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsxfz\" (UniqueName: 
\"kubernetes.io/projected/76a2778a-54e4-480c-b234-50c6f65c5d6c-kube-api-access-dsxfz\") pod \"dnsmasq-dns-bc4b48fc9-nxpgh\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:57 crc kubenswrapper[4756]: I1205 16:34:57.545130 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.035737 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-nxpgh"] Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.039406 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-crzvw" event={"ID":"26532a0f-2abc-4635-b4ef-b4e59fdefcbf","Type":"ContainerStarted","Data":"d862f9294c0495e2c482ce8b4457b34974716acde9bb94d06c19d95d99ffe8e8"} Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.250009 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-x6rb8"] Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.277382 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb666b895-ngrkn"] Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.286017 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.292696 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-ngrkn"] Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.335457 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-config\") pod \"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.335668 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-dns-svc\") pod \"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.335707 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk8rj\" (UniqueName: \"kubernetes.io/projected/ddc74b94-5abc-43e7-b429-040c492b88d1-kube-api-access-mk8rj\") pod \"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.383625 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.384748 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.387358 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.387651 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.388002 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.388159 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.388399 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.388563 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.388669 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kx5lg" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.430998 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.439049 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-dns-svc\") pod \"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.439157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk8rj\" (UniqueName: \"kubernetes.io/projected/ddc74b94-5abc-43e7-b429-040c492b88d1-kube-api-access-mk8rj\") pod \"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.439881 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-config\") pod \"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.441766 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-dns-svc\") pod \"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.441901 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-config\") pod \"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.466803 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk8rj\" (UniqueName: \"kubernetes.io/projected/ddc74b94-5abc-43e7-b429-040c492b88d1-kube-api-access-mk8rj\") pod 
\"dnsmasq-dns-cb666b895-ngrkn\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.541681 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.541739 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.541763 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.541803 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11d0ad48-c987-4415-8536-b150536534a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.541828 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.541852 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.542287 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2trz\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-kube-api-access-m2trz\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.542453 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.542518 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11d0ad48-c987-4415-8536-b150536534a5-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.542592 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.542615 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.643954 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.645174 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.645123 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.645260 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646248 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646403 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646440 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646482 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/11d0ad48-c987-4415-8536-b150536534a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646510 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646532 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646548 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2trz\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-kube-api-access-m2trz\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646572 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.646630 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11d0ad48-c987-4415-8536-b150536534a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.647141 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.647681 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.648567 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.648908 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 
16:34:58.650556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11d0ad48-c987-4415-8536-b150536534a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.652952 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.653886 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.657720 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.667235 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11d0ad48-c987-4415-8536-b150536534a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.667239 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2trz\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-kube-api-access-m2trz\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.686002 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " pod="openstack/rabbitmq-server-0" Dec 05 16:34:58 crc kubenswrapper[4756]: I1205 16:34:58.704027 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.064152 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" event={"ID":"76a2778a-54e4-480c-b234-50c6f65c5d6c","Type":"ContainerStarted","Data":"fe64a6379d71aa1a8d631b4c1dd9babb16d275d5c7d143555470827638a0612b"} Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.349693 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-ngrkn"] Dec 05 16:34:59 crc kubenswrapper[4756]: W1205 16:34:59.396546 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddc74b94_5abc_43e7_b429_040c492b88d1.slice/crio-542423a393b07e7bb9c261ec4e452892de1103ce455355a702e1479548e48857 WatchSource:0}: Error finding container 542423a393b07e7bb9c261ec4e452892de1103ce455355a702e1479548e48857: Status 404 returned error can't find the container with id 542423a393b07e7bb9c261ec4e452892de1103ce455355a702e1479548e48857 Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.409579 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.435542 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.457961 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fng9q" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.466202 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.466964 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.467171 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.484218 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.497493 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.497637 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.500646 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.505820 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586519 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586589 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586645 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586668 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn42p\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-kube-api-access-pn42p\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586693 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586723 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586755 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586775 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586819 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586837 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/331486fe-c403-4516-9568-862200ec0672-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.586856 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/331486fe-c403-4516-9568-862200ec0672-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695295 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695360 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695466 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695496 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn42p\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-kube-api-access-pn42p\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695532 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695605 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695701 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695748 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695804 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695807 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.695824 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/331486fe-c403-4516-9568-862200ec0672-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.698573 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/331486fe-c403-4516-9568-862200ec0672-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.700847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.703292 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.703955 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.704134 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.709007 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.709607 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/331486fe-c403-4516-9568-862200ec0672-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.717091 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/331486fe-c403-4516-9568-862200ec0672-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.720886 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn42p\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-kube-api-access-pn42p\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.723240 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.735685 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.739711 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:34:59 crc kubenswrapper[4756]: I1205 16:34:59.833825 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.116019 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-ngrkn" event={"ID":"ddc74b94-5abc-43e7-b429-040c492b88d1","Type":"ContainerStarted","Data":"542423a393b07e7bb9c261ec4e452892de1103ce455355a702e1479548e48857"} Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.127875 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11d0ad48-c987-4415-8536-b150536534a5","Type":"ContainerStarted","Data":"90e66243c1e0c6167189a5d79298fcd076235bdecc2da77cd2ead4fbdbb9fcf7"} Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.436082 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 16:35:00 crc kubenswrapper[4756]: W1205 16:35:00.473534 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331486fe_c403_4516_9568_862200ec0672.slice/crio-183eaa345ab9e899042dbd94af160bfe145d6f2ea24a9c9fc7ac9d7c89938422 WatchSource:0}: Error finding container 183eaa345ab9e899042dbd94af160bfe145d6f2ea24a9c9fc7ac9d7c89938422: Status 404 returned error can't find the container with id 183eaa345ab9e899042dbd94af160bfe145d6f2ea24a9c9fc7ac9d7c89938422 Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.595981 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.597390 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.602132 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.602150 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-br626" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.602203 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.602153 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.612357 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.615716 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.736393 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.736650 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-kolla-config\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.736704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55609ecb-b28c-488e-b80f-9add6d256016-config-data-generated\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.736736 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.736772 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-operator-scripts\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.736804 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n58v\" (UniqueName: \"kubernetes.io/projected/55609ecb-b28c-488e-b80f-9add6d256016-kube-api-access-7n58v\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.736841 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.736865 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-config-data-default\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.839850 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55609ecb-b28c-488e-b80f-9add6d256016-config-data-generated\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.839965 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.840245 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-operator-scripts\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.840308 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n58v\" (UniqueName: \"kubernetes.io/projected/55609ecb-b28c-488e-b80f-9add6d256016-kube-api-access-7n58v\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.840338 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.840391 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-config-data-default\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.840453 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.840547 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.841868 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55609ecb-b28c-488e-b80f-9add6d256016-config-data-generated\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.842207 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-kolla-config\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.842274 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.843491 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-operator-scripts\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.844637 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-config-data-default\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.852508 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.852698 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.861675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n58v\" (UniqueName: \"kubernetes.io/projected/55609ecb-b28c-488e-b80f-9add6d256016-kube-api-access-7n58v\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.866908 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") " pod="openstack/openstack-galera-0" Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.931114 4756 util.go:30] "No sandbox for pod can be found. 
Dec 05 16:35:00 crc kubenswrapper[4756]: I1205 16:35:00.931114 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.180586 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"331486fe-c403-4516-9568-862200ec0672","Type":"ContainerStarted","Data":"183eaa345ab9e899042dbd94af160bfe145d6f2ea24a9c9fc7ac9d7c89938422"}
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.581289 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.740185 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.741791 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.744282 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.745279 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qll5h"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.745403 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.745784 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.751650 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.858799 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.858933 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.859005 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.859038 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0"
Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.859059 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.859102 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.859126 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.859155 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7sks\" (UniqueName: \"kubernetes.io/projected/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kube-api-access-x7sks\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.960666 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.960730 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.960765 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7sks\" (UniqueName: \"kubernetes.io/projected/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kube-api-access-x7sks\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.960810 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.960849 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.960910 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.960946 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.960967 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.962482 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.962725 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.963773 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.965503 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.965701 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.968165 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.968872 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:01 crc kubenswrapper[4756]: I1205 16:35:01.979647 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7sks\" (UniqueName: \"kubernetes.io/projected/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kube-api-access-x7sks\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.000541 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") " pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.068050 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.204162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55609ecb-b28c-488e-b80f-9add6d256016","Type":"ContainerStarted","Data":"d8643f6472346503691bfeafbc54939d454b7c083a9706fcbc56a3fb5bc072a1"} Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.273461 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.274510 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.284138 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.284400 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hrmc5" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.284598 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.301561 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.369606 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vm8t\" (UniqueName: \"kubernetes.io/projected/032af055-c0a6-4fda-90c8-c34ef9278613-kube-api-access-7vm8t\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.369692 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-config-data\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.369740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-combined-ca-bundle\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.369797 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-kolla-config\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.369840 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-memcached-tls-certs\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.474774 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-combined-ca-bundle\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.474857 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-kolla-config\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.474893 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-memcached-tls-certs\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.474946 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vm8t\" (UniqueName: \"kubernetes.io/projected/032af055-c0a6-4fda-90c8-c34ef9278613-kube-api-access-7vm8t\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.474982 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-config-data\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.481236 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-memcached-tls-certs\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.481345 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-combined-ca-bundle\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.481618 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-config-data\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0" Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 
16:35:02.481955 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-kolla-config\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0"
Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.500904 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vm8t\" (UniqueName: \"kubernetes.io/projected/032af055-c0a6-4fda-90c8-c34ef9278613-kube-api-access-7vm8t\") pod \"memcached-0\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") " pod="openstack/memcached-0"
Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.615540 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 05 16:35:02 crc kubenswrapper[4756]: I1205 16:35:02.720547 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 16:35:03 crc kubenswrapper[4756]: I1205 16:35:03.204448 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 05 16:35:03 crc kubenswrapper[4756]: W1205 16:35:03.212543 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod032af055_c0a6_4fda_90c8_c34ef9278613.slice/crio-a0382226d9555c95b26872032e39d2b01e30bfae3f59087df67631d329270a69 WatchSource:0}: Error finding container a0382226d9555c95b26872032e39d2b01e30bfae3f59087df67631d329270a69: Status 404 returned error can't find the container with id a0382226d9555c95b26872032e39d2b01e30bfae3f59087df67631d329270a69
Dec 05 16:35:03 crc kubenswrapper[4756]: I1205 16:35:03.224042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"881d2b4e-3f2e-4aca-a965-2508edc92a3b","Type":"ContainerStarted","Data":"e44703d92d418215bf4058bbba49c988c5dd02387e4ce366e08a1a22a8daa411"}
Dec 05 16:35:03 crc kubenswrapper[4756]: I1205 16:35:03.902354 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 16:35:03 crc kubenswrapper[4756]: I1205 16:35:03.926817 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
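The W1205 manager.go:1169 warning above is cAdvisor racing container creation: the crio-a0382226d9... cgroup appears in its watch before CRI-O can answer a lookup for that ID, so the query returns 404. Shortly afterwards in this log the same ID is reported by PLEG as ContainerStarted for memcached-0, so the warning is transient. A sketch of how one might cross-check such an ID on the node, assuming SSH access and crictl (not part of the log):

  # Illustrative only; the sandbox ID is copied from the warning above.
  sudo crictl inspectp a0382226d9555c95b26872032e39d2b01e30bfae3f59087df67631d329270a69 | grep -E '"name"|"state"'
  sudo journalctl -u kubelet --since '16:35:03' | grep a0382226d9555c95   # the matching PLEG event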
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 16:35:03 crc kubenswrapper[4756]: I1205 16:35:03.933583 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zz2nq" Dec 05 16:35:04 crc kubenswrapper[4756]: I1205 16:35:04.012214 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctt8f\" (UniqueName: \"kubernetes.io/projected/d091b847-2b31-492a-a1b7-37e356e55e93-kube-api-access-ctt8f\") pod \"kube-state-metrics-0\" (UID: \"d091b847-2b31-492a-a1b7-37e356e55e93\") " pod="openstack/kube-state-metrics-0" Dec 05 16:35:04 crc kubenswrapper[4756]: I1205 16:35:04.115481 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctt8f\" (UniqueName: \"kubernetes.io/projected/d091b847-2b31-492a-a1b7-37e356e55e93-kube-api-access-ctt8f\") pod \"kube-state-metrics-0\" (UID: \"d091b847-2b31-492a-a1b7-37e356e55e93\") " pod="openstack/kube-state-metrics-0" Dec 05 16:35:04 crc kubenswrapper[4756]: I1205 16:35:04.156332 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctt8f\" (UniqueName: \"kubernetes.io/projected/d091b847-2b31-492a-a1b7-37e356e55e93-kube-api-access-ctt8f\") pod \"kube-state-metrics-0\" (UID: \"d091b847-2b31-492a-a1b7-37e356e55e93\") " pod="openstack/kube-state-metrics-0" Dec 05 16:35:04 crc kubenswrapper[4756]: I1205 16:35:04.244714 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"032af055-c0a6-4fda-90c8-c34ef9278613","Type":"ContainerStarted","Data":"a0382226d9555c95b26872032e39d2b01e30bfae3f59087df67631d329270a69"} Dec 05 16:35:04 crc kubenswrapper[4756]: I1205 16:35:04.278361 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.867140 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vsmjz"] Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.868650 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.870485 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nw7pt" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.870725 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.871995 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.880118 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vsmjz"] Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.930194 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-kdtg5"] Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.931733 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.955512 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kdtg5"] Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.992892 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run-ovn\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.992941 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc482\" (UniqueName: \"kubernetes.io/projected/359e87f3-8fb4-4476-a538-13c948b65885-kube-api-access-mc482\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.992973 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/359e87f3-8fb4-4476-a538-13c948b65885-scripts\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.993007 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.993027 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-ovn-controller-tls-certs\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.993047 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-log-ovn\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:07 crc kubenswrapper[4756]: I1205 16:35:07.993065 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-combined-ca-bundle\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.033483 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.037596 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.039991 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.039993 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.041015 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.041207 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-k9jq7" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.041509 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.044911 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-lib\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094092 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/359e87f3-8fb4-4476-a538-13c948b65885-scripts\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094123 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-etc-ovs\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094171 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094194 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40ff87cb-4d82-49d4-a4d9-55e104c9becb-scripts\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094223 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-ovn-controller-tls-certs\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094250 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-log-ovn\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094275 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-combined-ca-bundle\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094328 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-log\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094360 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxptp\" (UniqueName: \"kubernetes.io/projected/40ff87cb-4d82-49d4-a4d9-55e104c9becb-kube-api-access-kxptp\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094398 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-run\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094518 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run-ovn\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.094559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc482\" (UniqueName: \"kubernetes.io/projected/359e87f3-8fb4-4476-a538-13c948b65885-kube-api-access-mc482\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.095104 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.095187 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-log-ovn\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.095648 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run-ovn\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " 
pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.096830 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/359e87f3-8fb4-4476-a538-13c948b65885-scripts\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.100603 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-ovn-controller-tls-certs\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.101079 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-combined-ca-bundle\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.112268 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc482\" (UniqueName: \"kubernetes.io/projected/359e87f3-8fb4-4476-a538-13c948b65885-kube-api-access-mc482\") pod \"ovn-controller-vsmjz\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.187848 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196092 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196127 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196152 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196204 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40ff87cb-4d82-49d4-a4d9-55e104c9becb-scripts\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196238 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " 
pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196259 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196297 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-lib\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196315 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196333 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-etc-ovs\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196360 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4q5b\" (UniqueName: \"kubernetes.io/projected/3f72429c-8e1e-4c2d-8417-c499c22013a9-kube-api-access-k4q5b\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196375 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196404 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-log\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196439 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxptp\" (UniqueName: \"kubernetes.io/projected/40ff87cb-4d82-49d4-a4d9-55e104c9becb-kube-api-access-kxptp\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196465 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-run\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196773 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-lib\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5"
Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196824 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-etc-ovs\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5"
Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.196873 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-log\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5"
Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.197348 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-run\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5"
Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.198986 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40ff87cb-4d82-49d4-a4d9-55e104c9becb-scripts\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5"
Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.215256 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxptp\" (UniqueName: \"kubernetes.io/projected/40ff87cb-4d82-49d4-a4d9-55e104c9becb-kube-api-access-kxptp\") pod \"ovn-controller-ovs-kdtg5\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " pod="openstack/ovn-controller-ovs-kdtg5"
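Every pod in this capture carries a kube-api-access-* volume (kube-api-access-kxptp here). That is the standard projected service-account volume, which combines the bound token, the kube-root-ca.crt ConfigMap, and the namespace file into a single mount at /var/run/secrets/kubernetes.io/serviceaccount. A minimal way to inspect it, shown purely as an illustration and assuming jq is available:

  # Illustrative only; pod and volume names are from the entries above.
  kubectl -n openstack get pod ovn-controller-ovs-kdtg5 -o json \
    | jq '.spec.volumes[] | select(.name == "kube-api-access-kxptp")'
  # On the node, the projection materializes under the pod's volumes directory:
  ls /var/lib/kubelet/pods/40ff87cb-4d82-49d4-a4d9-55e104c9becb/volumes/kubernetes.io~projected/kube-api-access-kxptp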
Need to start a new one" pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.298109 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.298201 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.298247 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4q5b\" (UniqueName: \"kubernetes.io/projected/3f72429c-8e1e-4c2d-8417-c499c22013a9-kube-api-access-k4q5b\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.298270 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.298696 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.298936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.299022 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.299400 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.300867 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.300912 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.301001 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.302880 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.303668 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.310081 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.310617 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.317282 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4q5b\" (UniqueName: \"kubernetes.io/projected/3f72429c-8e1e-4c2d-8417-c499c22013a9-kube-api-access-k4q5b\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.320528 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:08 crc kubenswrapper[4756]: I1205 16:35:08.364333 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:11 crc kubenswrapper[4756]: I1205 16:35:11.964566 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 16:35:11 crc kubenswrapper[4756]: I1205 16:35:11.966534 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:11 crc kubenswrapper[4756]: I1205 16:35:11.970096 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 16:35:11 crc kubenswrapper[4756]: I1205 16:35:11.970215 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 16:35:11 crc kubenswrapper[4756]: I1205 16:35:11.970505 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 16:35:11 crc kubenswrapper[4756]: I1205 16:35:11.971132 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-sz9bz" Dec 05 16:35:11 crc kubenswrapper[4756]: I1205 16:35:11.977983 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.064626 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pldz2\" (UniqueName: \"kubernetes.io/projected/930a573c-21eb-46e3-827b-2ff167eecd4a-kube-api-access-pldz2\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.064666 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.064693 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.064712 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.064745 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.064762 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.064801 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.064863 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166142 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pldz2\" (UniqueName: \"kubernetes.io/projected/930a573c-21eb-46e3-827b-2ff167eecd4a-kube-api-access-pldz2\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166192 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166224 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166274 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166292 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166346 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166396 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166520 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.166960 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.172883 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.176983 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.180531 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.183856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pldz2\" (UniqueName: \"kubernetes.io/projected/930a573c-21eb-46e3-827b-2ff167eecd4a-kube-api-access-pldz2\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.203083 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.615447 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.615497 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:12 crc kubenswrapper[4756]: I1205 16:35:12.889225 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:18 crc kubenswrapper[4756]: E1205 16:35:18.970008 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a" Dec 05 16:35:18 crc kubenswrapper[4756]: E1205 16:35:18.971135 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7n58v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(55609ecb-b28c-488e-b80f-9add6d256016): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:35:18 crc kubenswrapper[4756]: E1205 16:35:18.972360 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="55609ecb-b28c-488e-b80f-9add6d256016" Dec 05 16:35:18 crc kubenswrapper[4756]: E1205 16:35:18.980931 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a" Dec 05 16:35:18 crc kubenswrapper[4756]: E1205 16:35:18.981081 4756 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x7sks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(881d2b4e-3f2e-4aca-a965-2508edc92a3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:35:18 crc kubenswrapper[4756]: E1205 16:35:18.982249 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" Dec 05 16:35:19 crc kubenswrapper[4756]: E1205 16:35:19.359265 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" Dec 05 16:35:19 crc kubenswrapper[4756]: E1205 16:35:19.360883 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a\\\"\"" pod="openstack/openstack-galera-0" 
podUID="55609ecb-b28c-488e-b80f-9add6d256016" Dec 05 16:35:19 crc kubenswrapper[4756]: E1205 16:35:19.980696 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d" Dec 05 16:35:19 crc kubenswrapper[4756]: E1205 16:35:19.980881 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2trz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(11d0ad48-c987-4415-8536-b150536534a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:35:19 crc kubenswrapper[4756]: E1205 16:35:19.982887 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="11d0ad48-c987-4415-8536-b150536534a5" Dec 05 16:35:19 crc kubenswrapper[4756]: E1205 16:35:19.997733 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d" Dec 05 16:35:19 crc kubenswrapper[4756]: E1205 16:35:19.997912 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pn42p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(331486fe-c403-4516-9568-862200ec0672): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:35:19 crc kubenswrapper[4756]: E1205 16:35:19.999131 4756 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="331486fe-c403-4516-9568-862200ec0672" Dec 05 16:35:20 crc kubenswrapper[4756]: E1205 16:35:20.366789 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d\\\"\"" pod="openstack/rabbitmq-server-0" podUID="11d0ad48-c987-4415-8536-b150536534a5" Dec 05 16:35:20 crc kubenswrapper[4756]: E1205 16:35:20.367170 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="331486fe-c403-4516-9568-862200ec0672" Dec 05 16:35:24 crc kubenswrapper[4756]: I1205 16:35:24.655770 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.945453 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.945940 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dsxfz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bc4b48fc9-nxpgh_openstack(76a2778a-54e4-480c-b234-50c6f65c5d6c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.948753 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" podUID="76a2778a-54e4-480c-b234-50c6f65c5d6c" Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.967877 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.968067 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2xqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-567c455747-crzvw_openstack(26532a0f-2abc-4635-b4ef-b4e59fdefcbf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.969720 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-567c455747-crzvw" podUID="26532a0f-2abc-4635-b4ef-b4e59fdefcbf" Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.991103 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.991277 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mk8rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-cb666b895-ngrkn_openstack(ddc74b94-5abc-43e7-b429-040c492b88d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:35:24 crc kubenswrapper[4756]: E1205 16:35:24.992705 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-cb666b895-ngrkn" podUID="ddc74b94-5abc-43e7-b429-040c492b88d1" Dec 05 16:35:25 crc kubenswrapper[4756]: E1205 16:35:25.008965 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 05 16:35:25 crc kubenswrapper[4756]: E1205 16:35:25.009136 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5t8xx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5cd484bb89-x6rb8_openstack(f0f8a490-c56f-4a46-85c1-b609e9be9676): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:35:25 crc kubenswrapper[4756]: E1205 16:35:25.010357 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8" podUID="f0f8a490-c56f-4a46-85c1-b609e9be9676" Dec 05 16:35:25 crc kubenswrapper[4756]: I1205 16:35:25.408896 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d091b847-2b31-492a-a1b7-37e356e55e93","Type":"ContainerStarted","Data":"c716ebc98ba0a2396dfd1cab4154bec9bcca9e2eb678a919759fbb13fb2de23f"} Dec 05 16:35:25 crc kubenswrapper[4756]: I1205 16:35:25.411628 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"032af055-c0a6-4fda-90c8-c34ef9278613","Type":"ContainerStarted","Data":"0d4580a5393a54bbc3f4d1d132fb4626624aacc2b800fb18a8d15e491de91652"} Dec 05 16:35:25 crc kubenswrapper[4756]: I1205 16:35:25.411945 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 16:35:25 crc kubenswrapper[4756]: E1205 16:35:25.413036 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-cb666b895-ngrkn" podUID="ddc74b94-5abc-43e7-b429-040c492b88d1" Dec 05 16:35:25 crc kubenswrapper[4756]: E1205 16:35:25.413046 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" podUID="76a2778a-54e4-480c-b234-50c6f65c5d6c" Dec 05 16:35:25 crc kubenswrapper[4756]: I1205 16:35:25.414698 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vsmjz"] Dec 05 16:35:25 crc kubenswrapper[4756]: I1205 16:35:25.532540 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kdtg5"] Dec 05 16:35:25 crc kubenswrapper[4756]: I1205 16:35:25.542950 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.8348406929999999 podStartE2EDuration="23.54292958s" podCreationTimestamp="2025-12-05 16:35:02 +0000 UTC" firstStartedPulling="2025-12-05 16:35:03.218216021 +0000 UTC m=+1163.456889533" lastFinishedPulling="2025-12-05 16:35:24.926304908 +0000 UTC m=+1185.164978420" observedRunningTime="2025-12-05 16:35:25.529104417 +0000 UTC m=+1185.767777939" watchObservedRunningTime="2025-12-05 16:35:25.54292958 +0000 UTC m=+1185.781603092" Dec 05 16:35:25 crc kubenswrapper[4756]: I1205 16:35:25.592137 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.060743 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.096257 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-crzvw" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.235712 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-dns-svc\") pod \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.235807 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2xqt\" (UniqueName: \"kubernetes.io/projected/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-kube-api-access-z2xqt\") pod \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.235925 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t8xx\" (UniqueName: \"kubernetes.io/projected/f0f8a490-c56f-4a46-85c1-b609e9be9676-kube-api-access-5t8xx\") pod \"f0f8a490-c56f-4a46-85c1-b609e9be9676\" (UID: \"f0f8a490-c56f-4a46-85c1-b609e9be9676\") " Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.236018 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f8a490-c56f-4a46-85c1-b609e9be9676-config\") pod \"f0f8a490-c56f-4a46-85c1-b609e9be9676\" (UID: \"f0f8a490-c56f-4a46-85c1-b609e9be9676\") " Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.236069 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-config\") pod \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\" (UID: \"26532a0f-2abc-4635-b4ef-b4e59fdefcbf\") " Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.236505 4756 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f8a490-c56f-4a46-85c1-b609e9be9676-config" (OuterVolumeSpecName: "config") pod "f0f8a490-c56f-4a46-85c1-b609e9be9676" (UID: "f0f8a490-c56f-4a46-85c1-b609e9be9676"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.236645 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f8a490-c56f-4a46-85c1-b609e9be9676-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.236654 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-config" (OuterVolumeSpecName: "config") pod "26532a0f-2abc-4635-b4ef-b4e59fdefcbf" (UID: "26532a0f-2abc-4635-b4ef-b4e59fdefcbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.236707 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26532a0f-2abc-4635-b4ef-b4e59fdefcbf" (UID: "26532a0f-2abc-4635-b4ef-b4e59fdefcbf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.242196 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-kube-api-access-z2xqt" (OuterVolumeSpecName: "kube-api-access-z2xqt") pod "26532a0f-2abc-4635-b4ef-b4e59fdefcbf" (UID: "26532a0f-2abc-4635-b4ef-b4e59fdefcbf"). InnerVolumeSpecName "kube-api-access-z2xqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.244184 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f8a490-c56f-4a46-85c1-b609e9be9676-kube-api-access-5t8xx" (OuterVolumeSpecName: "kube-api-access-5t8xx") pod "f0f8a490-c56f-4a46-85c1-b609e9be9676" (UID: "f0f8a490-c56f-4a46-85c1-b609e9be9676"). InnerVolumeSpecName "kube-api-access-5t8xx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.338815 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.338865 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2xqt\" (UniqueName: \"kubernetes.io/projected/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-kube-api-access-z2xqt\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.338885 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t8xx\" (UniqueName: \"kubernetes.io/projected/f0f8a490-c56f-4a46-85c1-b609e9be9676-kube-api-access-5t8xx\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.338900 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26532a0f-2abc-4635-b4ef-b4e59fdefcbf-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.424799 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kdtg5" event={"ID":"40ff87cb-4d82-49d4-a4d9-55e104c9becb","Type":"ContainerStarted","Data":"31a9a3d50797eb9461b26695c87bceb47e0963457bdf176c58f4841b6b4d2a33"} Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.425877 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.431116 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8" event={"ID":"f0f8a490-c56f-4a46-85c1-b609e9be9676","Type":"ContainerDied","Data":"425c6dab33f79daeebbdd7199c9736b1fcdc858cca75c620cc5b57d156921d99"} Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.431133 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-x6rb8" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.434435 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-crzvw" event={"ID":"26532a0f-2abc-4635-b4ef-b4e59fdefcbf","Type":"ContainerDied","Data":"d862f9294c0495e2c482ce8b4457b34974716acde9bb94d06c19d95d99ffe8e8"} Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.434532 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-crzvw" Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.436687 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f72429c-8e1e-4c2d-8417-c499c22013a9","Type":"ContainerStarted","Data":"9592012620c7f3b4bbf1f01ba218cb5ce66bbfbb0ee270d98879ada47358f432"} Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.438116 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsmjz" event={"ID":"359e87f3-8fb4-4476-a538-13c948b65885","Type":"ContainerStarted","Data":"3e70c93238337b86942115e08684235ead427eaf4b65639d07de66647a8b9cd5"} Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.514290 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-crzvw"] Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.523472 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567c455747-crzvw"] Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.540887 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-x6rb8"] Dec 05 16:35:26 crc kubenswrapper[4756]: I1205 16:35:26.550179 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-x6rb8"] Dec 05 16:35:27 crc kubenswrapper[4756]: I1205 16:35:27.447345 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"930a573c-21eb-46e3-827b-2ff167eecd4a","Type":"ContainerStarted","Data":"707ab58705624e64f1ef086e35b4a44addb664700c08627d7f41a84fc6487652"} Dec 05 16:35:28 crc kubenswrapper[4756]: I1205 16:35:28.058552 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26532a0f-2abc-4635-b4ef-b4e59fdefcbf" path="/var/lib/kubelet/pods/26532a0f-2abc-4635-b4ef-b4e59fdefcbf/volumes" Dec 05 16:35:28 crc kubenswrapper[4756]: I1205 16:35:28.058974 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f8a490-c56f-4a46-85c1-b609e9be9676" path="/var/lib/kubelet/pods/f0f8a490-c56f-4a46-85c1-b609e9be9676/volumes" Dec 05 16:35:29 crc kubenswrapper[4756]: I1205 16:35:29.471385 4756 generic.go:334] "Generic (PLEG): container finished" podID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerID="906970e8d880890b549aab00084b13dfefdd6a84f0b2cc8df5e80914b5a2230a" exitCode=0 Dec 05 16:35:29 crc kubenswrapper[4756]: I1205 16:35:29.472488 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kdtg5" event={"ID":"40ff87cb-4d82-49d4-a4d9-55e104c9becb","Type":"ContainerDied","Data":"906970e8d880890b549aab00084b13dfefdd6a84f0b2cc8df5e80914b5a2230a"} Dec 05 16:35:29 crc kubenswrapper[4756]: I1205 16:35:29.474301 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"930a573c-21eb-46e3-827b-2ff167eecd4a","Type":"ContainerStarted","Data":"6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285"} Dec 05 16:35:29 crc kubenswrapper[4756]: I1205 16:35:29.476034 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f72429c-8e1e-4c2d-8417-c499c22013a9","Type":"ContainerStarted","Data":"e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea"} Dec 05 16:35:29 crc kubenswrapper[4756]: I1205 16:35:29.477142 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsmjz" 
event={"ID":"359e87f3-8fb4-4476-a538-13c948b65885","Type":"ContainerStarted","Data":"ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272"} Dec 05 16:35:29 crc kubenswrapper[4756]: I1205 16:35:29.477694 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vsmjz" Dec 05 16:35:29 crc kubenswrapper[4756]: I1205 16:35:29.510247 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vsmjz" podStartSLOduration=19.054021477 podStartE2EDuration="22.510226523s" podCreationTimestamp="2025-12-05 16:35:07 +0000 UTC" firstStartedPulling="2025-12-05 16:35:25.427906408 +0000 UTC m=+1185.666579920" lastFinishedPulling="2025-12-05 16:35:28.884111434 +0000 UTC m=+1189.122784966" observedRunningTime="2025-12-05 16:35:29.506340598 +0000 UTC m=+1189.745014120" watchObservedRunningTime="2025-12-05 16:35:29.510226523 +0000 UTC m=+1189.748900035" Dec 05 16:35:30 crc kubenswrapper[4756]: I1205 16:35:30.487984 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kdtg5" event={"ID":"40ff87cb-4d82-49d4-a4d9-55e104c9becb","Type":"ContainerStarted","Data":"15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3"} Dec 05 16:35:30 crc kubenswrapper[4756]: I1205 16:35:30.488718 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kdtg5" event={"ID":"40ff87cb-4d82-49d4-a4d9-55e104c9becb","Type":"ContainerStarted","Data":"664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b"} Dec 05 16:35:30 crc kubenswrapper[4756]: I1205 16:35:30.488767 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:30 crc kubenswrapper[4756]: I1205 16:35:30.488789 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:35:30 crc kubenswrapper[4756]: I1205 16:35:30.490523 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d091b847-2b31-492a-a1b7-37e356e55e93","Type":"ContainerStarted","Data":"9b8651f84999e5733fdeed1ce337b453a91be8d32a73815f7e1a831bbb5c01e7"} Dec 05 16:35:30 crc kubenswrapper[4756]: I1205 16:35:30.490683 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 16:35:30 crc kubenswrapper[4756]: I1205 16:35:30.512516 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-kdtg5" podStartSLOduration=20.310682242 podStartE2EDuration="23.512496197s" podCreationTimestamp="2025-12-05 16:35:07 +0000 UTC" firstStartedPulling="2025-12-05 16:35:25.638728124 +0000 UTC m=+1185.877401636" lastFinishedPulling="2025-12-05 16:35:28.840542069 +0000 UTC m=+1189.079215591" observedRunningTime="2025-12-05 16:35:30.507661616 +0000 UTC m=+1190.746335148" watchObservedRunningTime="2025-12-05 16:35:30.512496197 +0000 UTC m=+1190.751169699" Dec 05 16:35:30 crc kubenswrapper[4756]: I1205 16:35:30.536600 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.787101596 podStartE2EDuration="27.536578016s" podCreationTimestamp="2025-12-05 16:35:03 +0000 UTC" firstStartedPulling="2025-12-05 16:35:24.92082157 +0000 UTC m=+1185.159495082" lastFinishedPulling="2025-12-05 16:35:29.67029799 +0000 UTC m=+1189.908971502" observedRunningTime="2025-12-05 16:35:30.529550467 +0000 UTC m=+1190.768223979" 
watchObservedRunningTime="2025-12-05 16:35:30.536578016 +0000 UTC m=+1190.775251528" Dec 05 16:35:32 crc kubenswrapper[4756]: I1205 16:35:32.532832 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f72429c-8e1e-4c2d-8417-c499c22013a9","Type":"ContainerStarted","Data":"85456f1848243765e2d473f760124a27191626f1650d37509a1eb62e5f3138fc"} Dec 05 16:35:32 crc kubenswrapper[4756]: I1205 16:35:32.535922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55609ecb-b28c-488e-b80f-9add6d256016","Type":"ContainerStarted","Data":"25e41d82e6c78a57098d4c59694df774a54b05fa51cbfe74ad482fe43a5c7e54"} Dec 05 16:35:32 crc kubenswrapper[4756]: I1205 16:35:32.542200 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"930a573c-21eb-46e3-827b-2ff167eecd4a","Type":"ContainerStarted","Data":"552b70f0a07dfc0a35589ca2e4b524aad06f3b481cf9fd381f8830f3e4bb7734"} Dec 05 16:35:32 crc kubenswrapper[4756]: I1205 16:35:32.544499 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"881d2b4e-3f2e-4aca-a965-2508edc92a3b","Type":"ContainerStarted","Data":"30f7d07e5b70e1f2d721b39fa20de30dcc105da791b7ef1790d9c73e0f8de457"} Dec 05 16:35:32 crc kubenswrapper[4756]: I1205 16:35:32.559077 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=20.318234774 podStartE2EDuration="26.559060851s" podCreationTimestamp="2025-12-05 16:35:06 +0000 UTC" firstStartedPulling="2025-12-05 16:35:25.75015143 +0000 UTC m=+1185.988824942" lastFinishedPulling="2025-12-05 16:35:31.990977507 +0000 UTC m=+1192.229651019" observedRunningTime="2025-12-05 16:35:32.555630558 +0000 UTC m=+1192.794304070" watchObservedRunningTime="2025-12-05 16:35:32.559060851 +0000 UTC m=+1192.797734363" Dec 05 16:35:32 crc kubenswrapper[4756]: I1205 16:35:32.616666 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 16:35:32 crc kubenswrapper[4756]: I1205 16:35:32.625232 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.189082784 podStartE2EDuration="22.625217835s" podCreationTimestamp="2025-12-05 16:35:10 +0000 UTC" firstStartedPulling="2025-12-05 16:35:26.567781225 +0000 UTC m=+1186.806454737" lastFinishedPulling="2025-12-05 16:35:32.003916276 +0000 UTC m=+1192.242589788" observedRunningTime="2025-12-05 16:35:32.61686765 +0000 UTC m=+1192.855541172" watchObservedRunningTime="2025-12-05 16:35:32.625217835 +0000 UTC m=+1192.863891347" Dec 05 16:35:32 crc kubenswrapper[4756]: I1205 16:35:32.890253 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:33 crc kubenswrapper[4756]: I1205 16:35:33.364650 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:33 crc kubenswrapper[4756]: I1205 16:35:33.889911 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:33 crc kubenswrapper[4756]: I1205 16:35:33.945480 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.255755 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-nxpgh"] Dec 05 16:35:34 
crc kubenswrapper[4756]: I1205 16:35:34.291024 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.336221 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-8szgg"] Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.338318 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.372704 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-8szgg"] Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.377335 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-config\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.377450 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gps5w\" (UniqueName: \"kubernetes.io/projected/87d8dee1-1167-4a52-ad9b-56568cf46ce0-kube-api-access-gps5w\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.377529 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-dns-svc\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.479335 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-dns-svc\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.479489 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-config\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.479564 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gps5w\" (UniqueName: \"kubernetes.io/projected/87d8dee1-1167-4a52-ad9b-56568cf46ce0-kube-api-access-gps5w\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.480887 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-dns-svc\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.481591 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-config\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.528177 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gps5w\" (UniqueName: \"kubernetes.io/projected/87d8dee1-1167-4a52-ad9b-56568cf46ce0-kube-api-access-gps5w\") pod \"dnsmasq-dns-66c567d66c-8szgg\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.560633 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11d0ad48-c987-4415-8536-b150536534a5","Type":"ContainerStarted","Data":"2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b"} Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.625464 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.663956 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.891524 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-ngrkn"] Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.892590 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.926274 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64c66c9fb5-w2jjs"] Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.927661 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.931859 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.940007 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64c66c9fb5-w2jjs"] Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.978529 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-bg8df"] Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.979854 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.985714 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.990550 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsxfz\" (UniqueName: \"kubernetes.io/projected/76a2778a-54e4-480c-b234-50c6f65c5d6c-kube-api-access-dsxfz\") pod \"76a2778a-54e4-480c-b234-50c6f65c5d6c\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.990684 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-config\") pod \"76a2778a-54e4-480c-b234-50c6f65c5d6c\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " Dec 05 16:35:34 crc kubenswrapper[4756]: I1205 16:35:34.990742 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-dns-svc\") pod \"76a2778a-54e4-480c-b234-50c6f65c5d6c\" (UID: \"76a2778a-54e4-480c-b234-50c6f65c5d6c\") " Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.995511 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-config\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.995666 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckzq\" (UniqueName: \"kubernetes.io/projected/b9642035-7395-4ead-9da4-4d01f497a80b-kube-api-access-gckzq\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.995807 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-dns-svc\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.995943 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovn-rundir\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.995973 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.996013 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-combined-ca-bundle\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.996090 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwpnj\" (UniqueName: \"kubernetes.io/projected/c32a5e89-f9d8-4396-9dac-070cda9077c2-kube-api-access-nwpnj\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.996125 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9642035-7395-4ead-9da4-4d01f497a80b-config\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.996161 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-ovsdbserver-sb\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:34.996213 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovs-rundir\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.029392 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-config" (OuterVolumeSpecName: "config") pod "76a2778a-54e4-480c-b234-50c6f65c5d6c" (UID: "76a2778a-54e4-480c-b234-50c6f65c5d6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.034749 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "76a2778a-54e4-480c-b234-50c6f65c5d6c" (UID: "76a2778a-54e4-480c-b234-50c6f65c5d6c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.036156 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a2778a-54e4-480c-b234-50c6f65c5d6c-kube-api-access-dsxfz" (OuterVolumeSpecName: "kube-api-access-dsxfz") pod "76a2778a-54e4-480c-b234-50c6f65c5d6c" (UID: "76a2778a-54e4-480c-b234-50c6f65c5d6c"). InnerVolumeSpecName "kube-api-access-dsxfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.062453 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bg8df"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.097538 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-dns-svc\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.097921 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovn-rundir\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.097949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.097976 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-combined-ca-bundle\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098059 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwpnj\" (UniqueName: \"kubernetes.io/projected/c32a5e89-f9d8-4396-9dac-070cda9077c2-kube-api-access-nwpnj\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098085 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9642035-7395-4ead-9da4-4d01f497a80b-config\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098131 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-ovsdbserver-sb\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098168 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovs-rundir\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098205 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-config\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098271 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckzq\" (UniqueName: \"kubernetes.io/projected/b9642035-7395-4ead-9da4-4d01f497a80b-kube-api-access-gckzq\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098371 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsxfz\" (UniqueName: \"kubernetes.io/projected/76a2778a-54e4-480c-b234-50c6f65c5d6c-kube-api-access-dsxfz\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098387 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.098398 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a2778a-54e4-480c-b234-50c6f65c5d6c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.099438 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9642035-7395-4ead-9da4-4d01f497a80b-config\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.099707 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovn-rundir\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.099783 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovs-rundir\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.100178 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-ovsdbserver-sb\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.101808 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-config\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.102037 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-dns-svc\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: 
\"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.102765 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-combined-ca-bundle\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.106930 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.122982 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckzq\" (UniqueName: \"kubernetes.io/projected/b9642035-7395-4ead-9da4-4d01f497a80b-kube-api-access-gckzq\") pod \"ovn-controller-metrics-bg8df\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.125917 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwpnj\" (UniqueName: \"kubernetes.io/projected/c32a5e89-f9d8-4396-9dac-070cda9077c2-kube-api-access-nwpnj\") pod \"dnsmasq-dns-64c66c9fb5-w2jjs\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.194629 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c66c9fb5-w2jjs"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.195275 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.236502 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-z4k4k"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.237766 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.244806 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.257434 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-z4k4k"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.308303 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.308372 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.308550 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgzm7\" (UniqueName: \"kubernetes.io/projected/2884268a-caae-472c-9f24-099e6f193487-kube-api-access-zgzm7\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.308577 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.308625 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-config\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.308686 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: W1205 16:35:35.308807 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87d8dee1_1167_4a52_ad9b_56568cf46ce0.slice/crio-6d88cc6e074c50cd7291e8ae58ad00089707a32d77be56ec7d21e730ea41219b WatchSource:0}: Error finding container 6d88cc6e074c50cd7291e8ae58ad00089707a32d77be56ec7d21e730ea41219b: Status 404 returned error can't find the container with id 6d88cc6e074c50cd7291e8ae58ad00089707a32d77be56ec7d21e730ea41219b Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.315130 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-8szgg"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.367942 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.398903 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.410138 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddc74b94-5abc-43e7-b429-040c492b88d1" (UID: "ddc74b94-5abc-43e7-b429-040c492b88d1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.409393 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-dns-svc\") pod \"ddc74b94-5abc-43e7-b429-040c492b88d1\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.410723 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk8rj\" (UniqueName: \"kubernetes.io/projected/ddc74b94-5abc-43e7-b429-040c492b88d1-kube-api-access-mk8rj\") pod \"ddc74b94-5abc-43e7-b429-040c492b88d1\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.410784 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-config\") pod \"ddc74b94-5abc-43e7-b429-040c492b88d1\" (UID: \"ddc74b94-5abc-43e7-b429-040c492b88d1\") " Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.411011 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.411087 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgzm7\" (UniqueName: \"kubernetes.io/projected/2884268a-caae-472c-9f24-099e6f193487-kube-api-access-zgzm7\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.411105 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.411151 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-config\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.411229 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.411267 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.411348 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-config" (OuterVolumeSpecName: "config") pod 
"ddc74b94-5abc-43e7-b429-040c492b88d1" (UID: "ddc74b94-5abc-43e7-b429-040c492b88d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.412061 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.412990 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.414838 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.415091 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-config\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.415867 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc74b94-5abc-43e7-b429-040c492b88d1-kube-api-access-mk8rj" (OuterVolumeSpecName: "kube-api-access-mk8rj") pod "ddc74b94-5abc-43e7-b429-040c492b88d1" (UID: "ddc74b94-5abc-43e7-b429-040c492b88d1"). InnerVolumeSpecName "kube-api-access-mk8rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.444952 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgzm7\" (UniqueName: \"kubernetes.io/projected/2884268a-caae-472c-9f24-099e6f193487-kube-api-access-zgzm7\") pod \"dnsmasq-dns-59d5fbdd8c-z4k4k\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.456127 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.504334 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.515395 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk8rj\" (UniqueName: \"kubernetes.io/projected/ddc74b94-5abc-43e7-b429-040c492b88d1-kube-api-access-mk8rj\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.515452 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc74b94-5abc-43e7-b429-040c492b88d1-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.516665 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.519178 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.519259 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5x85g" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.522105 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.522251 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.522385 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.579053 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-ngrkn" event={"ID":"ddc74b94-5abc-43e7-b429-040c492b88d1","Type":"ContainerDied","Data":"542423a393b07e7bb9c261ec4e452892de1103ce455355a702e1479548e48857"} Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.579086 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-ngrkn" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.580123 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" event={"ID":"76a2778a-54e4-480c-b234-50c6f65c5d6c","Type":"ContainerDied","Data":"fe64a6379d71aa1a8d631b4c1dd9babb16d275d5c7d143555470827638a0612b"} Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.580383 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-nxpgh" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.581672 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" event={"ID":"87d8dee1-1167-4a52-ad9b-56568cf46ce0","Type":"ContainerStarted","Data":"6d88cc6e074c50cd7291e8ae58ad00089707a32d77be56ec7d21e730ea41219b"} Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.582556 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.632409 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.712522 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-nxpgh"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.730884 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-nxpgh"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.731167 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-cache\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.731648 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-lock\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.731836 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.732055 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.732283 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-225xh\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-kube-api-access-225xh\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.762705 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-ngrkn"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.770550 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-ngrkn"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.833880 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.833959 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-225xh\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-kube-api-access-225xh\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.834007 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-cache\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.834023 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-lock\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.834055 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.834394 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: E1205 16:35:35.834453 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 16:35:35 crc kubenswrapper[4756]: E1205 16:35:35.834475 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 16:35:35 crc kubenswrapper[4756]: E1205 16:35:35.834524 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift podName:1da6dea0-ecf9-4bec-9c85-2fddcc0fae40 nodeName:}" failed. No retries permitted until 2025-12-05 16:35:36.334506691 +0000 UTC m=+1196.573180203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift") pod "swift-storage-0" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40") : configmap "swift-ring-files" not found Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.835018 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-cache\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.835041 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-lock\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.861577 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-225xh\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-kube-api-access-225xh\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.872718 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bg8df"] Dec 05 16:35:35 crc kubenswrapper[4756]: W1205 16:35:35.875300 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc32a5e89_f9d8_4396_9dac_070cda9077c2.slice/crio-2fc4c317850bfa80f47e4124d8d58242538344c9fa69870bb385afe933f7f053 WatchSource:0}: Error finding container 2fc4c317850bfa80f47e4124d8d58242538344c9fa69870bb385afe933f7f053: Status 404 returned error can't find the container with id 2fc4c317850bfa80f47e4124d8d58242538344c9fa69870bb385afe933f7f053 Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.879189 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c66c9fb5-w2jjs"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.903204 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.904455 4756 util.go:30] "No sandbox for pod can be found. 
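Need to start a new one" pod="openstack/ovn-northd-0"

The etc-swift failure above is retried on a doubling schedule: this attempt is deferred by 500ms, the next one (at 16:35:36 below) by 1s, and the one after (16:35:37) by 2s, until the swift-ring-files ConfigMap finally appears. The initial 500ms delay and the factor of 2 are read straight off the durationBeforeRetry values in these entries; the 2m2s cap in the sketch below is an assumed upper bound, not confirmed by anything in this log.

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond       // first durationBeforeRetry seen in the log
        maxDelay := 2*time.Minute + 2*time.Second // assumed cap on the backoff
        for attempt := 1; attempt <= 10; attempt++ {
            fmt.Printf("attempt %2d deferred by %v\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

Running it prints 500ms, 1s, 2s, 4s, ... up to the cap, which matches the spacing of the retry entries visible here as long as the ConfigMap stays missing.
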
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.906909 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-48cvg" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.907057 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.907386 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.910125 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.910142 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 16:35:35 crc kubenswrapper[4756]: I1205 16:35:35.918572 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.038174 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.038439 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-config\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.038467 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.038555 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfcfm\" (UniqueName: \"kubernetes.io/projected/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-kube-api-access-gfcfm\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.038595 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.038623 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-scripts\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.038744 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.065583 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a2778a-54e4-480c-b234-50c6f65c5d6c" path="/var/lib/kubelet/pods/76a2778a-54e4-480c-b234-50c6f65c5d6c/volumes" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.065986 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc74b94-5abc-43e7-b429-040c492b88d1" path="/var/lib/kubelet/pods/ddc74b94-5abc-43e7-b429-040c492b88d1/volumes" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.079912 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-z4k4k"] Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.144239 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.144287 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-scripts\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.144320 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.144357 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.144397 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-config\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.144475 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.144530 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfcfm\" (UniqueName: \"kubernetes.io/projected/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-kube-api-access-gfcfm\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.145522 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.145604 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-scripts\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.146303 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-config\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.148574 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.149197 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.149927 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.163752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfcfm\" (UniqueName: \"kubernetes.io/projected/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-kube-api-access-gfcfm\") pod \"ovn-northd-0\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.315698 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.347037 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:36 crc kubenswrapper[4756]: E1205 16:35:36.347349 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 16:35:36 crc kubenswrapper[4756]: E1205 16:35:36.347464 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 16:35:36 crc kubenswrapper[4756]: E1205 16:35:36.347554 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift podName:1da6dea0-ecf9-4bec-9c85-2fddcc0fae40 nodeName:}" failed. 
No retries permitted until 2025-12-05 16:35:37.347529449 +0000 UTC m=+1197.586202961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift") pod "swift-storage-0" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40") : configmap "swift-ring-files" not found Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.597099 4756 generic.go:334] "Generic (PLEG): container finished" podID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" containerID="d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b" exitCode=0 Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.597180 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" event={"ID":"87d8dee1-1167-4a52-ad9b-56568cf46ce0","Type":"ContainerDied","Data":"d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b"} Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.600916 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"331486fe-c403-4516-9568-862200ec0672","Type":"ContainerStarted","Data":"dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc"} Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.603347 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bg8df" event={"ID":"b9642035-7395-4ead-9da4-4d01f497a80b","Type":"ContainerStarted","Data":"aaa563c0b40764c0748232c6188d4ab9d2d109f170578ae13acab5b936dcc5c5"} Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.603381 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bg8df" event={"ID":"b9642035-7395-4ead-9da4-4d01f497a80b","Type":"ContainerStarted","Data":"8ca8d782a42936d354b601b8dabd7a874bafa4a7d2349554876c0e32cc42076f"} Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.606070 4756 generic.go:334] "Generic (PLEG): container finished" podID="2884268a-caae-472c-9f24-099e6f193487" containerID="9d70e6f8daf390ddf48f89c18ba283193940ed236ad407dcdbf94b751d7e4773" exitCode=0 Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.606131 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" event={"ID":"2884268a-caae-472c-9f24-099e6f193487","Type":"ContainerDied","Data":"9d70e6f8daf390ddf48f89c18ba283193940ed236ad407dcdbf94b751d7e4773"} Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.606158 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" event={"ID":"2884268a-caae-472c-9f24-099e6f193487","Type":"ContainerStarted","Data":"22355f499096d472ed616d7241e3128e628393df3d5bc465d5e1a92e49a942bf"} Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.616389 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" event={"ID":"c32a5e89-f9d8-4396-9dac-070cda9077c2","Type":"ContainerStarted","Data":"2fc4c317850bfa80f47e4124d8d58242538344c9fa69870bb385afe933f7f053"} Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.640925 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bg8df" podStartSLOduration=2.640906453 podStartE2EDuration="2.640906453s" podCreationTimestamp="2025-12-05 16:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:35:36.637576313 +0000 UTC m=+1196.876249835" 
watchObservedRunningTime="2025-12-05 16:35:36.640906453 +0000 UTC m=+1196.879579965" Dec 05 16:35:36 crc kubenswrapper[4756]: I1205 16:35:36.756688 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.366096 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:37 crc kubenswrapper[4756]: E1205 16:35:37.366266 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 16:35:37 crc kubenswrapper[4756]: E1205 16:35:37.366678 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 16:35:37 crc kubenswrapper[4756]: E1205 16:35:37.366740 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift podName:1da6dea0-ecf9-4bec-9c85-2fddcc0fae40 nodeName:}" failed. No retries permitted until 2025-12-05 16:35:39.366717011 +0000 UTC m=+1199.605390533 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift") pod "swift-storage-0" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40") : configmap "swift-ring-files" not found Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.623398 4756 generic.go:334] "Generic (PLEG): container finished" podID="55609ecb-b28c-488e-b80f-9add6d256016" containerID="25e41d82e6c78a57098d4c59694df774a54b05fa51cbfe74ad482fe43a5c7e54" exitCode=0 Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.623491 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55609ecb-b28c-488e-b80f-9add6d256016","Type":"ContainerDied","Data":"25e41d82e6c78a57098d4c59694df774a54b05fa51cbfe74ad482fe43a5c7e54"} Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.624681 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683","Type":"ContainerStarted","Data":"6347d4e49846db798b8a977c98253fae53ae88bcb853fe4e28c7333747302452"} Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.626930 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" event={"ID":"87d8dee1-1167-4a52-ad9b-56568cf46ce0","Type":"ContainerStarted","Data":"b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985"} Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.627128 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.630046 4756 generic.go:334] "Generic (PLEG): container finished" podID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" containerID="30f7d07e5b70e1f2d721b39fa20de30dcc105da791b7ef1790d9c73e0f8de457" exitCode=0 Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.630126 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"881d2b4e-3f2e-4aca-a965-2508edc92a3b","Type":"ContainerDied","Data":"30f7d07e5b70e1f2d721b39fa20de30dcc105da791b7ef1790d9c73e0f8de457"} Dec 05 16:35:37 crc kubenswrapper[4756]: 
I1205 16:35:37.634082 4756 generic.go:334] "Generic (PLEG): container finished" podID="c32a5e89-f9d8-4396-9dac-070cda9077c2" containerID="53eef36959b9221419fd7b9fff744a0201a7048e7b2235f7e0604df9eecf20f1" exitCode=0 Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.634154 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" event={"ID":"c32a5e89-f9d8-4396-9dac-070cda9077c2","Type":"ContainerDied","Data":"53eef36959b9221419fd7b9fff744a0201a7048e7b2235f7e0604df9eecf20f1"} Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.643745 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" event={"ID":"2884268a-caae-472c-9f24-099e6f193487","Type":"ContainerStarted","Data":"f77c26edd248b56bc6b66d0b3bb3c38bd10c3ea2fb1c683e21567cba882d54f9"} Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.643796 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.688262 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" podStartSLOduration=2.6882380919999997 podStartE2EDuration="2.688238092s" podCreationTimestamp="2025-12-05 16:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:35:37.664384729 +0000 UTC m=+1197.903058261" watchObservedRunningTime="2025-12-05 16:35:37.688238092 +0000 UTC m=+1197.926911604" Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.752944 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" podStartSLOduration=3.04975633 podStartE2EDuration="3.752922247s" podCreationTimestamp="2025-12-05 16:35:34 +0000 UTC" firstStartedPulling="2025-12-05 16:35:35.318867112 +0000 UTC m=+1195.557540644" lastFinishedPulling="2025-12-05 16:35:36.022033049 +0000 UTC m=+1196.260706561" observedRunningTime="2025-12-05 16:35:37.72558899 +0000 UTC m=+1197.964262522" watchObservedRunningTime="2025-12-05 16:35:37.752922247 +0000 UTC m=+1197.991595759" Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.923175 4756 util.go:48] "No ready sandbox for pod can be found. 
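Need to start a new one" pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs"

The two "Observed pod startup duration" entries above are internally consistent with podStartSLOduration being the end-to-end startup time minus the image-pull window: for dnsmasq-dns-66c567d66c-8szgg, 3.752922247s minus (16:35:36.022033049 less 16:35:35.318867112) reproduces the logged 3.04975633 to within tens of nanoseconds, and for dnsmasq-dns-59d5fbdd8c-z4k4k, where the pull timestamps are zero-valued, the two durations coincide. A small Go check of that arithmetic, using the timestamps exactly as logged (the SLO-minus-pull interpretation is inferred from these entries, not from kubelet documentation):

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-12-05 16:35:34 +0000 UTC")             // podCreationTimestamp
        firstPull := mustParse("2025-12-05 16:35:35.318867112 +0000 UTC") // firstStartedPulling
        lastPull := mustParse("2025-12-05 16:35:36.022033049 +0000 UTC")  // lastFinishedPulling
        running := mustParse("2025-12-05 16:35:37.752922247 +0000 UTC")   // watchObservedRunningTime

        e2e := running.Sub(created)          // matches podStartE2EDuration=3.752922247s
        slo := e2e - lastPull.Sub(firstPull) // E2E minus time spent pulling images
        fmt.Println("E2E:", e2e, "SLO (E2E minus pull):", slo)
    }

The tiny residual difference from the logged SLO value suggests the tracker snapshots slightly different instants internally than the ones it prints.
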
Need to start a new one" pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.978893 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwpnj\" (UniqueName: \"kubernetes.io/projected/c32a5e89-f9d8-4396-9dac-070cda9077c2-kube-api-access-nwpnj\") pod \"c32a5e89-f9d8-4396-9dac-070cda9077c2\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.978991 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-ovsdbserver-sb\") pod \"c32a5e89-f9d8-4396-9dac-070cda9077c2\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.979081 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-config\") pod \"c32a5e89-f9d8-4396-9dac-070cda9077c2\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.979157 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-dns-svc\") pod \"c32a5e89-f9d8-4396-9dac-070cda9077c2\" (UID: \"c32a5e89-f9d8-4396-9dac-070cda9077c2\") " Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.983322 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32a5e89-f9d8-4396-9dac-070cda9077c2-kube-api-access-nwpnj" (OuterVolumeSpecName: "kube-api-access-nwpnj") pod "c32a5e89-f9d8-4396-9dac-070cda9077c2" (UID: "c32a5e89-f9d8-4396-9dac-070cda9077c2"). InnerVolumeSpecName "kube-api-access-nwpnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:37 crc kubenswrapper[4756]: I1205 16:35:37.997857 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c32a5e89-f9d8-4396-9dac-070cda9077c2" (UID: "c32a5e89-f9d8-4396-9dac-070cda9077c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.000064 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-config" (OuterVolumeSpecName: "config") pod "c32a5e89-f9d8-4396-9dac-070cda9077c2" (UID: "c32a5e89-f9d8-4396-9dac-070cda9077c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.001195 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c32a5e89-f9d8-4396-9dac-070cda9077c2" (UID: "c32a5e89-f9d8-4396-9dac-070cda9077c2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.081743 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.081803 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwpnj\" (UniqueName: \"kubernetes.io/projected/c32a5e89-f9d8-4396-9dac-070cda9077c2-kube-api-access-nwpnj\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.081820 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.081833 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c32a5e89-f9d8-4396-9dac-070cda9077c2-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.653485 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" event={"ID":"c32a5e89-f9d8-4396-9dac-070cda9077c2","Type":"ContainerDied","Data":"2fc4c317850bfa80f47e4124d8d58242538344c9fa69870bb385afe933f7f053"} Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.653537 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c66c9fb5-w2jjs" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.653765 4756 scope.go:117] "RemoveContainer" containerID="53eef36959b9221419fd7b9fff744a0201a7048e7b2235f7e0604df9eecf20f1" Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.656686 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55609ecb-b28c-488e-b80f-9add6d256016","Type":"ContainerStarted","Data":"158cecb488ef422eac05d2c4948de5a6dfc1b4ec80df672c0fc11d44da5db969"} Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.696623 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c66c9fb5-w2jjs"] Dec 05 16:35:38 crc kubenswrapper[4756]: I1205 16:35:38.710904 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64c66c9fb5-w2jjs"] Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.378271 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-z8mgx"] Dec 05 16:35:39 crc kubenswrapper[4756]: E1205 16:35:39.378993 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32a5e89-f9d8-4396-9dac-070cda9077c2" containerName="init" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.379092 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32a5e89-f9d8-4396-9dac-070cda9077c2" containerName="init" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.379398 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32a5e89-f9d8-4396-9dac-070cda9077c2" containerName="init" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.380208 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.382378 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.382632 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.384056 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.395814 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-z8mgx"] Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.403447 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:39 crc kubenswrapper[4756]: E1205 16:35:39.403631 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 16:35:39 crc kubenswrapper[4756]: E1205 16:35:39.403659 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 16:35:39 crc kubenswrapper[4756]: E1205 16:35:39.403722 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift podName:1da6dea0-ecf9-4bec-9c85-2fddcc0fae40 nodeName:}" failed. No retries permitted until 2025-12-05 16:35:43.403701195 +0000 UTC m=+1203.642374707 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift") pod "swift-storage-0" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40") : configmap "swift-ring-files" not found Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.504916 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0bd0441-24bc-48f4-aeba-006c3b77419e-etc-swift\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.504973 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-swiftconf\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.505023 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9d75\" (UniqueName: \"kubernetes.io/projected/d0bd0441-24bc-48f4-aeba-006c3b77419e-kube-api-access-m9d75\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.505051 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-combined-ca-bundle\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.505081 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-scripts\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.505103 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-ring-data-devices\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.505268 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-dispersionconf\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.607206 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0bd0441-24bc-48f4-aeba-006c3b77419e-etc-swift\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.607275 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-swiftconf\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.607347 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9d75\" (UniqueName: \"kubernetes.io/projected/d0bd0441-24bc-48f4-aeba-006c3b77419e-kube-api-access-m9d75\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.607382 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-combined-ca-bundle\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.607434 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-scripts\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.607465 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-ring-data-devices\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.607524 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-dispersionconf\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.608381 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0bd0441-24bc-48f4-aeba-006c3b77419e-etc-swift\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.608481 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-ring-data-devices\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.608568 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-scripts\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.611578 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-dispersionconf\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.618124 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-combined-ca-bundle\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.618444 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-swiftconf\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.625893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9d75\" (UniqueName: \"kubernetes.io/projected/d0bd0441-24bc-48f4-aeba-006c3b77419e-kube-api-access-m9d75\") pod \"swift-ring-rebalance-z8mgx\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:39 crc kubenswrapper[4756]: I1205 16:35:39.697573 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.073507 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32a5e89-f9d8-4396-9dac-070cda9077c2" path="/var/lib/kubelet/pods/c32a5e89-f9d8-4396-9dac-070cda9077c2/volumes" Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.146799 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-z8mgx"] Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.674672 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683","Type":"ContainerStarted","Data":"1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444"} Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.678322 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-z8mgx" event={"ID":"d0bd0441-24bc-48f4-aeba-006c3b77419e","Type":"ContainerStarted","Data":"4bed77e0ad6f3673f233829ac5ae21886ffe13499605c74d69d02342fe421f24"} Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.680116 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"881d2b4e-3f2e-4aca-a965-2508edc92a3b","Type":"ContainerStarted","Data":"41db2bc934dd1d8f5c3deaf39234aa1d852019a880bf3e665b6803cc5b7d2923"} Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.704687 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371995.150112 podStartE2EDuration="41.704663867s" podCreationTimestamp="2025-12-05 16:34:59 +0000 UTC" firstStartedPulling="2025-12-05 16:35:01.590732272 +0000 UTC m=+1161.829405784" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:35:40.698453039 +0000 UTC m=+1200.937126571" watchObservedRunningTime="2025-12-05 16:35:40.704663867 +0000 UTC m=+1200.943337389" Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.718782 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=11.478641034 podStartE2EDuration="40.718761737s" podCreationTimestamp="2025-12-05 16:35:00 +0000 UTC" firstStartedPulling="2025-12-05 16:35:02.750880405 +0000 UTC m=+1162.989553917" lastFinishedPulling="2025-12-05 16:35:31.991001108 +0000 UTC m=+1192.229674620" observedRunningTime="2025-12-05 16:35:40.713708201 +0000 UTC m=+1200.952381743" watchObservedRunningTime="2025-12-05 16:35:40.718761737 +0000 UTC m=+1200.957435249" Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.932101 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 16:35:40 crc kubenswrapper[4756]: I1205 16:35:40.932193 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 16:35:41 crc kubenswrapper[4756]: I1205 16:35:41.698364 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683","Type":"ContainerStarted","Data":"3d84600ac906281768c60200071669b5bea31ab8b67b8f4f35c9c9885c5ed8d8"} Dec 05 16:35:41 crc kubenswrapper[4756]: I1205 16:35:41.698916 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 16:35:41 crc kubenswrapper[4756]: I1205 16:35:41.732848 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.265423681 podStartE2EDuration="6.732810069s" podCreationTimestamp="2025-12-05 16:35:35 +0000 UTC" firstStartedPulling="2025-12-05 16:35:36.844261038 +0000 UTC m=+1197.082934550" lastFinishedPulling="2025-12-05 16:35:40.311647426 +0000 UTC m=+1200.550320938" observedRunningTime="2025-12-05 16:35:41.719707565 +0000 UTC m=+1201.958381147" watchObservedRunningTime="2025-12-05 16:35:41.732810069 +0000 UTC m=+1201.971483651" Dec 05 16:35:42 crc kubenswrapper[4756]: I1205 16:35:42.068338 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:42 crc kubenswrapper[4756]: I1205 16:35:42.068395 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:43 crc kubenswrapper[4756]: I1205 16:35:43.487000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:43 crc kubenswrapper[4756]: E1205 16:35:43.487248 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 16:35:43 crc kubenswrapper[4756]: E1205 16:35:43.487461 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 16:35:43 crc kubenswrapper[4756]: E1205 16:35:43.487537 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift podName:1da6dea0-ecf9-4bec-9c85-2fddcc0fae40 nodeName:}" failed. No retries permitted until 2025-12-05 16:35:51.48751321 +0000 UTC m=+1211.726186722 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift") pod "swift-storage-0" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40") : configmap "swift-ring-files" not found Dec 05 16:35:43 crc kubenswrapper[4756]: I1205 16:35:43.714191 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-z8mgx" event={"ID":"d0bd0441-24bc-48f4-aeba-006c3b77419e","Type":"ContainerStarted","Data":"2371130b7e39c0f952bc7b9336044db2e136ebedd7f9a015ed4eb6b6db00bd32"} Dec 05 16:35:43 crc kubenswrapper[4756]: I1205 16:35:43.736656 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-z8mgx" podStartSLOduration=1.9859082030000001 podStartE2EDuration="4.73663159s" podCreationTimestamp="2025-12-05 16:35:39 +0000 UTC" firstStartedPulling="2025-12-05 16:35:40.257955017 +0000 UTC m=+1200.496628569" lastFinishedPulling="2025-12-05 16:35:43.008678444 +0000 UTC m=+1203.247351956" observedRunningTime="2025-12-05 16:35:43.731542092 +0000 UTC m=+1203.970215614" watchObservedRunningTime="2025-12-05 16:35:43.73663159 +0000 UTC m=+1203.975305112" Dec 05 16:35:44 crc kubenswrapper[4756]: I1205 16:35:44.665490 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:45 crc kubenswrapper[4756]: I1205 16:35:45.004776 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 16:35:45 crc kubenswrapper[4756]: I1205 16:35:45.081813 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 16:35:45 crc kubenswrapper[4756]: I1205 16:35:45.584713 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:35:45 crc kubenswrapper[4756]: I1205 16:35:45.653015 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-8szgg"] Dec 05 16:35:45 crc kubenswrapper[4756]: I1205 16:35:45.653205 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" podUID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" containerName="dnsmasq-dns" containerID="cri-o://b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985" gracePeriod=10 Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.153682 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.328245 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gps5w\" (UniqueName: \"kubernetes.io/projected/87d8dee1-1167-4a52-ad9b-56568cf46ce0-kube-api-access-gps5w\") pod \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.328673 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-dns-svc\") pod \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.328941 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-config\") pod \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\" (UID: \"87d8dee1-1167-4a52-ad9b-56568cf46ce0\") " Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.336260 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d8dee1-1167-4a52-ad9b-56568cf46ce0-kube-api-access-gps5w" (OuterVolumeSpecName: "kube-api-access-gps5w") pod "87d8dee1-1167-4a52-ad9b-56568cf46ce0" (UID: "87d8dee1-1167-4a52-ad9b-56568cf46ce0"). InnerVolumeSpecName "kube-api-access-gps5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.374200 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87d8dee1-1167-4a52-ad9b-56568cf46ce0" (UID: "87d8dee1-1167-4a52-ad9b-56568cf46ce0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.376842 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-config" (OuterVolumeSpecName: "config") pod "87d8dee1-1167-4a52-ad9b-56568cf46ce0" (UID: "87d8dee1-1167-4a52-ad9b-56568cf46ce0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.431311 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.431363 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d8dee1-1167-4a52-ad9b-56568cf46ce0-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.431374 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gps5w\" (UniqueName: \"kubernetes.io/projected/87d8dee1-1167-4a52-ad9b-56568cf46ce0-kube-api-access-gps5w\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.755559 4756 generic.go:334] "Generic (PLEG): container finished" podID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" containerID="b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985" exitCode=0 Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.755607 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" event={"ID":"87d8dee1-1167-4a52-ad9b-56568cf46ce0","Type":"ContainerDied","Data":"b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985"} Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.755640 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" event={"ID":"87d8dee1-1167-4a52-ad9b-56568cf46ce0","Type":"ContainerDied","Data":"6d88cc6e074c50cd7291e8ae58ad00089707a32d77be56ec7d21e730ea41219b"} Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.755640 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66c567d66c-8szgg" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.755660 4756 scope.go:117] "RemoveContainer" containerID="b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.812862 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-8szgg"] Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.821865 4756 scope.go:117] "RemoveContainer" containerID="d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.824293 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-8szgg"] Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.840482 4756 scope.go:117] "RemoveContainer" containerID="b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985" Dec 05 16:35:46 crc kubenswrapper[4756]: E1205 16:35:46.840890 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985\": container with ID starting with b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985 not found: ID does not exist" containerID="b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.840925 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985"} err="failed to get container status \"b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985\": rpc error: code = NotFound desc = could not find container \"b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985\": container with ID starting with b78f1692985a38b132e1d8fb4d0c9bf6ed67628bb8da08945d50fdd67ca17985 not found: ID does not exist" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.840947 4756 scope.go:117] "RemoveContainer" containerID="d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b" Dec 05 16:35:46 crc kubenswrapper[4756]: E1205 16:35:46.841273 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b\": container with ID starting with d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b not found: ID does not exist" containerID="d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b" Dec 05 16:35:46 crc kubenswrapper[4756]: I1205 16:35:46.841342 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b"} err="failed to get container status \"d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b\": rpc error: code = NotFound desc = could not find container \"d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b\": container with ID starting with d11d91e03c0cf85317bf483216b8c0f74a029e2d9cb802bf7d7407056ffd093b not found: ID does not exist" Dec 05 16:35:48 crc kubenswrapper[4756]: I1205 16:35:48.064785 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" path="/var/lib/kubelet/pods/87d8dee1-1167-4a52-ad9b-56568cf46ce0/volumes" Dec 05 16:35:48 crc kubenswrapper[4756]: I1205 16:35:48.151568 4756 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:48 crc kubenswrapper[4756]: I1205 16:35:48.232690 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 16:35:51 crc kubenswrapper[4756]: I1205 16:35:51.406710 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 16:35:51 crc kubenswrapper[4756]: I1205 16:35:51.533624 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:35:51 crc kubenswrapper[4756]: E1205 16:35:51.533838 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 16:35:51 crc kubenswrapper[4756]: E1205 16:35:51.533870 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 16:35:51 crc kubenswrapper[4756]: E1205 16:35:51.533946 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift podName:1da6dea0-ecf9-4bec-9c85-2fddcc0fae40 nodeName:}" failed. No retries permitted until 2025-12-05 16:36:07.533917512 +0000 UTC m=+1227.772591044 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift") pod "swift-storage-0" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40") : configmap "swift-ring-files" not found Dec 05 16:35:51 crc kubenswrapper[4756]: I1205 16:35:51.810025 4756 generic.go:334] "Generic (PLEG): container finished" podID="d0bd0441-24bc-48f4-aeba-006c3b77419e" containerID="2371130b7e39c0f952bc7b9336044db2e136ebedd7f9a015ed4eb6b6db00bd32" exitCode=0 Dec 05 16:35:51 crc kubenswrapper[4756]: I1205 16:35:51.810066 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-z8mgx" event={"ID":"d0bd0441-24bc-48f4-aeba-006c3b77419e","Type":"ContainerDied","Data":"2371130b7e39c0f952bc7b9336044db2e136ebedd7f9a015ed4eb6b6db00bd32"} Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.276471 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0c76-account-create-update-st5sb"] Dec 05 16:35:52 crc kubenswrapper[4756]: E1205 16:35:52.276840 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" containerName="dnsmasq-dns" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.276861 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" containerName="dnsmasq-dns" Dec 05 16:35:52 crc kubenswrapper[4756]: E1205 16:35:52.276898 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" containerName="init" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.276907 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" containerName="init" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.277125 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="87d8dee1-1167-4a52-ad9b-56568cf46ce0" containerName="dnsmasq-dns" Dec 05 16:35:52 crc 
kubenswrapper[4756]: I1205 16:35:52.277763 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.281847 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.291904 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0c76-account-create-update-st5sb"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.294909 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.294965 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.301948 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-6hl4j"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.303236 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.354750 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6hl4j"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.448874 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kljj2\" (UniqueName: \"kubernetes.io/projected/45e502c2-2f9c-4390-b3ea-ed099cd2111b-kube-api-access-kljj2\") pod \"keystone-db-create-6hl4j\" (UID: \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\") " pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.448925 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpsbm\" (UniqueName: \"kubernetes.io/projected/0de45488-2c4b-408e-810b-896d2db706d8-kube-api-access-gpsbm\") pod \"keystone-0c76-account-create-update-st5sb\" (UID: \"0de45488-2c4b-408e-810b-896d2db706d8\") " pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.449214 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de45488-2c4b-408e-810b-896d2db706d8-operator-scripts\") pod \"keystone-0c76-account-create-update-st5sb\" (UID: \"0de45488-2c4b-408e-810b-896d2db706d8\") " pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.449263 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45e502c2-2f9c-4390-b3ea-ed099cd2111b-operator-scripts\") pod \"keystone-db-create-6hl4j\" (UID: \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\") " pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.496343 4756 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/placement-db-create-qwgf6"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.497491 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.506162 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qwgf6"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.552579 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kljj2\" (UniqueName: \"kubernetes.io/projected/45e502c2-2f9c-4390-b3ea-ed099cd2111b-kube-api-access-kljj2\") pod \"keystone-db-create-6hl4j\" (UID: \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\") " pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.552623 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpsbm\" (UniqueName: \"kubernetes.io/projected/0de45488-2c4b-408e-810b-896d2db706d8-kube-api-access-gpsbm\") pod \"keystone-0c76-account-create-update-st5sb\" (UID: \"0de45488-2c4b-408e-810b-896d2db706d8\") " pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.552764 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45e502c2-2f9c-4390-b3ea-ed099cd2111b-operator-scripts\") pod \"keystone-db-create-6hl4j\" (UID: \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\") " pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.552786 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de45488-2c4b-408e-810b-896d2db706d8-operator-scripts\") pod \"keystone-0c76-account-create-update-st5sb\" (UID: \"0de45488-2c4b-408e-810b-896d2db706d8\") " pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.553570 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45e502c2-2f9c-4390-b3ea-ed099cd2111b-operator-scripts\") pod \"keystone-db-create-6hl4j\" (UID: \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\") " pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.554087 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de45488-2c4b-408e-810b-896d2db706d8-operator-scripts\") pod \"keystone-0c76-account-create-update-st5sb\" (UID: \"0de45488-2c4b-408e-810b-896d2db706d8\") " pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.554800 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5c6b-account-create-update-kb6fq"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.555805 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.557876 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.565208 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c6b-account-create-update-kb6fq"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.583532 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kljj2\" (UniqueName: \"kubernetes.io/projected/45e502c2-2f9c-4390-b3ea-ed099cd2111b-kube-api-access-kljj2\") pod \"keystone-db-create-6hl4j\" (UID: \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\") " pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.586323 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpsbm\" (UniqueName: \"kubernetes.io/projected/0de45488-2c4b-408e-810b-896d2db706d8-kube-api-access-gpsbm\") pod \"keystone-0c76-account-create-update-st5sb\" (UID: \"0de45488-2c4b-408e-810b-896d2db706d8\") " pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.594315 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.620392 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.655283 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkrhz\" (UniqueName: \"kubernetes.io/projected/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-kube-api-access-gkrhz\") pod \"placement-5c6b-account-create-update-kb6fq\" (UID: \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\") " pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.655629 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-operator-scripts\") pod \"placement-db-create-qwgf6\" (UID: \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\") " pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.655754 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5rbs\" (UniqueName: \"kubernetes.io/projected/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-kube-api-access-z5rbs\") pod \"placement-db-create-qwgf6\" (UID: \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\") " pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.655952 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-operator-scripts\") pod \"placement-5c6b-account-create-update-kb6fq\" (UID: \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\") " pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.764454 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-operator-scripts\") pod \"placement-5c6b-account-create-update-kb6fq\" (UID: \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\") " pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.764678 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkrhz\" (UniqueName: \"kubernetes.io/projected/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-kube-api-access-gkrhz\") pod \"placement-5c6b-account-create-update-kb6fq\" (UID: \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\") " pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.764895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-operator-scripts\") pod \"placement-db-create-qwgf6\" (UID: \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\") " pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.764959 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5rbs\" (UniqueName: \"kubernetes.io/projected/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-kube-api-access-z5rbs\") pod \"placement-db-create-qwgf6\" (UID: \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\") " pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.765651 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-operator-scripts\") pod \"placement-5c6b-account-create-update-kb6fq\" (UID: \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\") " pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.765865 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-operator-scripts\") pod \"placement-db-create-qwgf6\" (UID: \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\") " pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.770516 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ts8rq"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.772830 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.779646 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ts8rq"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.792120 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkrhz\" (UniqueName: \"kubernetes.io/projected/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-kube-api-access-gkrhz\") pod \"placement-5c6b-account-create-update-kb6fq\" (UID: \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\") " pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.792436 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5rbs\" (UniqueName: \"kubernetes.io/projected/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-kube-api-access-z5rbs\") pod \"placement-db-create-qwgf6\" (UID: \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\") " pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.815342 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.866651 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13540dc-b5e4-4449-9abb-160846406121-operator-scripts\") pod \"glance-db-create-ts8rq\" (UID: \"f13540dc-b5e4-4449-9abb-160846406121\") " pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.866719 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcdqz\" (UniqueName: \"kubernetes.io/projected/f13540dc-b5e4-4449-9abb-160846406121-kube-api-access-qcdqz\") pod \"glance-db-create-ts8rq\" (UID: \"f13540dc-b5e4-4449-9abb-160846406121\") " pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.875758 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.896892 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1a2d-account-create-update-5jzww"] Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.898183 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.903429 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 16:35:52 crc kubenswrapper[4756]: I1205 16:35:52.905243 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1a2d-account-create-update-5jzww"] Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.012132 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13540dc-b5e4-4449-9abb-160846406121-operator-scripts\") pod \"glance-db-create-ts8rq\" (UID: \"f13540dc-b5e4-4449-9abb-160846406121\") " pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.013234 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13540dc-b5e4-4449-9abb-160846406121-operator-scripts\") pod \"glance-db-create-ts8rq\" (UID: \"f13540dc-b5e4-4449-9abb-160846406121\") " pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.014481 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcdqz\" (UniqueName: \"kubernetes.io/projected/f13540dc-b5e4-4449-9abb-160846406121-kube-api-access-qcdqz\") pod \"glance-db-create-ts8rq\" (UID: \"f13540dc-b5e4-4449-9abb-160846406121\") " pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.036028 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcdqz\" (UniqueName: \"kubernetes.io/projected/f13540dc-b5e4-4449-9abb-160846406121-kube-api-access-qcdqz\") pod \"glance-db-create-ts8rq\" (UID: \"f13540dc-b5e4-4449-9abb-160846406121\") " pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.078334 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0c76-account-create-update-st5sb"] Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.118100 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpj68\" (UniqueName: \"kubernetes.io/projected/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-kube-api-access-wpj68\") pod \"glance-1a2d-account-create-update-5jzww\" (UID: \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\") " pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.118178 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-operator-scripts\") pod \"glance-1a2d-account-create-update-5jzww\" (UID: \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\") " pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.153036 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6hl4j"] Dec 05 16:35:53 crc kubenswrapper[4756]: W1205 16:35:53.154982 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e502c2_2f9c_4390_b3ea_ed099cd2111b.slice/crio-526ac285eb4d3604b1c95c5a019d5569dbab26b33f5a5a398c6f978829ca45ab WatchSource:0}: Error finding container 
526ac285eb4d3604b1c95c5a019d5569dbab26b33f5a5a398c6f978829ca45ab: Status 404 returned error can't find the container with id 526ac285eb4d3604b1c95c5a019d5569dbab26b33f5a5a398c6f978829ca45ab Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.173190 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.184124 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.218763 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0bd0441-24bc-48f4-aeba-006c3b77419e-etc-swift\") pod \"d0bd0441-24bc-48f4-aeba-006c3b77419e\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.218874 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-ring-data-devices\") pod \"d0bd0441-24bc-48f4-aeba-006c3b77419e\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.218920 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-combined-ca-bundle\") pod \"d0bd0441-24bc-48f4-aeba-006c3b77419e\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.218959 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-dispersionconf\") pod \"d0bd0441-24bc-48f4-aeba-006c3b77419e\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.219024 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-swiftconf\") pod \"d0bd0441-24bc-48f4-aeba-006c3b77419e\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.219063 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9d75\" (UniqueName: \"kubernetes.io/projected/d0bd0441-24bc-48f4-aeba-006c3b77419e-kube-api-access-m9d75\") pod \"d0bd0441-24bc-48f4-aeba-006c3b77419e\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.219084 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-scripts\") pod \"d0bd0441-24bc-48f4-aeba-006c3b77419e\" (UID: \"d0bd0441-24bc-48f4-aeba-006c3b77419e\") " Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.219300 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpj68\" (UniqueName: \"kubernetes.io/projected/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-kube-api-access-wpj68\") pod \"glance-1a2d-account-create-update-5jzww\" (UID: \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\") " pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.219347 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-operator-scripts\") pod \"glance-1a2d-account-create-update-5jzww\" (UID: \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\") " pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.220131 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-operator-scripts\") pod \"glance-1a2d-account-create-update-5jzww\" (UID: \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\") " pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.220190 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0bd0441-24bc-48f4-aeba-006c3b77419e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d0bd0441-24bc-48f4-aeba-006c3b77419e" (UID: "d0bd0441-24bc-48f4-aeba-006c3b77419e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.221765 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d0bd0441-24bc-48f4-aeba-006c3b77419e" (UID: "d0bd0441-24bc-48f4-aeba-006c3b77419e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.224296 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bd0441-24bc-48f4-aeba-006c3b77419e-kube-api-access-m9d75" (OuterVolumeSpecName: "kube-api-access-m9d75") pod "d0bd0441-24bc-48f4-aeba-006c3b77419e" (UID: "d0bd0441-24bc-48f4-aeba-006c3b77419e"). InnerVolumeSpecName "kube-api-access-m9d75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.232248 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d0bd0441-24bc-48f4-aeba-006c3b77419e" (UID: "d0bd0441-24bc-48f4-aeba-006c3b77419e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.241401 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpj68\" (UniqueName: \"kubernetes.io/projected/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-kube-api-access-wpj68\") pod \"glance-1a2d-account-create-update-5jzww\" (UID: \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\") " pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.252487 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-scripts" (OuterVolumeSpecName: "scripts") pod "d0bd0441-24bc-48f4-aeba-006c3b77419e" (UID: "d0bd0441-24bc-48f4-aeba-006c3b77419e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.255075 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0bd0441-24bc-48f4-aeba-006c3b77419e" (UID: "d0bd0441-24bc-48f4-aeba-006c3b77419e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.257249 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d0bd0441-24bc-48f4-aeba-006c3b77419e" (UID: "d0bd0441-24bc-48f4-aeba-006c3b77419e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.320472 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9d75\" (UniqueName: \"kubernetes.io/projected/d0bd0441-24bc-48f4-aeba-006c3b77419e-kube-api-access-m9d75\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.320496 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.320505 4756 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0bd0441-24bc-48f4-aeba-006c3b77419e-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.320514 4756 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0bd0441-24bc-48f4-aeba-006c3b77419e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.320523 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.320533 4756 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.320541 4756 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0bd0441-24bc-48f4-aeba-006c3b77419e-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.393561 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qwgf6"] Dec 05 16:35:53 crc kubenswrapper[4756]: W1205 16:35:53.395716 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a3091dc_ae5e_4f60_9e11_a07f76cff76f.slice/crio-4a40909f83c613ea17688db1f05014e28287eafdcca4ac2f87505d1329e96ac7 WatchSource:0}: Error finding container 4a40909f83c613ea17688db1f05014e28287eafdcca4ac2f87505d1329e96ac7: Status 404 returned error can't find the container with id 4a40909f83c613ea17688db1f05014e28287eafdcca4ac2f87505d1329e96ac7 Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.479553 
4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c6b-account-create-update-kb6fq"] Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.515301 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:35:53 crc kubenswrapper[4756]: W1205 16:35:53.565981 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d04ce23_86cb_48ed_9d9b_59bb3b650da2.slice/crio-482c39b0c92d76339f75e14fee23b7c8d1d58f3be249b57e131a32a848b560f9 WatchSource:0}: Error finding container 482c39b0c92d76339f75e14fee23b7c8d1d58f3be249b57e131a32a848b560f9: Status 404 returned error can't find the container with id 482c39b0c92d76339f75e14fee23b7c8d1d58f3be249b57e131a32a848b560f9 Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.639646 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ts8rq"] Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.825920 4756 generic.go:334] "Generic (PLEG): container finished" podID="45e502c2-2f9c-4390-b3ea-ed099cd2111b" containerID="d11db3e82fa52daef7384c7f1d126a686291ca3143db5866bd59dc3cfdbcbffd" exitCode=0 Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.826001 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6hl4j" event={"ID":"45e502c2-2f9c-4390-b3ea-ed099cd2111b","Type":"ContainerDied","Data":"d11db3e82fa52daef7384c7f1d126a686291ca3143db5866bd59dc3cfdbcbffd"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.826024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6hl4j" event={"ID":"45e502c2-2f9c-4390-b3ea-ed099cd2111b","Type":"ContainerStarted","Data":"526ac285eb4d3604b1c95c5a019d5569dbab26b33f5a5a398c6f978829ca45ab"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.827981 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ts8rq" event={"ID":"f13540dc-b5e4-4449-9abb-160846406121","Type":"ContainerStarted","Data":"db2c602af5ae592e04d0c4437ed7ec9413a4df6cdb793118baacd952ab4a6512"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.828007 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ts8rq" event={"ID":"f13540dc-b5e4-4449-9abb-160846406121","Type":"ContainerStarted","Data":"749ae20512a60ee40966ab69efadc79288ce9af1877633c1c71f8b6e582054e8"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.829608 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-z8mgx" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.829618 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-z8mgx" event={"ID":"d0bd0441-24bc-48f4-aeba-006c3b77419e","Type":"ContainerDied","Data":"4bed77e0ad6f3673f233829ac5ae21886ffe13499605c74d69d02342fe421f24"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.829657 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bed77e0ad6f3673f233829ac5ae21886ffe13499605c74d69d02342fe421f24" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.830941 4756 generic.go:334] "Generic (PLEG): container finished" podID="7a3091dc-ae5e-4f60-9e11-a07f76cff76f" containerID="732c7bc429bae861433213544fb19f3e8c9cfa3b827a04b2bdee69fe3f36c4a2" exitCode=0 Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.830969 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qwgf6" event={"ID":"7a3091dc-ae5e-4f60-9e11-a07f76cff76f","Type":"ContainerDied","Data":"732c7bc429bae861433213544fb19f3e8c9cfa3b827a04b2bdee69fe3f36c4a2"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.831003 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qwgf6" event={"ID":"7a3091dc-ae5e-4f60-9e11-a07f76cff76f","Type":"ContainerStarted","Data":"4a40909f83c613ea17688db1f05014e28287eafdcca4ac2f87505d1329e96ac7"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.832339 4756 generic.go:334] "Generic (PLEG): container finished" podID="0de45488-2c4b-408e-810b-896d2db706d8" containerID="772b002d40245f8da0b5197a01d0e6ef213b4ad63d84975dc5b67c4131b51e39" exitCode=0 Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.832399 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c76-account-create-update-st5sb" event={"ID":"0de45488-2c4b-408e-810b-896d2db706d8","Type":"ContainerDied","Data":"772b002d40245f8da0b5197a01d0e6ef213b4ad63d84975dc5b67c4131b51e39"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.832503 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c76-account-create-update-st5sb" event={"ID":"0de45488-2c4b-408e-810b-896d2db706d8","Type":"ContainerStarted","Data":"2a51400e3aa7612c7065ac4e93003c90f10c9233339934b5225ba9f0fa826dd7"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.833908 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c6b-account-create-update-kb6fq" event={"ID":"0d04ce23-86cb-48ed-9d9b-59bb3b650da2","Type":"ContainerStarted","Data":"57244970037b08a10bfce5d09153a60d6d4d0743e6dcb260784dd10814796f07"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.834024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c6b-account-create-update-kb6fq" event={"ID":"0d04ce23-86cb-48ed-9d9b-59bb3b650da2","Type":"ContainerStarted","Data":"482c39b0c92d76339f75e14fee23b7c8d1d58f3be249b57e131a32a848b560f9"} Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.881557 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5c6b-account-create-update-kb6fq" podStartSLOduration=1.881539166 podStartE2EDuration="1.881539166s" podCreationTimestamp="2025-12-05 16:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:35:53.880699693 +0000 UTC m=+1214.119373205" 
watchObservedRunningTime="2025-12-05 16:35:53.881539166 +0000 UTC m=+1214.120212678" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.895354 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-ts8rq" podStartSLOduration=1.895336608 podStartE2EDuration="1.895336608s" podCreationTimestamp="2025-12-05 16:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:35:53.894143676 +0000 UTC m=+1214.132817188" watchObservedRunningTime="2025-12-05 16:35:53.895336608 +0000 UTC m=+1214.134010120" Dec 05 16:35:53 crc kubenswrapper[4756]: I1205 16:35:53.943783 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1a2d-account-create-update-5jzww"] Dec 05 16:35:54 crc kubenswrapper[4756]: I1205 16:35:54.841603 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a2d-account-create-update-5jzww" event={"ID":"669dd164-f2eb-4cc7-bc41-c12a82bb74b7","Type":"ContainerStarted","Data":"ffe12582c087f36dcf1ccc4b2569e203bbdd0e871d37f6f302894e91b850d9b4"} Dec 05 16:35:54 crc kubenswrapper[4756]: I1205 16:35:54.841977 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a2d-account-create-update-5jzww" event={"ID":"669dd164-f2eb-4cc7-bc41-c12a82bb74b7","Type":"ContainerStarted","Data":"9f6c0dde3ade1a62aa58138602203ae8349b3e702f3f579a5e8fedfb3b56a080"} Dec 05 16:35:54 crc kubenswrapper[4756]: I1205 16:35:54.843171 4756 generic.go:334] "Generic (PLEG): container finished" podID="f13540dc-b5e4-4449-9abb-160846406121" containerID="db2c602af5ae592e04d0c4437ed7ec9413a4df6cdb793118baacd952ab4a6512" exitCode=0 Dec 05 16:35:54 crc kubenswrapper[4756]: I1205 16:35:54.843225 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ts8rq" event={"ID":"f13540dc-b5e4-4449-9abb-160846406121","Type":"ContainerDied","Data":"db2c602af5ae592e04d0c4437ed7ec9413a4df6cdb793118baacd952ab4a6512"} Dec 05 16:35:54 crc kubenswrapper[4756]: I1205 16:35:54.844487 4756 generic.go:334] "Generic (PLEG): container finished" podID="0d04ce23-86cb-48ed-9d9b-59bb3b650da2" containerID="57244970037b08a10bfce5d09153a60d6d4d0743e6dcb260784dd10814796f07" exitCode=0 Dec 05 16:35:54 crc kubenswrapper[4756]: I1205 16:35:54.844551 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c6b-account-create-update-kb6fq" event={"ID":"0d04ce23-86cb-48ed-9d9b-59bb3b650da2","Type":"ContainerDied","Data":"57244970037b08a10bfce5d09153a60d6d4d0743e6dcb260784dd10814796f07"} Dec 05 16:35:54 crc kubenswrapper[4756]: I1205 16:35:54.862862 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1a2d-account-create-update-5jzww" podStartSLOduration=2.862845055 podStartE2EDuration="2.862845055s" podCreationTimestamp="2025-12-05 16:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:35:54.857721257 +0000 UTC m=+1215.096394779" watchObservedRunningTime="2025-12-05 16:35:54.862845055 +0000 UTC m=+1215.101518567" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.336473 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.341236 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.348444 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.455037 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kljj2\" (UniqueName: \"kubernetes.io/projected/45e502c2-2f9c-4390-b3ea-ed099cd2111b-kube-api-access-kljj2\") pod \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\" (UID: \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\") " Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.455183 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rbs\" (UniqueName: \"kubernetes.io/projected/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-kube-api-access-z5rbs\") pod \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\" (UID: \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\") " Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.455358 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-operator-scripts\") pod \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\" (UID: \"7a3091dc-ae5e-4f60-9e11-a07f76cff76f\") " Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.455502 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de45488-2c4b-408e-810b-896d2db706d8-operator-scripts\") pod \"0de45488-2c4b-408e-810b-896d2db706d8\" (UID: \"0de45488-2c4b-408e-810b-896d2db706d8\") " Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.455558 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45e502c2-2f9c-4390-b3ea-ed099cd2111b-operator-scripts\") pod \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\" (UID: \"45e502c2-2f9c-4390-b3ea-ed099cd2111b\") " Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.455601 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpsbm\" (UniqueName: \"kubernetes.io/projected/0de45488-2c4b-408e-810b-896d2db706d8-kube-api-access-gpsbm\") pod \"0de45488-2c4b-408e-810b-896d2db706d8\" (UID: \"0de45488-2c4b-408e-810b-896d2db706d8\") " Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.456023 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0de45488-2c4b-408e-810b-896d2db706d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0de45488-2c4b-408e-810b-896d2db706d8" (UID: "0de45488-2c4b-408e-810b-896d2db706d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.456034 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a3091dc-ae5e-4f60-9e11-a07f76cff76f" (UID: "7a3091dc-ae5e-4f60-9e11-a07f76cff76f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.456075 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e502c2-2f9c-4390-b3ea-ed099cd2111b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45e502c2-2f9c-4390-b3ea-ed099cd2111b" (UID: "45e502c2-2f9c-4390-b3ea-ed099cd2111b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.461215 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de45488-2c4b-408e-810b-896d2db706d8-kube-api-access-gpsbm" (OuterVolumeSpecName: "kube-api-access-gpsbm") pod "0de45488-2c4b-408e-810b-896d2db706d8" (UID: "0de45488-2c4b-408e-810b-896d2db706d8"). InnerVolumeSpecName "kube-api-access-gpsbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.463731 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-kube-api-access-z5rbs" (OuterVolumeSpecName: "kube-api-access-z5rbs") pod "7a3091dc-ae5e-4f60-9e11-a07f76cff76f" (UID: "7a3091dc-ae5e-4f60-9e11-a07f76cff76f"). InnerVolumeSpecName "kube-api-access-z5rbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.463844 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e502c2-2f9c-4390-b3ea-ed099cd2111b-kube-api-access-kljj2" (OuterVolumeSpecName: "kube-api-access-kljj2") pod "45e502c2-2f9c-4390-b3ea-ed099cd2111b" (UID: "45e502c2-2f9c-4390-b3ea-ed099cd2111b"). InnerVolumeSpecName "kube-api-access-kljj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.557531 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de45488-2c4b-408e-810b-896d2db706d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.557582 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45e502c2-2f9c-4390-b3ea-ed099cd2111b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.557600 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpsbm\" (UniqueName: \"kubernetes.io/projected/0de45488-2c4b-408e-810b-896d2db706d8-kube-api-access-gpsbm\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.557620 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kljj2\" (UniqueName: \"kubernetes.io/projected/45e502c2-2f9c-4390-b3ea-ed099cd2111b-kube-api-access-kljj2\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.557637 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5rbs\" (UniqueName: \"kubernetes.io/projected/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-kube-api-access-z5rbs\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.557655 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a3091dc-ae5e-4f60-9e11-a07f76cff76f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.855064 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-qwgf6" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.855020 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qwgf6" event={"ID":"7a3091dc-ae5e-4f60-9e11-a07f76cff76f","Type":"ContainerDied","Data":"4a40909f83c613ea17688db1f05014e28287eafdcca4ac2f87505d1329e96ac7"} Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.855235 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a40909f83c613ea17688db1f05014e28287eafdcca4ac2f87505d1329e96ac7" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.857279 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c76-account-create-update-st5sb" event={"ID":"0de45488-2c4b-408e-810b-896d2db706d8","Type":"ContainerDied","Data":"2a51400e3aa7612c7065ac4e93003c90f10c9233339934b5225ba9f0fa826dd7"} Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.857323 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a51400e3aa7612c7065ac4e93003c90f10c9233339934b5225ba9f0fa826dd7" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.857373 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0c76-account-create-update-st5sb" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.858800 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6hl4j" event={"ID":"45e502c2-2f9c-4390-b3ea-ed099cd2111b","Type":"ContainerDied","Data":"526ac285eb4d3604b1c95c5a019d5569dbab26b33f5a5a398c6f978829ca45ab"} Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.858872 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="526ac285eb4d3604b1c95c5a019d5569dbab26b33f5a5a398c6f978829ca45ab" Dec 05 16:35:55 crc kubenswrapper[4756]: I1205 16:35:55.858956 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6hl4j" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.177631 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.195680 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.371802 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcdqz\" (UniqueName: \"kubernetes.io/projected/f13540dc-b5e4-4449-9abb-160846406121-kube-api-access-qcdqz\") pod \"f13540dc-b5e4-4449-9abb-160846406121\" (UID: \"f13540dc-b5e4-4449-9abb-160846406121\") " Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.372098 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-operator-scripts\") pod \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\" (UID: \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\") " Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.372133 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13540dc-b5e4-4449-9abb-160846406121-operator-scripts\") pod \"f13540dc-b5e4-4449-9abb-160846406121\" (UID: \"f13540dc-b5e4-4449-9abb-160846406121\") " Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.372199 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkrhz\" (UniqueName: \"kubernetes.io/projected/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-kube-api-access-gkrhz\") pod \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\" (UID: \"0d04ce23-86cb-48ed-9d9b-59bb3b650da2\") " Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.373103 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d04ce23-86cb-48ed-9d9b-59bb3b650da2" (UID: "0d04ce23-86cb-48ed-9d9b-59bb3b650da2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.373939 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13540dc-b5e4-4449-9abb-160846406121-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f13540dc-b5e4-4449-9abb-160846406121" (UID: "f13540dc-b5e4-4449-9abb-160846406121"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.397319 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13540dc-b5e4-4449-9abb-160846406121-kube-api-access-qcdqz" (OuterVolumeSpecName: "kube-api-access-qcdqz") pod "f13540dc-b5e4-4449-9abb-160846406121" (UID: "f13540dc-b5e4-4449-9abb-160846406121"). InnerVolumeSpecName "kube-api-access-qcdqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.425342 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-kube-api-access-gkrhz" (OuterVolumeSpecName: "kube-api-access-gkrhz") pod "0d04ce23-86cb-48ed-9d9b-59bb3b650da2" (UID: "0d04ce23-86cb-48ed-9d9b-59bb3b650da2"). InnerVolumeSpecName "kube-api-access-gkrhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.474545 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcdqz\" (UniqueName: \"kubernetes.io/projected/f13540dc-b5e4-4449-9abb-160846406121-kube-api-access-qcdqz\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.474584 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.474592 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13540dc-b5e4-4449-9abb-160846406121-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.474602 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkrhz\" (UniqueName: \"kubernetes.io/projected/0d04ce23-86cb-48ed-9d9b-59bb3b650da2-kube-api-access-gkrhz\") on node \"crc\" DevicePath \"\"" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.874094 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ts8rq" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.874042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ts8rq" event={"ID":"f13540dc-b5e4-4449-9abb-160846406121","Type":"ContainerDied","Data":"749ae20512a60ee40966ab69efadc79288ce9af1877633c1c71f8b6e582054e8"} Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.874335 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="749ae20512a60ee40966ab69efadc79288ce9af1877633c1c71f8b6e582054e8" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.876857 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c6b-account-create-update-kb6fq" event={"ID":"0d04ce23-86cb-48ed-9d9b-59bb3b650da2","Type":"ContainerDied","Data":"482c39b0c92d76339f75e14fee23b7c8d1d58f3be249b57e131a32a848b560f9"} Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.876904 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="482c39b0c92d76339f75e14fee23b7c8d1d58f3be249b57e131a32a848b560f9" Dec 05 16:35:56 crc kubenswrapper[4756]: I1205 16:35:56.877002 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5c6b-account-create-update-kb6fq" Dec 05 16:35:58 crc kubenswrapper[4756]: I1205 16:35:58.894732 4756 generic.go:334] "Generic (PLEG): container finished" podID="669dd164-f2eb-4cc7-bc41-c12a82bb74b7" containerID="ffe12582c087f36dcf1ccc4b2569e203bbdd0e871d37f6f302894e91b850d9b4" exitCode=0 Dec 05 16:35:58 crc kubenswrapper[4756]: I1205 16:35:58.894824 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a2d-account-create-update-5jzww" event={"ID":"669dd164-f2eb-4cc7-bc41-c12a82bb74b7","Type":"ContainerDied","Data":"ffe12582c087f36dcf1ccc4b2569e203bbdd0e871d37f6f302894e91b850d9b4"} Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.252563 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.337565 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpj68\" (UniqueName: \"kubernetes.io/projected/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-kube-api-access-wpj68\") pod \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\" (UID: \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\") " Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.337781 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-operator-scripts\") pod \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\" (UID: \"669dd164-f2eb-4cc7-bc41-c12a82bb74b7\") " Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.338698 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "669dd164-f2eb-4cc7-bc41-c12a82bb74b7" (UID: "669dd164-f2eb-4cc7-bc41-c12a82bb74b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.342926 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-kube-api-access-wpj68" (OuterVolumeSpecName: "kube-api-access-wpj68") pod "669dd164-f2eb-4cc7-bc41-c12a82bb74b7" (UID: "669dd164-f2eb-4cc7-bc41-c12a82bb74b7"). InnerVolumeSpecName "kube-api-access-wpj68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.440497 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpj68\" (UniqueName: \"kubernetes.io/projected/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-kube-api-access-wpj68\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.440821 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/669dd164-f2eb-4cc7-bc41-c12a82bb74b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.912261 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a2d-account-create-update-5jzww" event={"ID":"669dd164-f2eb-4cc7-bc41-c12a82bb74b7","Type":"ContainerDied","Data":"9f6c0dde3ade1a62aa58138602203ae8349b3e702f3f579a5e8fedfb3b56a080"} Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.912332 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6c0dde3ade1a62aa58138602203ae8349b3e702f3f579a5e8fedfb3b56a080" Dec 05 16:36:00 crc kubenswrapper[4756]: I1205 16:36:00.912336 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1a2d-account-create-update-5jzww" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.038916 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mvn92"] Dec 05 16:36:03 crc kubenswrapper[4756]: E1205 16:36:03.039352 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de45488-2c4b-408e-810b-896d2db706d8" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039368 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de45488-2c4b-408e-810b-896d2db706d8" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: E1205 16:36:03.039386 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d04ce23-86cb-48ed-9d9b-59bb3b650da2" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039396 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d04ce23-86cb-48ed-9d9b-59bb3b650da2" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: E1205 16:36:03.039408 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3091dc-ae5e-4f60-9e11-a07f76cff76f" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039438 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3091dc-ae5e-4f60-9e11-a07f76cff76f" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: E1205 16:36:03.039454 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e502c2-2f9c-4390-b3ea-ed099cd2111b" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039461 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e502c2-2f9c-4390-b3ea-ed099cd2111b" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: E1205 16:36:03.039470 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bd0441-24bc-48f4-aeba-006c3b77419e" containerName="swift-ring-rebalance" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039477 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bd0441-24bc-48f4-aeba-006c3b77419e" 
containerName="swift-ring-rebalance" Dec 05 16:36:03 crc kubenswrapper[4756]: E1205 16:36:03.039497 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669dd164-f2eb-4cc7-bc41-c12a82bb74b7" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039504 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="669dd164-f2eb-4cc7-bc41-c12a82bb74b7" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: E1205 16:36:03.039514 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13540dc-b5e4-4449-9abb-160846406121" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039521 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13540dc-b5e4-4449-9abb-160846406121" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039699 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="669dd164-f2eb-4cc7-bc41-c12a82bb74b7" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039710 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e502c2-2f9c-4390-b3ea-ed099cd2111b" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039719 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0de45488-2c4b-408e-810b-896d2db706d8" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039729 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bd0441-24bc-48f4-aeba-006c3b77419e" containerName="swift-ring-rebalance" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039736 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13540dc-b5e4-4449-9abb-160846406121" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039743 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d04ce23-86cb-48ed-9d9b-59bb3b650da2" containerName="mariadb-account-create-update" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.039753 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3091dc-ae5e-4f60-9e11-a07f76cff76f" containerName="mariadb-database-create" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.040280 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.042027 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.042278 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-r2jl8" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.051708 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mvn92"] Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.090715 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m55h9\" (UniqueName: \"kubernetes.io/projected/dbf5eba8-7c13-43e6-9777-978152dff744-kube-api-access-m55h9\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.090771 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-config-data\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.090947 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-db-sync-config-data\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.091120 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-combined-ca-bundle\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.192115 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-db-sync-config-data\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.192204 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-combined-ca-bundle\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.192239 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m55h9\" (UniqueName: \"kubernetes.io/projected/dbf5eba8-7c13-43e6-9777-978152dff744-kube-api-access-m55h9\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.192271 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-config-data\") pod 
\"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.197746 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-config-data\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.198486 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-db-sync-config-data\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.210655 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-combined-ca-bundle\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.211269 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m55h9\" (UniqueName: \"kubernetes.io/projected/dbf5eba8-7c13-43e6-9777-978152dff744-kube-api-access-m55h9\") pod \"glance-db-sync-mvn92\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.239660 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vsmjz" podUID="359e87f3-8fb4-4476-a538-13c948b65885" containerName="ovn-controller" probeResult="failure" output=< Dec 05 16:36:03 crc kubenswrapper[4756]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 16:36:03 crc kubenswrapper[4756]: > Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.286926 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.291211 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.363698 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mvn92" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.521504 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vsmjz-config-lm7gn"] Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.523177 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.525498 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.549549 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vsmjz-config-lm7gn"] Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.600592 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g55cc\" (UniqueName: \"kubernetes.io/projected/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-kube-api-access-g55cc\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.600642 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run-ovn\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.600682 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-log-ovn\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.600877 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-additional-scripts\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.600919 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.600974 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-scripts\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.702691 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-scripts\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.702812 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g55cc\" (UniqueName: 
\"kubernetes.io/projected/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-kube-api-access-g55cc\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.702854 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run-ovn\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.702876 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-log-ovn\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.702914 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-additional-scripts\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.702931 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.703244 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.703247 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run-ovn\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.703302 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-log-ovn\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.703830 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-additional-scripts\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.704650 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-scripts\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.720641 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g55cc\" (UniqueName: \"kubernetes.io/projected/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-kube-api-access-g55cc\") pod \"ovn-controller-vsmjz-config-lm7gn\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.851943 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.887937 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mvn92"] Dec 05 16:36:03 crc kubenswrapper[4756]: W1205 16:36:03.890710 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbf5eba8_7c13_43e6_9777_978152dff744.slice/crio-7852cd504a8ea5683d52125399c3ee70e331ae56ebb9e8c15f17434e55fcade5 WatchSource:0}: Error finding container 7852cd504a8ea5683d52125399c3ee70e331ae56ebb9e8c15f17434e55fcade5: Status 404 returned error can't find the container with id 7852cd504a8ea5683d52125399c3ee70e331ae56ebb9e8c15f17434e55fcade5 Dec 05 16:36:03 crc kubenswrapper[4756]: I1205 16:36:03.940585 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mvn92" event={"ID":"dbf5eba8-7c13-43e6-9777-978152dff744","Type":"ContainerStarted","Data":"7852cd504a8ea5683d52125399c3ee70e331ae56ebb9e8c15f17434e55fcade5"} Dec 05 16:36:04 crc kubenswrapper[4756]: I1205 16:36:04.275209 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vsmjz-config-lm7gn"] Dec 05 16:36:04 crc kubenswrapper[4756]: I1205 16:36:04.950470 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsmjz-config-lm7gn" event={"ID":"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e","Type":"ContainerStarted","Data":"afb8dc1e4dccb3c479604db1131a8871ec7c23c250a85cbacaaa1b5e642a404b"} Dec 05 16:36:06 crc kubenswrapper[4756]: I1205 16:36:06.968172 4756 generic.go:334] "Generic (PLEG): container finished" podID="11d0ad48-c987-4415-8536-b150536534a5" containerID="2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b" exitCode=0 Dec 05 16:36:06 crc kubenswrapper[4756]: I1205 16:36:06.968309 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11d0ad48-c987-4415-8536-b150536534a5","Type":"ContainerDied","Data":"2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b"} Dec 05 16:36:07 crc kubenswrapper[4756]: I1205 16:36:07.573823 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " pod="openstack/swift-storage-0" Dec 05 16:36:07 crc kubenswrapper[4756]: I1205 16:36:07.581408 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"swift-storage-0\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " 
pod="openstack/swift-storage-0" Dec 05 16:36:07 crc kubenswrapper[4756]: I1205 16:36:07.684198 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 05 16:36:07 crc kubenswrapper[4756]: I1205 16:36:07.983784 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11d0ad48-c987-4415-8536-b150536534a5","Type":"ContainerStarted","Data":"b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff"} Dec 05 16:36:07 crc kubenswrapper[4756]: I1205 16:36:07.984340 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 16:36:07 crc kubenswrapper[4756]: I1205 16:36:07.989876 4756 generic.go:334] "Generic (PLEG): container finished" podID="9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" containerID="01bd2e3d0a47c24809786f02ba0d948624a24bcd0220147dcc5a098acfd29827" exitCode=0 Dec 05 16:36:07 crc kubenswrapper[4756]: I1205 16:36:07.989922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsmjz-config-lm7gn" event={"ID":"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e","Type":"ContainerDied","Data":"01bd2e3d0a47c24809786f02ba0d948624a24bcd0220147dcc5a098acfd29827"} Dec 05 16:36:08 crc kubenswrapper[4756]: I1205 16:36:08.009010 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.026372372 podStartE2EDuration="1m11.008972933s" podCreationTimestamp="2025-12-05 16:34:57 +0000 UTC" firstStartedPulling="2025-12-05 16:34:59.473480072 +0000 UTC m=+1159.712153584" lastFinishedPulling="2025-12-05 16:35:32.456080633 +0000 UTC m=+1192.694754145" observedRunningTime="2025-12-05 16:36:08.008743247 +0000 UTC m=+1228.247416779" watchObservedRunningTime="2025-12-05 16:36:08.008972933 +0000 UTC m=+1228.247646465" Dec 05 16:36:08 crc kubenswrapper[4756]: I1205 16:36:08.223879 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 16:36:08 crc kubenswrapper[4756]: I1205 16:36:08.226560 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vsmjz" Dec 05 16:36:08 crc kubenswrapper[4756]: W1205 16:36:08.232174 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1da6dea0_ecf9_4bec_9c85_2fddcc0fae40.slice/crio-b185d3cc3e40a67cd7e7744ce5c71c3f1698e265dfcf8d008203d209df117427 WatchSource:0}: Error finding container b185d3cc3e40a67cd7e7744ce5c71c3f1698e265dfcf8d008203d209df117427: Status 404 returned error can't find the container with id b185d3cc3e40a67cd7e7744ce5c71c3f1698e265dfcf8d008203d209df117427 Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.007174 4756 generic.go:334] "Generic (PLEG): container finished" podID="331486fe-c403-4516-9568-862200ec0672" containerID="dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc" exitCode=0 Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.007266 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"331486fe-c403-4516-9568-862200ec0672","Type":"ContainerDied","Data":"dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc"} Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.014433 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"b185d3cc3e40a67cd7e7744ce5c71c3f1698e265dfcf8d008203d209df117427"} Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.595973 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.717550 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-scripts\") pod \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.717673 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g55cc\" (UniqueName: \"kubernetes.io/projected/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-kube-api-access-g55cc\") pod \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.717715 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run-ovn\") pod \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.717781 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-log-ovn\") pod \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.717820 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run\") pod \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.717895 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-additional-scripts\") pod \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\" (UID: \"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e\") " Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.718158 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" (UID: "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.718197 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run" (OuterVolumeSpecName: "var-run") pod "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" (UID: "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.718222 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" (UID: "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.718494 4756 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.718516 4756 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.718526 4756 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.718761 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" (UID: "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.719013 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-scripts" (OuterVolumeSpecName: "scripts") pod "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" (UID: "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.723859 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-kube-api-access-g55cc" (OuterVolumeSpecName: "kube-api-access-g55cc") pod "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" (UID: "9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e"). InnerVolumeSpecName "kube-api-access-g55cc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.820194 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.820246 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g55cc\" (UniqueName: \"kubernetes.io/projected/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-kube-api-access-g55cc\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:09 crc kubenswrapper[4756]: I1205 16:36:09.820258 4756 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.034922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"331486fe-c403-4516-9568-862200ec0672","Type":"ContainerStarted","Data":"fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15"} Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.036099 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.039683 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f"} Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.039719 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4"} Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.043987 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsmjz-config-lm7gn" event={"ID":"9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e","Type":"ContainerDied","Data":"afb8dc1e4dccb3c479604db1131a8871ec7c23c250a85cbacaaa1b5e642a404b"} Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.044031 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb8dc1e4dccb3c479604db1131a8871ec7c23c250a85cbacaaa1b5e642a404b" Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.044125 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vsmjz-config-lm7gn" Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.073822 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371964.780972 podStartE2EDuration="1m12.073803624s" podCreationTimestamp="2025-12-05 16:34:58 +0000 UTC" firstStartedPulling="2025-12-05 16:35:00.531609494 +0000 UTC m=+1160.770282996" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:10.072565951 +0000 UTC m=+1230.311239513" watchObservedRunningTime="2025-12-05 16:36:10.073803624 +0000 UTC m=+1230.312477156" Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.692662 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vsmjz-config-lm7gn"] Dec 05 16:36:10 crc kubenswrapper[4756]: I1205 16:36:10.702866 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vsmjz-config-lm7gn"] Dec 05 16:36:11 crc kubenswrapper[4756]: I1205 16:36:11.054807 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75"} Dec 05 16:36:11 crc kubenswrapper[4756]: I1205 16:36:11.054875 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5"} Dec 05 16:36:12 crc kubenswrapper[4756]: I1205 16:36:12.061229 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" path="/var/lib/kubelet/pods/9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e/volumes" Dec 05 16:36:12 crc kubenswrapper[4756]: I1205 16:36:12.069818 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46"} Dec 05 16:36:12 crc kubenswrapper[4756]: I1205 16:36:12.069879 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2"} Dec 05 16:36:18 crc kubenswrapper[4756]: I1205 16:36:18.707040 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="11d0ad48-c987-4415-8536-b150536534a5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Dec 05 16:36:19 crc kubenswrapper[4756]: I1205 16:36:19.836486 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="331486fe-c403-4516-9568-862200ec0672" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Dec 05 16:36:22 crc kubenswrapper[4756]: I1205 16:36:22.295271 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:36:22 crc kubenswrapper[4756]: I1205 16:36:22.295708 4756 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:36:22 crc kubenswrapper[4756]: E1205 16:36:22.540603 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63" Dec 05 16:36:22 crc kubenswrapper[4756]: E1205 16:36:22.540853 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m55h9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-mvn92_openstack(dbf5eba8-7c13-43e6-9777-978152dff744): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:36:22 crc kubenswrapper[4756]: E1205 16:36:22.542065 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-mvn92" podUID="dbf5eba8-7c13-43e6-9777-978152dff744" Dec 05 16:36:23 crc kubenswrapper[4756]: I1205 16:36:23.170652 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1"} Dec 05 16:36:23 crc kubenswrapper[4756]: I1205 16:36:23.171056 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a"} Dec 05 16:36:23 crc kubenswrapper[4756]: E1205 16:36:23.171779 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63\\\"\"" pod="openstack/glance-db-sync-mvn92" podUID="dbf5eba8-7c13-43e6-9777-978152dff744" Dec 05 16:36:25 crc kubenswrapper[4756]: I1205 16:36:25.189707 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102"} Dec 05 16:36:26 crc kubenswrapper[4756]: I1205 16:36:26.211114 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded"} Dec 05 16:36:26 crc kubenswrapper[4756]: I1205 16:36:26.211163 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45"} Dec 05 16:36:26 crc kubenswrapper[4756]: I1205 16:36:26.211174 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345"} Dec 05 16:36:26 crc kubenswrapper[4756]: I1205 16:36:26.211187 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1"} Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.227445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4"} Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.227789 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerStarted","Data":"9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7"} Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.271919 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.923177955 podStartE2EDuration="53.271902789s" podCreationTimestamp="2025-12-05 16:35:34 +0000 UTC" firstStartedPulling="2025-12-05 16:36:08.233722075 +0000 UTC m=+1228.472395587" lastFinishedPulling="2025-12-05 16:36:24.582446909 +0000 UTC m=+1244.821120421" observedRunningTime="2025-12-05 16:36:27.264596981 +0000 UTC m=+1247.503270513" 
watchObservedRunningTime="2025-12-05 16:36:27.271902789 +0000 UTC m=+1247.510576301" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.513247 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-779c5847bc-x2p9h"] Dec 05 16:36:27 crc kubenswrapper[4756]: E1205 16:36:27.514296 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" containerName="ovn-config" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.514397 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" containerName="ovn-config" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.514750 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9197ec16-b4dd-4b9d-bd2d-b2e52e44ef9e" containerName="ovn-config" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.515960 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.518139 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.521037 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779c5847bc-x2p9h"] Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.642345 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcdnw\" (UniqueName: \"kubernetes.io/projected/bee23650-0d6d-495a-af65-7e818117b368-kube-api-access-bcdnw\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.642435 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-config\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.642506 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-svc\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.642976 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-swift-storage-0\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.643015 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-sb\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.643037 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-nb\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.744450 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcdnw\" (UniqueName: \"kubernetes.io/projected/bee23650-0d6d-495a-af65-7e818117b368-kube-api-access-bcdnw\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.744769 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-config\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.744905 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-svc\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.745023 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-swift-storage-0\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.745100 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-sb\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.745178 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-nb\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.745882 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-config\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.746051 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-svc\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.746050 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-swift-storage-0\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.746558 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-sb\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.746779 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-nb\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.782204 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcdnw\" (UniqueName: \"kubernetes.io/projected/bee23650-0d6d-495a-af65-7e818117b368-kube-api-access-bcdnw\") pod \"dnsmasq-dns-779c5847bc-x2p9h\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:27 crc kubenswrapper[4756]: I1205 16:36:27.843601 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:28 crc kubenswrapper[4756]: I1205 16:36:28.128371 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779c5847bc-x2p9h"] Dec 05 16:36:28 crc kubenswrapper[4756]: W1205 16:36:28.136170 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbee23650_0d6d_495a_af65_7e818117b368.slice/crio-dab48f7c5441103ae23dd8f63c211847faa8a3d516fa2015e57049f5e971de1d WatchSource:0}: Error finding container dab48f7c5441103ae23dd8f63c211847faa8a3d516fa2015e57049f5e971de1d: Status 404 returned error can't find the container with id dab48f7c5441103ae23dd8f63c211847faa8a3d516fa2015e57049f5e971de1d Dec 05 16:36:28 crc kubenswrapper[4756]: I1205 16:36:28.243287 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" event={"ID":"bee23650-0d6d-495a-af65-7e818117b368","Type":"ContainerStarted","Data":"dab48f7c5441103ae23dd8f63c211847faa8a3d516fa2015e57049f5e971de1d"} Dec 05 16:36:28 crc kubenswrapper[4756]: I1205 16:36:28.706595 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.015022 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-csgm4"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.016309 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.028048 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-csgm4"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.067072 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4319acb7-6d0a-4954-9d76-e698dde5536b-operator-scripts\") pod \"cinder-db-create-csgm4\" (UID: \"4319acb7-6d0a-4954-9d76-e698dde5536b\") " pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.067227 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j488w\" (UniqueName: \"kubernetes.io/projected/4319acb7-6d0a-4954-9d76-e698dde5536b-kube-api-access-j488w\") pod \"cinder-db-create-csgm4\" (UID: \"4319acb7-6d0a-4954-9d76-e698dde5536b\") " pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.110776 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c3ab-account-create-update-s6vq9"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.112175 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.117982 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.124623 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-468zf"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.125637 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-468zf" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.128428 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-468zf"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.142452 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c3ab-account-create-update-s6vq9"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.169720 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j488w\" (UniqueName: \"kubernetes.io/projected/4319acb7-6d0a-4954-9d76-e698dde5536b-kube-api-access-j488w\") pod \"cinder-db-create-csgm4\" (UID: \"4319acb7-6d0a-4954-9d76-e698dde5536b\") " pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.169826 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7af33e-bc54-4dde-9b49-ea742c331563-operator-scripts\") pod \"barbican-db-create-468zf\" (UID: \"7e7af33e-bc54-4dde-9b49-ea742c331563\") " pod="openstack/barbican-db-create-468zf" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.169928 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt6s6\" (UniqueName: \"kubernetes.io/projected/3eb981f4-baed-45b9-856b-cda56ded1142-kube-api-access-vt6s6\") pod \"barbican-c3ab-account-create-update-s6vq9\" (UID: \"3eb981f4-baed-45b9-856b-cda56ded1142\") " pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.169956 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5n5s\" (UniqueName: \"kubernetes.io/projected/7e7af33e-bc54-4dde-9b49-ea742c331563-kube-api-access-s5n5s\") pod \"barbican-db-create-468zf\" (UID: \"7e7af33e-bc54-4dde-9b49-ea742c331563\") " pod="openstack/barbican-db-create-468zf" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.169988 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4319acb7-6d0a-4954-9d76-e698dde5536b-operator-scripts\") pod \"cinder-db-create-csgm4\" (UID: \"4319acb7-6d0a-4954-9d76-e698dde5536b\") " pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.170018 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb981f4-baed-45b9-856b-cda56ded1142-operator-scripts\") pod \"barbican-c3ab-account-create-update-s6vq9\" (UID: \"3eb981f4-baed-45b9-856b-cda56ded1142\") " pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.171177 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4319acb7-6d0a-4954-9d76-e698dde5536b-operator-scripts\") pod \"cinder-db-create-csgm4\" (UID: \"4319acb7-6d0a-4954-9d76-e698dde5536b\") " pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.220708 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j488w\" (UniqueName: \"kubernetes.io/projected/4319acb7-6d0a-4954-9d76-e698dde5536b-kube-api-access-j488w\") pod \"cinder-db-create-csgm4\" (UID: 
\"4319acb7-6d0a-4954-9d76-e698dde5536b\") " pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.236370 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2230-account-create-update-lj8h4"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.237722 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.258363 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.274311 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt6s6\" (UniqueName: \"kubernetes.io/projected/3eb981f4-baed-45b9-856b-cda56ded1142-kube-api-access-vt6s6\") pod \"barbican-c3ab-account-create-update-s6vq9\" (UID: \"3eb981f4-baed-45b9-856b-cda56ded1142\") " pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.274357 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5n5s\" (UniqueName: \"kubernetes.io/projected/7e7af33e-bc54-4dde-9b49-ea742c331563-kube-api-access-s5n5s\") pod \"barbican-db-create-468zf\" (UID: \"7e7af33e-bc54-4dde-9b49-ea742c331563\") " pod="openstack/barbican-db-create-468zf" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.274406 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb981f4-baed-45b9-856b-cda56ded1142-operator-scripts\") pod \"barbican-c3ab-account-create-update-s6vq9\" (UID: \"3eb981f4-baed-45b9-856b-cda56ded1142\") " pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.274519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7af33e-bc54-4dde-9b49-ea742c331563-operator-scripts\") pod \"barbican-db-create-468zf\" (UID: \"7e7af33e-bc54-4dde-9b49-ea742c331563\") " pod="openstack/barbican-db-create-468zf" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.275163 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7af33e-bc54-4dde-9b49-ea742c331563-operator-scripts\") pod \"barbican-db-create-468zf\" (UID: \"7e7af33e-bc54-4dde-9b49-ea742c331563\") " pod="openstack/barbican-db-create-468zf" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.276024 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb981f4-baed-45b9-856b-cda56ded1142-operator-scripts\") pod \"barbican-c3ab-account-create-update-s6vq9\" (UID: \"3eb981f4-baed-45b9-856b-cda56ded1142\") " pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.276994 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2230-account-create-update-lj8h4"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.329990 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt6s6\" (UniqueName: \"kubernetes.io/projected/3eb981f4-baed-45b9-856b-cda56ded1142-kube-api-access-vt6s6\") pod \"barbican-c3ab-account-create-update-s6vq9\" (UID: 
\"3eb981f4-baed-45b9-856b-cda56ded1142\") " pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.337961 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5n5s\" (UniqueName: \"kubernetes.io/projected/7e7af33e-bc54-4dde-9b49-ea742c331563-kube-api-access-s5n5s\") pod \"barbican-db-create-468zf\" (UID: \"7e7af33e-bc54-4dde-9b49-ea742c331563\") " pod="openstack/barbican-db-create-468zf" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.338926 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.376374 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fe1100-e673-4067-84f1-30bbfc2e4a0c-operator-scripts\") pod \"cinder-2230-account-create-update-lj8h4\" (UID: \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\") " pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.376455 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52kpc\" (UniqueName: \"kubernetes.io/projected/36fe1100-e673-4067-84f1-30bbfc2e4a0c-kube-api-access-52kpc\") pod \"cinder-2230-account-create-update-lj8h4\" (UID: \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\") " pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.452261 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wj2j9"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.453295 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.463785 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bc5a-account-create-update-rvl5h"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.465121 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.468709 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.469089 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.478262 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fe1100-e673-4067-84f1-30bbfc2e4a0c-operator-scripts\") pod \"cinder-2230-account-create-update-lj8h4\" (UID: \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\") " pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.478317 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52kpc\" (UniqueName: \"kubernetes.io/projected/36fe1100-e673-4067-84f1-30bbfc2e4a0c-kube-api-access-52kpc\") pod \"cinder-2230-account-create-update-lj8h4\" (UID: \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\") " pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.479260 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fe1100-e673-4067-84f1-30bbfc2e4a0c-operator-scripts\") pod \"cinder-2230-account-create-update-lj8h4\" (UID: \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\") " pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.480721 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-gmdc8"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.480871 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-468zf" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.482469 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.490953 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.491207 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.497445 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bc5a-account-create-update-rvl5h"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.501813 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rdq64" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.502400 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.508441 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wj2j9"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.543374 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-gmdc8"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.544941 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52kpc\" (UniqueName: \"kubernetes.io/projected/36fe1100-e673-4067-84f1-30bbfc2e4a0c-kube-api-access-52kpc\") pod \"cinder-2230-account-create-update-lj8h4\" (UID: \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\") " pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.569772 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.583756 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a485c78-6a69-4e0d-a3ab-0048b19ed955-operator-scripts\") pod \"neutron-bc5a-account-create-update-rvl5h\" (UID: \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\") " pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.583835 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d72ac3-7f79-4644-9195-ebc6ed861edd-operator-scripts\") pod \"neutron-db-create-wj2j9\" (UID: \"68d72ac3-7f79-4644-9195-ebc6ed861edd\") " pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.583867 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-combined-ca-bundle\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.583898 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmsk5\" (UniqueName: \"kubernetes.io/projected/68d72ac3-7f79-4644-9195-ebc6ed861edd-kube-api-access-wmsk5\") pod \"neutron-db-create-wj2j9\" (UID: \"68d72ac3-7f79-4644-9195-ebc6ed861edd\") " pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.584032 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-config-data\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.584072 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p27z9\" (UniqueName: \"kubernetes.io/projected/15e16ba1-ba6e-42bd-a187-a22710026feb-kube-api-access-p27z9\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.584120 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnb7j\" (UniqueName: \"kubernetes.io/projected/3a485c78-6a69-4e0d-a3ab-0048b19ed955-kube-api-access-lnb7j\") pod \"neutron-bc5a-account-create-update-rvl5h\" (UID: \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\") " pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.685955 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d72ac3-7f79-4644-9195-ebc6ed861edd-operator-scripts\") pod \"neutron-db-create-wj2j9\" (UID: \"68d72ac3-7f79-4644-9195-ebc6ed861edd\") " pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.686287 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-combined-ca-bundle\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.686310 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmsk5\" (UniqueName: \"kubernetes.io/projected/68d72ac3-7f79-4644-9195-ebc6ed861edd-kube-api-access-wmsk5\") pod \"neutron-db-create-wj2j9\" (UID: \"68d72ac3-7f79-4644-9195-ebc6ed861edd\") " pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.686372 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-config-data\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.686411 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p27z9\" (UniqueName: \"kubernetes.io/projected/15e16ba1-ba6e-42bd-a187-a22710026feb-kube-api-access-p27z9\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.686510 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnb7j\" (UniqueName: \"kubernetes.io/projected/3a485c78-6a69-4e0d-a3ab-0048b19ed955-kube-api-access-lnb7j\") pod \"neutron-bc5a-account-create-update-rvl5h\" (UID: \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\") " pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.686584 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a485c78-6a69-4e0d-a3ab-0048b19ed955-operator-scripts\") pod \"neutron-bc5a-account-create-update-rvl5h\" (UID: \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\") " pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.697621 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d72ac3-7f79-4644-9195-ebc6ed861edd-operator-scripts\") pod \"neutron-db-create-wj2j9\" (UID: \"68d72ac3-7f79-4644-9195-ebc6ed861edd\") " pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.697781 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a485c78-6a69-4e0d-a3ab-0048b19ed955-operator-scripts\") pod \"neutron-bc5a-account-create-update-rvl5h\" (UID: \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\") " pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.704708 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-combined-ca-bundle\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.716244 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-config-data\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.732860 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnb7j\" (UniqueName: \"kubernetes.io/projected/3a485c78-6a69-4e0d-a3ab-0048b19ed955-kube-api-access-lnb7j\") pod \"neutron-bc5a-account-create-update-rvl5h\" (UID: \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\") " pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.742706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmsk5\" (UniqueName: \"kubernetes.io/projected/68d72ac3-7f79-4644-9195-ebc6ed861edd-kube-api-access-wmsk5\") pod \"neutron-db-create-wj2j9\" (UID: \"68d72ac3-7f79-4644-9195-ebc6ed861edd\") " pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.758883 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p27z9\" (UniqueName: \"kubernetes.io/projected/15e16ba1-ba6e-42bd-a187-a22710026feb-kube-api-access-p27z9\") pod \"keystone-db-sync-gmdc8\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.785342 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-csgm4"] Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.801764 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.840653 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.854262 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.904137 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:29 crc kubenswrapper[4756]: I1205 16:36:29.934056 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c3ab-account-create-update-s6vq9"] Dec 05 16:36:29 crc kubenswrapper[4756]: W1205 16:36:29.947061 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb981f4_baed_45b9_856b_cda56ded1142.slice/crio-bc80c5efbdbb980d7abf1f8b43562f06927e8c3455fcd0f8603a260269c63139 WatchSource:0}: Error finding container bc80c5efbdbb980d7abf1f8b43562f06927e8c3455fcd0f8603a260269c63139: Status 404 returned error can't find the container with id bc80c5efbdbb980d7abf1f8b43562f06927e8c3455fcd0f8603a260269c63139 Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.190462 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2230-account-create-update-lj8h4"] Dec 05 16:36:30 crc kubenswrapper[4756]: W1205 16:36:30.192887 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36fe1100_e673_4067_84f1_30bbfc2e4a0c.slice/crio-3e1fd6375062dfa7ed9ae97a7e72568ff7e2e9cca0e40022109bb9253638cd50 WatchSource:0}: Error finding container 3e1fd6375062dfa7ed9ae97a7e72568ff7e2e9cca0e40022109bb9253638cd50: Status 404 returned error can't find the container with id 3e1fd6375062dfa7ed9ae97a7e72568ff7e2e9cca0e40022109bb9253638cd50 Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.263960 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2230-account-create-update-lj8h4" event={"ID":"36fe1100-e673-4067-84f1-30bbfc2e4a0c","Type":"ContainerStarted","Data":"3e1fd6375062dfa7ed9ae97a7e72568ff7e2e9cca0e40022109bb9253638cd50"} Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.271777 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-csgm4" event={"ID":"4319acb7-6d0a-4954-9d76-e698dde5536b","Type":"ContainerStarted","Data":"a416214046fd5de1212ab7414da17367d6cc94f7b4188017cc377e1f51d19f6a"} Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.273369 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c3ab-account-create-update-s6vq9" event={"ID":"3eb981f4-baed-45b9-856b-cda56ded1142","Type":"ContainerStarted","Data":"bc80c5efbdbb980d7abf1f8b43562f06927e8c3455fcd0f8603a260269c63139"} Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.275012 4756 generic.go:334] "Generic (PLEG): container finished" podID="bee23650-0d6d-495a-af65-7e818117b368" containerID="0c07de68969b170afeb462d0f1067ae5e922dca93fdb5eb23018f7ae6b73caef" exitCode=0 Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.275059 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" event={"ID":"bee23650-0d6d-495a-af65-7e818117b368","Type":"ContainerDied","Data":"0c07de68969b170afeb462d0f1067ae5e922dca93fdb5eb23018f7ae6b73caef"} Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.288955 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-468zf"] Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.431010 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bc5a-account-create-update-rvl5h"] Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.436916 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wj2j9"] Dec 05 16:36:30 crc 
Dec 05 16:36:30 crc kubenswrapper[4756]: I1205 16:36:30.445618 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-gmdc8"] Dec 05 16:36:30 crc kubenswrapper[4756]: W1205 16:36:30.448857 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15e16ba1_ba6e_42bd_a187_a22710026feb.slice/crio-90c08c6ffadef59a869736e80d0522030312b913543fa8446788899d3f426839 WatchSource:0}: Error finding container 90c08c6ffadef59a869736e80d0522030312b913543fa8446788899d3f426839: Status 404 returned error can't find the container with id 90c08c6ffadef59a869736e80d0522030312b913543fa8446788899d3f426839 Dec 05 16:36:31 crc kubenswrapper[4756]: I1205 16:36:31.449240 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-468zf" event={"ID":"7e7af33e-bc54-4dde-9b49-ea742c331563","Type":"ContainerStarted","Data":"705757f3f4449e493650518de1e2d0888446219219c37715bbb720555d759c65"} Dec 05 16:36:31 crc kubenswrapper[4756]: I1205 16:36:31.450792 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wj2j9" event={"ID":"68d72ac3-7f79-4644-9195-ebc6ed861edd","Type":"ContainerStarted","Data":"55dc5940803fff14bdad162b009fa3ded9b1ad6b7937bc1d105c62f05ccac24d"} Dec 05 16:36:31 crc kubenswrapper[4756]: I1205 16:36:31.452582 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gmdc8" event={"ID":"15e16ba1-ba6e-42bd-a187-a22710026feb","Type":"ContainerStarted","Data":"90c08c6ffadef59a869736e80d0522030312b913543fa8446788899d3f426839"} Dec 05 16:36:31 crc kubenswrapper[4756]: I1205 16:36:31.454693 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc5a-account-create-update-rvl5h" event={"ID":"3a485c78-6a69-4e0d-a3ab-0048b19ed955","Type":"ContainerStarted","Data":"74e51b67c3494c0382a0743156dca92c1c8bd70ead80019bf317e49afa85bfb4"} Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.464345 4756 generic.go:334] "Generic (PLEG): container finished" podID="7e7af33e-bc54-4dde-9b49-ea742c331563" containerID="622296c3958d5d7a5843eb02cf0aa27222ca463b3056cce2ed68e7255ddd8f4b" exitCode=0 Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.464461 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-468zf" event={"ID":"7e7af33e-bc54-4dde-9b49-ea742c331563","Type":"ContainerDied","Data":"622296c3958d5d7a5843eb02cf0aa27222ca463b3056cce2ed68e7255ddd8f4b"} Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.465799 4756 generic.go:334] "Generic (PLEG): container finished" podID="4319acb7-6d0a-4954-9d76-e698dde5536b" containerID="fb8ac0de6e9797d2428d2cbdde22697f86aebba8e11f73aecb75abd983f806ec" exitCode=0 Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.465851 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-csgm4" event={"ID":"4319acb7-6d0a-4954-9d76-e698dde5536b","Type":"ContainerDied","Data":"fb8ac0de6e9797d2428d2cbdde22697f86aebba8e11f73aecb75abd983f806ec"} Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.468891 4756 generic.go:334] "Generic (PLEG): container finished" podID="68d72ac3-7f79-4644-9195-ebc6ed861edd" containerID="bc3fc7814a08ae99eb6fd057cff3a1942e88941dafc3edd5ec0acce3c68e60f0" exitCode=0 Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.468995 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wj2j9"
event={"ID":"68d72ac3-7f79-4644-9195-ebc6ed861edd","Type":"ContainerDied","Data":"bc3fc7814a08ae99eb6fd057cff3a1942e88941dafc3edd5ec0acce3c68e60f0"} Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.471893 4756 generic.go:334] "Generic (PLEG): container finished" podID="3eb981f4-baed-45b9-856b-cda56ded1142" containerID="b2c09374299b8b9e5c8557a6143462bf42ba4b6ac57d54c21e967011eebb395e" exitCode=0 Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.471949 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c3ab-account-create-update-s6vq9" event={"ID":"3eb981f4-baed-45b9-856b-cda56ded1142","Type":"ContainerDied","Data":"b2c09374299b8b9e5c8557a6143462bf42ba4b6ac57d54c21e967011eebb395e"} Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.474365 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc5a-account-create-update-rvl5h" event={"ID":"3a485c78-6a69-4e0d-a3ab-0048b19ed955","Type":"ContainerStarted","Data":"b80c949a7e62ec7be4eed3cd8082a46205bea1efe05fb041a6260912f1b9e6aa"} Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.496981 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" event={"ID":"bee23650-0d6d-495a-af65-7e818117b368","Type":"ContainerStarted","Data":"3363871f09f0407a249583935183f2d2f082f2738b6a7c6ff9ff1414a1479a2b"} Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.497529 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.510668 4756 generic.go:334] "Generic (PLEG): container finished" podID="36fe1100-e673-4067-84f1-30bbfc2e4a0c" containerID="08a4ee5938edef2174f98c8d125df245ebb94504859012da9b9eee42e3708693" exitCode=0 Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.510721 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2230-account-create-update-lj8h4" event={"ID":"36fe1100-e673-4067-84f1-30bbfc2e4a0c","Type":"ContainerDied","Data":"08a4ee5938edef2174f98c8d125df245ebb94504859012da9b9eee42e3708693"} Dec 05 16:36:32 crc kubenswrapper[4756]: I1205 16:36:32.607922 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" podStartSLOduration=5.607904212 podStartE2EDuration="5.607904212s" podCreationTimestamp="2025-12-05 16:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:32.601248882 +0000 UTC m=+1252.839922394" watchObservedRunningTime="2025-12-05 16:36:32.607904212 +0000 UTC m=+1252.846577724" Dec 05 16:36:33 crc kubenswrapper[4756]: I1205 16:36:33.520693 4756 generic.go:334] "Generic (PLEG): container finished" podID="3a485c78-6a69-4e0d-a3ab-0048b19ed955" containerID="b80c949a7e62ec7be4eed3cd8082a46205bea1efe05fb041a6260912f1b9e6aa" exitCode=0 Dec 05 16:36:33 crc kubenswrapper[4756]: I1205 16:36:33.520737 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc5a-account-create-update-rvl5h" event={"ID":"3a485c78-6a69-4e0d-a3ab-0048b19ed955","Type":"ContainerDied","Data":"b80c949a7e62ec7be4eed3cd8082a46205bea1efe05fb041a6260912f1b9e6aa"} Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.553851 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c3ab-account-create-update-s6vq9" 
event={"ID":"3eb981f4-baed-45b9-856b-cda56ded1142","Type":"ContainerDied","Data":"bc80c5efbdbb980d7abf1f8b43562f06927e8c3455fcd0f8603a260269c63139"} Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.554159 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc80c5efbdbb980d7abf1f8b43562f06927e8c3455fcd0f8603a260269c63139" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.851862 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.884774 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.885189 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.923844 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-468zf" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.933377 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt6s6\" (UniqueName: \"kubernetes.io/projected/3eb981f4-baed-45b9-856b-cda56ded1142-kube-api-access-vt6s6\") pod \"3eb981f4-baed-45b9-856b-cda56ded1142\" (UID: \"3eb981f4-baed-45b9-856b-cda56ded1142\") " Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.933592 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmsk5\" (UniqueName: \"kubernetes.io/projected/68d72ac3-7f79-4644-9195-ebc6ed861edd-kube-api-access-wmsk5\") pod \"68d72ac3-7f79-4644-9195-ebc6ed861edd\" (UID: \"68d72ac3-7f79-4644-9195-ebc6ed861edd\") " Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.933656 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fe1100-e673-4067-84f1-30bbfc2e4a0c-operator-scripts\") pod \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\" (UID: \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\") " Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.933704 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52kpc\" (UniqueName: \"kubernetes.io/projected/36fe1100-e673-4067-84f1-30bbfc2e4a0c-kube-api-access-52kpc\") pod \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\" (UID: \"36fe1100-e673-4067-84f1-30bbfc2e4a0c\") " Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.933729 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d72ac3-7f79-4644-9195-ebc6ed861edd-operator-scripts\") pod \"68d72ac3-7f79-4644-9195-ebc6ed861edd\" (UID: \"68d72ac3-7f79-4644-9195-ebc6ed861edd\") " Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.933771 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eb981f4-baed-45b9-856b-cda56ded1142-operator-scripts\") pod \"3eb981f4-baed-45b9-856b-cda56ded1142\" (UID: \"3eb981f4-baed-45b9-856b-cda56ded1142\") " Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.934922 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36fe1100-e673-4067-84f1-30bbfc2e4a0c-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "36fe1100-e673-4067-84f1-30bbfc2e4a0c" (UID: "36fe1100-e673-4067-84f1-30bbfc2e4a0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.934936 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d72ac3-7f79-4644-9195-ebc6ed861edd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68d72ac3-7f79-4644-9195-ebc6ed861edd" (UID: "68d72ac3-7f79-4644-9195-ebc6ed861edd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.935172 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb981f4-baed-45b9-856b-cda56ded1142-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3eb981f4-baed-45b9-856b-cda56ded1142" (UID: "3eb981f4-baed-45b9-856b-cda56ded1142"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.937472 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.937910 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb981f4-baed-45b9-856b-cda56ded1142-kube-api-access-vt6s6" (OuterVolumeSpecName: "kube-api-access-vt6s6") pod "3eb981f4-baed-45b9-856b-cda56ded1142" (UID: "3eb981f4-baed-45b9-856b-cda56ded1142"). InnerVolumeSpecName "kube-api-access-vt6s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.942404 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fe1100-e673-4067-84f1-30bbfc2e4a0c-kube-api-access-52kpc" (OuterVolumeSpecName: "kube-api-access-52kpc") pod "36fe1100-e673-4067-84f1-30bbfc2e4a0c" (UID: "36fe1100-e673-4067-84f1-30bbfc2e4a0c"). InnerVolumeSpecName "kube-api-access-52kpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.943570 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d72ac3-7f79-4644-9195-ebc6ed861edd-kube-api-access-wmsk5" (OuterVolumeSpecName: "kube-api-access-wmsk5") pod "68d72ac3-7f79-4644-9195-ebc6ed861edd" (UID: "68d72ac3-7f79-4644-9195-ebc6ed861edd"). InnerVolumeSpecName "kube-api-access-wmsk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:36 crc kubenswrapper[4756]: I1205 16:36:36.955298 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.036457 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnb7j\" (UniqueName: \"kubernetes.io/projected/3a485c78-6a69-4e0d-a3ab-0048b19ed955-kube-api-access-lnb7j\") pod \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\" (UID: \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\") " Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.036911 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7af33e-bc54-4dde-9b49-ea742c331563-operator-scripts\") pod \"7e7af33e-bc54-4dde-9b49-ea742c331563\" (UID: \"7e7af33e-bc54-4dde-9b49-ea742c331563\") " Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.036940 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a485c78-6a69-4e0d-a3ab-0048b19ed955-operator-scripts\") pod \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\" (UID: \"3a485c78-6a69-4e0d-a3ab-0048b19ed955\") " Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.036972 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4319acb7-6d0a-4954-9d76-e698dde5536b-operator-scripts\") pod \"4319acb7-6d0a-4954-9d76-e698dde5536b\" (UID: \"4319acb7-6d0a-4954-9d76-e698dde5536b\") " Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037006 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5n5s\" (UniqueName: \"kubernetes.io/projected/7e7af33e-bc54-4dde-9b49-ea742c331563-kube-api-access-s5n5s\") pod \"7e7af33e-bc54-4dde-9b49-ea742c331563\" (UID: \"7e7af33e-bc54-4dde-9b49-ea742c331563\") " Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037113 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j488w\" (UniqueName: \"kubernetes.io/projected/4319acb7-6d0a-4954-9d76-e698dde5536b-kube-api-access-j488w\") pod \"4319acb7-6d0a-4954-9d76-e698dde5536b\" (UID: \"4319acb7-6d0a-4954-9d76-e698dde5536b\") " Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037726 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmsk5\" (UniqueName: \"kubernetes.io/projected/68d72ac3-7f79-4644-9195-ebc6ed861edd-kube-api-access-wmsk5\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037743 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36fe1100-e673-4067-84f1-30bbfc2e4a0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037752 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52kpc\" (UniqueName: \"kubernetes.io/projected/36fe1100-e673-4067-84f1-30bbfc2e4a0c-kube-api-access-52kpc\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037760 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d72ac3-7f79-4644-9195-ebc6ed861edd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037769 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3eb981f4-baed-45b9-856b-cda56ded1142-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037777 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt6s6\" (UniqueName: \"kubernetes.io/projected/3eb981f4-baed-45b9-856b-cda56ded1142-kube-api-access-vt6s6\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.037947 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a485c78-6a69-4e0d-a3ab-0048b19ed955-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a485c78-6a69-4e0d-a3ab-0048b19ed955" (UID: "3a485c78-6a69-4e0d-a3ab-0048b19ed955"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.038001 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4319acb7-6d0a-4954-9d76-e698dde5536b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4319acb7-6d0a-4954-9d76-e698dde5536b" (UID: "4319acb7-6d0a-4954-9d76-e698dde5536b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.038296 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e7af33e-bc54-4dde-9b49-ea742c331563-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e7af33e-bc54-4dde-9b49-ea742c331563" (UID: "7e7af33e-bc54-4dde-9b49-ea742c331563"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.041051 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a485c78-6a69-4e0d-a3ab-0048b19ed955-kube-api-access-lnb7j" (OuterVolumeSpecName: "kube-api-access-lnb7j") pod "3a485c78-6a69-4e0d-a3ab-0048b19ed955" (UID: "3a485c78-6a69-4e0d-a3ab-0048b19ed955"). InnerVolumeSpecName "kube-api-access-lnb7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.041837 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4319acb7-6d0a-4954-9d76-e698dde5536b-kube-api-access-j488w" (OuterVolumeSpecName: "kube-api-access-j488w") pod "4319acb7-6d0a-4954-9d76-e698dde5536b" (UID: "4319acb7-6d0a-4954-9d76-e698dde5536b"). InnerVolumeSpecName "kube-api-access-j488w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.042310 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7af33e-bc54-4dde-9b49-ea742c331563-kube-api-access-s5n5s" (OuterVolumeSpecName: "kube-api-access-s5n5s") pod "7e7af33e-bc54-4dde-9b49-ea742c331563" (UID: "7e7af33e-bc54-4dde-9b49-ea742c331563"). InnerVolumeSpecName "kube-api-access-s5n5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.139390 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4319acb7-6d0a-4954-9d76-e698dde5536b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.139494 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5n5s\" (UniqueName: \"kubernetes.io/projected/7e7af33e-bc54-4dde-9b49-ea742c331563-kube-api-access-s5n5s\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.139513 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j488w\" (UniqueName: \"kubernetes.io/projected/4319acb7-6d0a-4954-9d76-e698dde5536b-kube-api-access-j488w\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.139524 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnb7j\" (UniqueName: \"kubernetes.io/projected/3a485c78-6a69-4e0d-a3ab-0048b19ed955-kube-api-access-lnb7j\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.139536 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7af33e-bc54-4dde-9b49-ea742c331563-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.139547 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a485c78-6a69-4e0d-a3ab-0048b19ed955-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.562999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gmdc8" event={"ID":"15e16ba1-ba6e-42bd-a187-a22710026feb","Type":"ContainerStarted","Data":"f3c6cab0ce8d40ac7ad080640579b50975b38fa2de3b98eba3f1ed047e946240"} Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.567291 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc5a-account-create-update-rvl5h" event={"ID":"3a485c78-6a69-4e0d-a3ab-0048b19ed955","Type":"ContainerDied","Data":"74e51b67c3494c0382a0743156dca92c1c8bd70ead80019bf317e49afa85bfb4"} Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.567325 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74e51b67c3494c0382a0743156dca92c1c8bd70ead80019bf317e49afa85bfb4" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.567367 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc5a-account-create-update-rvl5h" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.569408 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2230-account-create-update-lj8h4" event={"ID":"36fe1100-e673-4067-84f1-30bbfc2e4a0c","Type":"ContainerDied","Data":"3e1fd6375062dfa7ed9ae97a7e72568ff7e2e9cca0e40022109bb9253638cd50"} Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.569452 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e1fd6375062dfa7ed9ae97a7e72568ff7e2e9cca0e40022109bb9253638cd50" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.569459 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2230-account-create-update-lj8h4" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.575514 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-468zf" event={"ID":"7e7af33e-bc54-4dde-9b49-ea742c331563","Type":"ContainerDied","Data":"705757f3f4449e493650518de1e2d0888446219219c37715bbb720555d759c65"} Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.575558 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="705757f3f4449e493650518de1e2d0888446219219c37715bbb720555d759c65" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.575626 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-468zf" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.579971 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-csgm4" event={"ID":"4319acb7-6d0a-4954-9d76-e698dde5536b","Type":"ContainerDied","Data":"a416214046fd5de1212ab7414da17367d6cc94f7b4188017cc377e1f51d19f6a"} Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.580017 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a416214046fd5de1212ab7414da17367d6cc94f7b4188017cc377e1f51d19f6a" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.580073 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-csgm4" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.582806 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c3ab-account-create-update-s6vq9" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.582859 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wj2j9" event={"ID":"68d72ac3-7f79-4644-9195-ebc6ed861edd","Type":"ContainerDied","Data":"55dc5940803fff14bdad162b009fa3ded9b1ad6b7937bc1d105c62f05ccac24d"} Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.582897 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55dc5940803fff14bdad162b009fa3ded9b1ad6b7937bc1d105c62f05ccac24d" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.583013 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wj2j9" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.593768 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-gmdc8" podStartSLOduration=3.398855621 podStartE2EDuration="8.593746048s" podCreationTimestamp="2025-12-05 16:36:29 +0000 UTC" firstStartedPulling="2025-12-05 16:36:31.432042733 +0000 UTC m=+1251.670716245" lastFinishedPulling="2025-12-05 16:36:36.62693316 +0000 UTC m=+1256.865606672" observedRunningTime="2025-12-05 16:36:37.578822203 +0000 UTC m=+1257.817495725" watchObservedRunningTime="2025-12-05 16:36:37.593746048 +0000 UTC m=+1257.832419580" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.845617 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.937253 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-z4k4k"] Dec 05 16:36:37 crc kubenswrapper[4756]: I1205 16:36:37.937628 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" podUID="2884268a-caae-472c-9f24-099e6f193487" containerName="dnsmasq-dns" containerID="cri-o://f77c26edd248b56bc6b66d0b3bb3c38bd10c3ea2fb1c683e21567cba882d54f9" gracePeriod=10 Dec 05 16:36:38 crc kubenswrapper[4756]: I1205 16:36:38.591724 4756 generic.go:334] "Generic (PLEG): container finished" podID="2884268a-caae-472c-9f24-099e6f193487" containerID="f77c26edd248b56bc6b66d0b3bb3c38bd10c3ea2fb1c683e21567cba882d54f9" exitCode=0 Dec 05 16:36:38 crc kubenswrapper[4756]: I1205 16:36:38.591813 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" event={"ID":"2884268a-caae-472c-9f24-099e6f193487","Type":"ContainerDied","Data":"f77c26edd248b56bc6b66d0b3bb3c38bd10c3ea2fb1c683e21567cba882d54f9"} Dec 05 16:36:39 crc kubenswrapper[4756]: I1205 16:36:39.946512 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.007838 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-sb\") pod \"2884268a-caae-472c-9f24-099e6f193487\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.008327 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-nb\") pod \"2884268a-caae-472c-9f24-099e6f193487\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.008391 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-dns-svc\") pod \"2884268a-caae-472c-9f24-099e6f193487\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.008471 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-config\") pod \"2884268a-caae-472c-9f24-099e6f193487\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.008572 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgzm7\" (UniqueName: \"kubernetes.io/projected/2884268a-caae-472c-9f24-099e6f193487-kube-api-access-zgzm7\") pod \"2884268a-caae-472c-9f24-099e6f193487\" (UID: \"2884268a-caae-472c-9f24-099e6f193487\") " Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.028776 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2884268a-caae-472c-9f24-099e6f193487-kube-api-access-zgzm7" (OuterVolumeSpecName: "kube-api-access-zgzm7") pod "2884268a-caae-472c-9f24-099e6f193487" (UID: "2884268a-caae-472c-9f24-099e6f193487"). InnerVolumeSpecName "kube-api-access-zgzm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.069739 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2884268a-caae-472c-9f24-099e6f193487" (UID: "2884268a-caae-472c-9f24-099e6f193487"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.070759 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-config" (OuterVolumeSpecName: "config") pod "2884268a-caae-472c-9f24-099e6f193487" (UID: "2884268a-caae-472c-9f24-099e6f193487"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.072281 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2884268a-caae-472c-9f24-099e6f193487" (UID: "2884268a-caae-472c-9f24-099e6f193487"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.090534 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2884268a-caae-472c-9f24-099e6f193487" (UID: "2884268a-caae-472c-9f24-099e6f193487"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.119786 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.119829 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgzm7\" (UniqueName: \"kubernetes.io/projected/2884268a-caae-472c-9f24-099e6f193487-kube-api-access-zgzm7\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.119845 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.119857 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.119868 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2884268a-caae-472c-9f24-099e6f193487-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.612982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" event={"ID":"2884268a-caae-472c-9f24-099e6f193487","Type":"ContainerDied","Data":"22355f499096d472ed616d7241e3128e628393df3d5bc465d5e1a92e49a942bf"} Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.613453 4756 scope.go:117] "RemoveContainer" containerID="f77c26edd248b56bc6b66d0b3bb3c38bd10c3ea2fb1c683e21567cba882d54f9" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.613147 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-z4k4k" Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.653582 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-z4k4k"] Dec 05 16:36:40 crc kubenswrapper[4756]: I1205 16:36:40.659470 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-z4k4k"] Dec 05 16:36:42 crc kubenswrapper[4756]: I1205 16:36:42.066599 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2884268a-caae-472c-9f24-099e6f193487" path="/var/lib/kubelet/pods/2884268a-caae-472c-9f24-099e6f193487/volumes" Dec 05 16:36:44 crc kubenswrapper[4756]: I1205 16:36:44.984506 4756 scope.go:117] "RemoveContainer" containerID="9d70e6f8daf390ddf48f89c18ba283193940ed236ad407dcdbf94b751d7e4773" Dec 05 16:36:46 crc kubenswrapper[4756]: I1205 16:36:46.667578 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mvn92" event={"ID":"dbf5eba8-7c13-43e6-9777-978152dff744","Type":"ContainerStarted","Data":"ec5c3ff55792fd4b79770c090cc741ae3724e644f600d45e694289adcf92843e"} Dec 05 16:36:48 crc kubenswrapper[4756]: I1205 16:36:48.685605 4756 generic.go:334] "Generic (PLEG): container finished" podID="15e16ba1-ba6e-42bd-a187-a22710026feb" containerID="f3c6cab0ce8d40ac7ad080640579b50975b38fa2de3b98eba3f1ed047e946240" exitCode=0 Dec 05 16:36:48 crc kubenswrapper[4756]: I1205 16:36:48.685721 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gmdc8" event={"ID":"15e16ba1-ba6e-42bd-a187-a22710026feb","Type":"ContainerDied","Data":"f3c6cab0ce8d40ac7ad080640579b50975b38fa2de3b98eba3f1ed047e946240"} Dec 05 16:36:48 crc kubenswrapper[4756]: I1205 16:36:48.710369 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mvn92" podStartSLOduration=3.6747413829999998 podStartE2EDuration="45.710352577s" podCreationTimestamp="2025-12-05 16:36:03 +0000 UTC" firstStartedPulling="2025-12-05 16:36:03.89264554 +0000 UTC m=+1224.131319052" lastFinishedPulling="2025-12-05 16:36:45.928256744 +0000 UTC m=+1266.166930246" observedRunningTime="2025-12-05 16:36:46.692494266 +0000 UTC m=+1266.931167788" watchObservedRunningTime="2025-12-05 16:36:48.710352577 +0000 UTC m=+1268.949026089" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:49.959837 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:49.992962 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-combined-ca-bundle\") pod \"15e16ba1-ba6e-42bd-a187-a22710026feb\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:49.993124 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-config-data\") pod \"15e16ba1-ba6e-42bd-a187-a22710026feb\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:49.993165 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p27z9\" (UniqueName: \"kubernetes.io/projected/15e16ba1-ba6e-42bd-a187-a22710026feb-kube-api-access-p27z9\") pod \"15e16ba1-ba6e-42bd-a187-a22710026feb\" (UID: \"15e16ba1-ba6e-42bd-a187-a22710026feb\") " Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:49.998005 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e16ba1-ba6e-42bd-a187-a22710026feb-kube-api-access-p27z9" (OuterVolumeSpecName: "kube-api-access-p27z9") pod "15e16ba1-ba6e-42bd-a187-a22710026feb" (UID: "15e16ba1-ba6e-42bd-a187-a22710026feb"). InnerVolumeSpecName "kube-api-access-p27z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:50.016202 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15e16ba1-ba6e-42bd-a187-a22710026feb" (UID: "15e16ba1-ba6e-42bd-a187-a22710026feb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:50.036143 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-config-data" (OuterVolumeSpecName: "config-data") pod "15e16ba1-ba6e-42bd-a187-a22710026feb" (UID: "15e16ba1-ba6e-42bd-a187-a22710026feb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:50.095464 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:50.095506 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p27z9\" (UniqueName: \"kubernetes.io/projected/15e16ba1-ba6e-42bd-a187-a22710026feb-kube-api-access-p27z9\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:50.095515 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e16ba1-ba6e-42bd-a187-a22710026feb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:50.702747 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gmdc8" event={"ID":"15e16ba1-ba6e-42bd-a187-a22710026feb","Type":"ContainerDied","Data":"90c08c6ffadef59a869736e80d0522030312b913543fa8446788899d3f426839"} Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:50.702781 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c08c6ffadef59a869736e80d0522030312b913543fa8446788899d3f426839" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:50.702841 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gmdc8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.006967 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ck9m7"] Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007688 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fe1100-e673-4067-84f1-30bbfc2e4a0c" containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007706 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fe1100-e673-4067-84f1-30bbfc2e4a0c" containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007725 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4319acb7-6d0a-4954-9d76-e698dde5536b" containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007733 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4319acb7-6d0a-4954-9d76-e698dde5536b" containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007746 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2884268a-caae-472c-9f24-099e6f193487" containerName="init" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007754 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2884268a-caae-472c-9f24-099e6f193487" containerName="init" Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007767 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7af33e-bc54-4dde-9b49-ea742c331563" containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007774 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7af33e-bc54-4dde-9b49-ea742c331563" containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007787 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d72ac3-7f79-4644-9195-ebc6ed861edd" 
containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007795 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d72ac3-7f79-4644-9195-ebc6ed861edd" containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007812 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2884268a-caae-472c-9f24-099e6f193487" containerName="dnsmasq-dns" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007819 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2884268a-caae-472c-9f24-099e6f193487" containerName="dnsmasq-dns" Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007830 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e16ba1-ba6e-42bd-a187-a22710026feb" containerName="keystone-db-sync" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007839 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e16ba1-ba6e-42bd-a187-a22710026feb" containerName="keystone-db-sync" Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007857 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb981f4-baed-45b9-856b-cda56ded1142" containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007865 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb981f4-baed-45b9-856b-cda56ded1142" containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: E1205 16:36:51.007884 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a485c78-6a69-4e0d-a3ab-0048b19ed955" containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.007892 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a485c78-6a69-4e0d-a3ab-0048b19ed955" containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008073 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d72ac3-7f79-4644-9195-ebc6ed861edd" containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008092 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fe1100-e673-4067-84f1-30bbfc2e4a0c" containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008107 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4319acb7-6d0a-4954-9d76-e698dde5536b" containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008121 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2884268a-caae-472c-9f24-099e6f193487" containerName="dnsmasq-dns" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008132 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e16ba1-ba6e-42bd-a187-a22710026feb" containerName="keystone-db-sync" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008145 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7af33e-bc54-4dde-9b49-ea742c331563" containerName="mariadb-database-create" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008158 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb981f4-baed-45b9-856b-cda56ded1142" containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008178 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a485c78-6a69-4e0d-a3ab-0048b19ed955" 
containerName="mariadb-account-create-update" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.008822 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.011087 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.011389 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.014242 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rdq64" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.025312 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.032996 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.033064 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78b9b4595f-mt442"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.040272 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.071562 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ck9m7"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.089142 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b9b4595f-mt442"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112616 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-scripts\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112674 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lzm7\" (UniqueName: \"kubernetes.io/projected/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-kube-api-access-8lzm7\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112703 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-config\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112737 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-sb\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112803 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-swift-storage-0\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112827 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pwrr\" (UniqueName: \"kubernetes.io/projected/692b055e-f83f-4ed3-99f5-d2a581edf3cb-kube-api-access-5pwrr\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112876 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-config-data\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112904 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-credential-keys\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112921 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-combined-ca-bundle\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112937 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-svc\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112973 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-fernet-keys\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.112990 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-nb\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219376 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-scripts\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219462 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8lzm7\" (UniqueName: \"kubernetes.io/projected/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-kube-api-access-8lzm7\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219496 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-config\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219557 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-sb\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219658 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-swift-storage-0\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219693 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pwrr\" (UniqueName: \"kubernetes.io/projected/692b055e-f83f-4ed3-99f5-d2a581edf3cb-kube-api-access-5pwrr\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219743 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-config-data\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219793 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-credential-keys\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219816 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-combined-ca-bundle\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219833 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-svc\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219896 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-fernet-keys\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.219914 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-nb\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.220801 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-nb\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.221590 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-sb\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.228042 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-scripts\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.236821 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-swift-storage-0\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.237370 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-config\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.240063 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-svc\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.242877 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-credential-keys\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.243001 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-combined-ca-bundle\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " 
pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.255567 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-fernet-keys\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.259134 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pwrr\" (UniqueName: \"kubernetes.io/projected/692b055e-f83f-4ed3-99f5-d2a581edf3cb-kube-api-access-5pwrr\") pod \"dnsmasq-dns-78b9b4595f-mt442\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.265084 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lzm7\" (UniqueName: \"kubernetes.io/projected/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-kube-api-access-8lzm7\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.266326 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-config-data\") pod \"keystone-bootstrap-ck9m7\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") " pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.327947 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.371220 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.382735 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4rtgn"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.384538 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.393644 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.393713 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.399201 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vg2jk" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.428485 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-config\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.428573 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxsg\" (UniqueName: \"kubernetes.io/projected/343931b4-805d-4457-a27f-7c680e29a475-kube-api-access-kmxsg\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.428627 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-combined-ca-bundle\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.429170 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4rtgn"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.501410 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-w47gl"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.509877 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.516686 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.520618 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ffbjc"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.521842 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.527883 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-k2qx5" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.528074 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.528218 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2kdjx" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.529814 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-scripts\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.529840 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-config-data\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.529861 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-db-sync-config-data\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.529898 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcjxl\" (UniqueName: \"kubernetes.io/projected/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-kube-api-access-bcjxl\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.529920 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-config\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.529940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxsg\" (UniqueName: \"kubernetes.io/projected/343931b4-805d-4457-a27f-7c680e29a475-kube-api-access-kmxsg\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.529956 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-combined-ca-bundle\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.529979 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-db-sync-config-data\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.530000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-combined-ca-bundle\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.530016 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-etc-machine-id\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.530033 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-combined-ca-bundle\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.530070 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcf8z\" (UniqueName: \"kubernetes.io/projected/9f0a57ed-b792-42fb-95df-8a88b426b3b7-kube-api-access-fcf8z\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.532866 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.559852 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-config\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.565270 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-combined-ca-bundle\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.597522 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-w47gl"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.601556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxsg\" (UniqueName: \"kubernetes.io/projected/343931b4-805d-4457-a27f-7c680e29a475-kube-api-access-kmxsg\") pod \"neutron-db-sync-4rtgn\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.626339 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ffbjc"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.632709 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-scripts\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.632754 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-config-data\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.632829 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-db-sync-config-data\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.632940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcjxl\" (UniqueName: \"kubernetes.io/projected/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-kube-api-access-bcjxl\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.632985 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-combined-ca-bundle\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.633031 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-db-sync-config-data\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.633071 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-etc-machine-id\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.633087 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-combined-ca-bundle\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.633171 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcf8z\" (UniqueName: \"kubernetes.io/projected/9f0a57ed-b792-42fb-95df-8a88b426b3b7-kube-api-access-fcf8z\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.637734 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-etc-machine-id\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " 
pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.643762 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-db-sync-config-data\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.643762 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-scripts\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.651119 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-combined-ca-bundle\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.652033 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-db-sync-config-data\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.658369 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcf8z\" (UniqueName: \"kubernetes.io/projected/9f0a57ed-b792-42fb-95df-8a88b426b3b7-kube-api-access-fcf8z\") pod \"barbican-db-sync-w47gl\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.661052 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-combined-ca-bundle\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.662092 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-config-data\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.664584 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-64jk8"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.670647 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcjxl\" (UniqueName: \"kubernetes.io/projected/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-kube-api-access-bcjxl\") pod \"cinder-db-sync-ffbjc\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.670708 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.684144 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gllhf" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.689305 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.693382 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.729931 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b9b4595f-mt442"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.736836 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-config-data\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.736943 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01800831-fa81-4f58-9da9-66dcbfe86580-logs\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.737088 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-scripts\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.737150 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-combined-ca-bundle\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.737193 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s76l\" (UniqueName: \"kubernetes.io/projected/01800831-fa81-4f58-9da9-66dcbfe86580-kube-api-access-6s76l\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.744611 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-64jk8"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.764353 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-794c6877f7-rhjmw"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.765930 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.775719 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794c6877f7-rhjmw"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.802252 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838446 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-scripts\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838520 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-config\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838551 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-combined-ca-bundle\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838578 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxjj\" (UniqueName: \"kubernetes.io/projected/0b2c1f93-ed28-4323-bb50-e8c143e01f51-kube-api-access-kdxjj\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838597 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s76l\" (UniqueName: \"kubernetes.io/projected/01800831-fa81-4f58-9da9-66dcbfe86580-kube-api-access-6s76l\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838627 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-config-data\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838687 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01800831-fa81-4f58-9da9-66dcbfe86580-logs\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838706 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-svc\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838747 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-sb\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 
16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838771 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-nb\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.838804 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-swift-storage-0\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.842990 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-scripts\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.843278 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01800831-fa81-4f58-9da9-66dcbfe86580-logs\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.848159 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-combined-ca-bundle\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.849002 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-config-data\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.861844 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s76l\" (UniqueName: \"kubernetes.io/projected/01800831-fa81-4f58-9da9-66dcbfe86580-kube-api-access-6s76l\") pod \"placement-db-sync-64jk8\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.920533 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-w47gl" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.939430 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-config\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.939484 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxjj\" (UniqueName: \"kubernetes.io/projected/0b2c1f93-ed28-4323-bb50-e8c143e01f51-kube-api-access-kdxjj\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.939540 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-svc\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.939578 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-sb\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.939601 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-nb\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.939640 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-swift-storage-0\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.940484 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-swift-storage-0\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.941660 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-config\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.942452 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-sb\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc 
kubenswrapper[4756]: I1205 16:36:51.942732 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-nb\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.943204 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-svc\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.946308 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b9b4595f-mt442"] Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.953187 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:36:51 crc kubenswrapper[4756]: I1205 16:36:51.961986 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxjj\" (UniqueName: \"kubernetes.io/projected/0b2c1f93-ed28-4323-bb50-e8c143e01f51-kube-api-access-kdxjj\") pod \"dnsmasq-dns-794c6877f7-rhjmw\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.015685 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ck9m7"] Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.022006 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-64jk8" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.090772 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.295575 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.295640 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.296006 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.296704 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"314160243b8e9a333f056208abaaefb43e7de34a3e41ec79e08a896638376f87"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.297170 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://314160243b8e9a333f056208abaaefb43e7de34a3e41ec79e08a896638376f87" gracePeriod=600 Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.335835 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4rtgn"] Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.511168 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ffbjc"] Dec 05 16:36:52 crc kubenswrapper[4756]: W1205 16:36:52.523399 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb35c2088_a11f_4514_8390_1a1f5ff0ccb5.slice/crio-e635cd23ae7f2d107556903658fe456129ac3c804641ccfbd935f9f01c996c52 WatchSource:0}: Error finding container e635cd23ae7f2d107556903658fe456129ac3c804641ccfbd935f9f01c996c52: Status 404 returned error can't find the container with id e635cd23ae7f2d107556903658fe456129ac3c804641ccfbd935f9f01c996c52 Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.564043 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.572451 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.578746 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.582932 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.583089 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.682020 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-w47gl"] Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.753506 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.753561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-log-httpd\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.753589 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5dz4\" (UniqueName: \"kubernetes.io/projected/d7550fa7-075b-42e9-be97-52e7ed4f552c-kube-api-access-k5dz4\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.753724 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-scripts\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.753767 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-run-httpd\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.753827 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-config-data\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.753867 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.783021 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4rtgn" 
event={"ID":"343931b4-805d-4457-a27f-7c680e29a475","Type":"ContainerStarted","Data":"7c27b557cb4a78e84801969e7ecd598a80c88010e59d4e2e8a5bcae978626017"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.783103 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4rtgn" event={"ID":"343931b4-805d-4457-a27f-7c680e29a475","Type":"ContainerStarted","Data":"4ca0b76afdbc174758110ef84fcb830144be6ec2acfc42019850a54b61c8b74e"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.785011 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ffbjc" event={"ID":"b35c2088-a11f-4514-8390-1a1f5ff0ccb5","Type":"ContainerStarted","Data":"e635cd23ae7f2d107556903658fe456129ac3c804641ccfbd935f9f01c996c52"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.786462 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-w47gl" event={"ID":"9f0a57ed-b792-42fb-95df-8a88b426b3b7","Type":"ContainerStarted","Data":"379f9a9b7c9d3e3dc2f1fea7a80b69d968ead94df695d2e07bebf99878880ce2"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.791911 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" event={"ID":"692b055e-f83f-4ed3-99f5-d2a581edf3cb","Type":"ContainerStarted","Data":"ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.791963 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" event={"ID":"692b055e-f83f-4ed3-99f5-d2a581edf3cb","Type":"ContainerStarted","Data":"4870ecbd390cea3b180f63f2431d7c5cf0f787317eede75170c6a8ef110c5b07"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.794070 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="314160243b8e9a333f056208abaaefb43e7de34a3e41ec79e08a896638376f87" exitCode=0 Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.794119 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"314160243b8e9a333f056208abaaefb43e7de34a3e41ec79e08a896638376f87"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.794138 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"4d4a53385c8aee53151ea619736690865a1fdc90406e455c625f67e894ce004c"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.794154 4756 scope.go:117] "RemoveContainer" containerID="e8a4fad4de22e09f52b10e225d79410b313d20ac3252ed14a260ce12e986a83e" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.806279 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ck9m7" event={"ID":"4b01b863-c788-45c9-8e5a-d26d0c5d8f26","Type":"ContainerStarted","Data":"935c0bc93eae0dc34c3b477543cb9ee593f85515f0bc2d07d1730561667fd233"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.806368 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ck9m7" event={"ID":"4b01b863-c788-45c9-8e5a-d26d0c5d8f26","Type":"ContainerStarted","Data":"640438d9e9828758ec46b3c31062d38bd7489d888675cbeeb6b10dfe194fe939"} Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.858723 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.858813 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-log-httpd\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.858857 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5dz4\" (UniqueName: \"kubernetes.io/projected/d7550fa7-075b-42e9-be97-52e7ed4f552c-kube-api-access-k5dz4\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.858911 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-scripts\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.858933 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-run-httpd\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.858976 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-config-data\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.859016 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.863398 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-log-httpd\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.865376 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-run-httpd\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.869560 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.869747 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-config-data\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.870202 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.883502 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-scripts\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.895833 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-64jk8"] Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.905575 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5dz4\" (UniqueName: \"kubernetes.io/projected/d7550fa7-075b-42e9-be97-52e7ed4f552c-kube-api-access-k5dz4\") pod \"ceilometer-0\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.917189 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ck9m7" podStartSLOduration=2.917163467 podStartE2EDuration="2.917163467s" podCreationTimestamp="2025-12-05 16:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:52.863187213 +0000 UTC m=+1273.101860735" watchObservedRunningTime="2025-12-05 16:36:52.917163467 +0000 UTC m=+1273.155836999" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.926088 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:36:52 crc kubenswrapper[4756]: I1205 16:36:52.950813 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794c6877f7-rhjmw"] Dec 05 16:36:53 crc kubenswrapper[4756]: I1205 16:36:53.573483 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:36:53 crc kubenswrapper[4756]: I1205 16:36:53.815521 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-64jk8" event={"ID":"01800831-fa81-4f58-9da9-66dcbfe86580","Type":"ContainerStarted","Data":"8ecdafeb73e64adf7ef8d5dd110f049e479909be680134ce99b5838c3026f9de"} Dec 05 16:36:53 crc kubenswrapper[4756]: I1205 16:36:53.816990 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7550fa7-075b-42e9-be97-52e7ed4f552c","Type":"ContainerStarted","Data":"f764ca580f7aae85592ece8b51ec3bcd365a571a4b3543bfe145060e1e87bb24"} Dec 05 16:36:53 crc kubenswrapper[4756]: I1205 16:36:53.818360 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" event={"ID":"0b2c1f93-ed28-4323-bb50-e8c143e01f51","Type":"ContainerStarted","Data":"959dc7bc41aae2c9f926443387c9bd7fa1f1cb9a6c0704ed7213840d20f0898b"} Dec 05 16:36:53 crc kubenswrapper[4756]: I1205 16:36:53.818495 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" podUID="692b055e-f83f-4ed3-99f5-d2a581edf3cb" containerName="init" containerID="cri-o://ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20" gracePeriod=10 Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.019736 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.814603 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.838623 4756 generic.go:334] "Generic (PLEG): container finished" podID="692b055e-f83f-4ed3-99f5-d2a581edf3cb" containerID="ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20" exitCode=0 Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.838704 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.838704 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" event={"ID":"692b055e-f83f-4ed3-99f5-d2a581edf3cb","Type":"ContainerDied","Data":"ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20"} Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.838819 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b9b4595f-mt442" event={"ID":"692b055e-f83f-4ed3-99f5-d2a581edf3cb","Type":"ContainerDied","Data":"4870ecbd390cea3b180f63f2431d7c5cf0f787317eede75170c6a8ef110c5b07"} Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.838842 4756 scope.go:117] "RemoveContainer" containerID="ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.840924 4756 generic.go:334] "Generic (PLEG): container finished" podID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerID="97ff50d7a20d09b9ee422d55325eb627b6df125b1520eb972aebd163564bccb6" exitCode=0 Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.840982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" event={"ID":"0b2c1f93-ed28-4323-bb50-e8c143e01f51","Type":"ContainerDied","Data":"97ff50d7a20d09b9ee422d55325eb627b6df125b1520eb972aebd163564bccb6"} Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.898551 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-sb\") pod \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.898656 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-nb\") pod \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.898764 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-config\") pod \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.899276 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-svc\") pod \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.899305 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pwrr\" (UniqueName: \"kubernetes.io/projected/692b055e-f83f-4ed3-99f5-d2a581edf3cb-kube-api-access-5pwrr\") pod \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\" (UID: \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.899345 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-swift-storage-0\") pod \"692b055e-f83f-4ed3-99f5-d2a581edf3cb\" (UID: 
\"692b055e-f83f-4ed3-99f5-d2a581edf3cb\") " Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.904307 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4rtgn" podStartSLOduration=3.904286284 podStartE2EDuration="3.904286284s" podCreationTimestamp="2025-12-05 16:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:54.899676249 +0000 UTC m=+1275.138349761" watchObservedRunningTime="2025-12-05 16:36:54.904286284 +0000 UTC m=+1275.142959796" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.912184 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692b055e-f83f-4ed3-99f5-d2a581edf3cb-kube-api-access-5pwrr" (OuterVolumeSpecName: "kube-api-access-5pwrr") pod "692b055e-f83f-4ed3-99f5-d2a581edf3cb" (UID: "692b055e-f83f-4ed3-99f5-d2a581edf3cb"). InnerVolumeSpecName "kube-api-access-5pwrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.930734 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "692b055e-f83f-4ed3-99f5-d2a581edf3cb" (UID: "692b055e-f83f-4ed3-99f5-d2a581edf3cb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.945782 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "692b055e-f83f-4ed3-99f5-d2a581edf3cb" (UID: "692b055e-f83f-4ed3-99f5-d2a581edf3cb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.960177 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "692b055e-f83f-4ed3-99f5-d2a581edf3cb" (UID: "692b055e-f83f-4ed3-99f5-d2a581edf3cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.972751 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "692b055e-f83f-4ed3-99f5-d2a581edf3cb" (UID: "692b055e-f83f-4ed3-99f5-d2a581edf3cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:54 crc kubenswrapper[4756]: I1205 16:36:54.974183 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-config" (OuterVolumeSpecName: "config") pod "692b055e-f83f-4ed3-99f5-d2a581edf3cb" (UID: "692b055e-f83f-4ed3-99f5-d2a581edf3cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.001201 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pwrr\" (UniqueName: \"kubernetes.io/projected/692b055e-f83f-4ed3-99f5-d2a581edf3cb-kube-api-access-5pwrr\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.001231 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.001242 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.001250 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.001258 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.001268 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692b055e-f83f-4ed3-99f5-d2a581edf3cb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.010095 4756 scope.go:117] "RemoveContainer" containerID="ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20" Dec 05 16:36:55 crc kubenswrapper[4756]: E1205 16:36:55.010991 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20\": container with ID starting with ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20 not found: ID does not exist" containerID="ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.011031 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20"} err="failed to get container status \"ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20\": rpc error: code = NotFound desc = could not find container \"ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20\": container with ID starting with ec168056589e87ee2fc3b35ae88a0cfcb60afbf27837794359ad73e5aaddeb20 not found: ID does not exist" Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.208788 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b9b4595f-mt442"] Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.220362 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78b9b4595f-mt442"] Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.865703 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" event={"ID":"0b2c1f93-ed28-4323-bb50-e8c143e01f51","Type":"ContainerStarted","Data":"fe04514fb7986a3917fe465e5c61beebbacdaf0ec3432b12272d4701aa5b6d79"} Dec 05 16:36:55 
Dec 05 16:36:55 crc kubenswrapper[4756]: I1205 16:36:55.866372 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw"
Dec 05 16:36:56 crc kubenswrapper[4756]: I1205 16:36:56.063461 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692b055e-f83f-4ed3-99f5-d2a581edf3cb" path="/var/lib/kubelet/pods/692b055e-f83f-4ed3-99f5-d2a581edf3cb/volumes"
Dec 05 16:36:58 crc kubenswrapper[4756]: I1205 16:36:58.895569 4756 generic.go:334] "Generic (PLEG): container finished" podID="4b01b863-c788-45c9-8e5a-d26d0c5d8f26" containerID="935c0bc93eae0dc34c3b477543cb9ee593f85515f0bc2d07d1730561667fd233" exitCode=0
Dec 05 16:36:58 crc kubenswrapper[4756]: I1205 16:36:58.895671 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ck9m7" event={"ID":"4b01b863-c788-45c9-8e5a-d26d0c5d8f26","Type":"ContainerDied","Data":"935c0bc93eae0dc34c3b477543cb9ee593f85515f0bc2d07d1730561667fd233"}
Dec 05 16:36:58 crc kubenswrapper[4756]: I1205 16:36:58.918918 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" podStartSLOduration=7.918899401 podStartE2EDuration="7.918899401s" podCreationTimestamp="2025-12-05 16:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:36:55.909769331 +0000 UTC m=+1276.148442843" watchObservedRunningTime="2025-12-05 16:36:58.918899401 +0000 UTC m=+1279.157572913"
Dec 05 16:37:02 crc kubenswrapper[4756]: I1205 16:37:02.093690 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw"
Dec 05 16:37:02 crc kubenswrapper[4756]: I1205 16:37:02.163825 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779c5847bc-x2p9h"]
Dec 05 16:37:02 crc kubenswrapper[4756]: I1205 16:37:02.164077 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="dnsmasq-dns" containerID="cri-o://3363871f09f0407a249583935183f2d2f082f2738b6a7c6ff9ff1414a1479a2b" gracePeriod=10
Dec 05 16:37:02 crc kubenswrapper[4756]: I1205 16:37:02.844969 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused"
Dec 05 16:37:03 crc kubenswrapper[4756]: I1205 16:37:03.942229 4756 generic.go:334] "Generic (PLEG): container finished" podID="bee23650-0d6d-495a-af65-7e818117b368" containerID="3363871f09f0407a249583935183f2d2f082f2738b6a7c6ff9ff1414a1479a2b" exitCode=0
Dec 05 16:37:03 crc kubenswrapper[4756]: I1205 16:37:03.942297 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" event={"ID":"bee23650-0d6d-495a-af65-7e818117b368","Type":"ContainerDied","Data":"3363871f09f0407a249583935183f2d2f082f2738b6a7c6ff9ff1414a1479a2b"}
Dec 05 16:37:07 crc kubenswrapper[4756]: I1205 16:37:07.844488 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused"
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.873291 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ck9m7"
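The "Probe failed" lines here are plain TCP readiness checks: the kubelet dials the pod IP and port from the probe spec and logs the dial error verbatim. "connect: connection refused" means the pod's network namespace is still up but nothing is listening any more (the dnsmasq container was just killed with a grace period); the "i/o timeout" variant that appears at 16:37:17 below means the dial got no answer at all. An equivalent standalone check, with the address taken from the log:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Same kind of check the kubelet runs for a tcpSocket readiness probe.
        conn, err := net.DialTimeout("tcp", "10.217.0.127:5353", 1*time.Second)
        if err != nil {
            fmt.Println("probe failure:", err) // "connection refused" or "i/o timeout"
            return
        }
        conn.Close()
        fmt.Println("probe success")
    }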
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.987164 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lzm7\" (UniqueName: \"kubernetes.io/projected/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-kube-api-access-8lzm7\") pod \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") "
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.987255 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-credential-keys\") pod \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") "
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.987300 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-fernet-keys\") pod \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") "
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.987367 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-config-data\") pod \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") "
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.987447 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-scripts\") pod \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") "
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.987473 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-combined-ca-bundle\") pod \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\" (UID: \"4b01b863-c788-45c9-8e5a-d26d0c5d8f26\") "
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.993739 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-kube-api-access-8lzm7" (OuterVolumeSpecName: "kube-api-access-8lzm7") pod "4b01b863-c788-45c9-8e5a-d26d0c5d8f26" (UID: "4b01b863-c788-45c9-8e5a-d26d0c5d8f26"). InnerVolumeSpecName "kube-api-access-8lzm7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.994328 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4b01b863-c788-45c9-8e5a-d26d0c5d8f26" (UID: "4b01b863-c788-45c9-8e5a-d26d0c5d8f26"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.995058 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4b01b863-c788-45c9-8e5a-d26d0c5d8f26" (UID: "4b01b863-c788-45c9-8e5a-d26d0c5d8f26"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:08 crc kubenswrapper[4756]: I1205 16:37:08.998060 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-scripts" (OuterVolumeSpecName: "scripts") pod "4b01b863-c788-45c9-8e5a-d26d0c5d8f26" (UID: "4b01b863-c788-45c9-8e5a-d26d0c5d8f26"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.001340 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ck9m7" event={"ID":"4b01b863-c788-45c9-8e5a-d26d0c5d8f26","Type":"ContainerDied","Data":"640438d9e9828758ec46b3c31062d38bd7489d888675cbeeb6b10dfe194fe939"} Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.001430 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ck9m7" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.001459 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="640438d9e9828758ec46b3c31062d38bd7489d888675cbeeb6b10dfe194fe939" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.018795 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b01b863-c788-45c9-8e5a-d26d0c5d8f26" (UID: "4b01b863-c788-45c9-8e5a-d26d0c5d8f26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.022954 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-config-data" (OuterVolumeSpecName: "config-data") pod "4b01b863-c788-45c9-8e5a-d26d0c5d8f26" (UID: "4b01b863-c788-45c9-8e5a-d26d0c5d8f26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.089336 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.089372 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.089385 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lzm7\" (UniqueName: \"kubernetes.io/projected/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-kube-api-access-8lzm7\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.089397 4756 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.089409 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:09 crc kubenswrapper[4756]: I1205 16:37:09.089441 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b01b863-c788-45c9-8e5a-d26d0c5d8f26-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.060032 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ck9m7"] Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.062048 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ck9m7"] Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.164990 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nrzfl"] Dec 05 16:37:10 crc kubenswrapper[4756]: E1205 16:37:10.165608 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692b055e-f83f-4ed3-99f5-d2a581edf3cb" containerName="init" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.165628 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="692b055e-f83f-4ed3-99f5-d2a581edf3cb" containerName="init" Dec 05 16:37:10 crc kubenswrapper[4756]: E1205 16:37:10.165674 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b01b863-c788-45c9-8e5a-d26d0c5d8f26" containerName="keystone-bootstrap" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.165681 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b01b863-c788-45c9-8e5a-d26d0c5d8f26" containerName="keystone-bootstrap" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.166141 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b01b863-c788-45c9-8e5a-d26d0c5d8f26" containerName="keystone-bootstrap" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.166160 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="692b055e-f83f-4ed3-99f5-d2a581edf3cb" containerName="init" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.166835 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.169065 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.169275 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.169470 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.169611 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rdq64" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.169717 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.173647 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nrzfl"] Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.312561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pcll\" (UniqueName: \"kubernetes.io/projected/6071e116-5a36-4d4f-800a-a2b52ef342c8-kube-api-access-8pcll\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.312604 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-credential-keys\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.312706 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-scripts\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.312765 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-fernet-keys\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.312976 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-combined-ca-bundle\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.313111 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-config-data\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.414518 4756 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-scripts\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.414587 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-fernet-keys\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.414720 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-combined-ca-bundle\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.414747 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-config-data\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.414829 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pcll\" (UniqueName: \"kubernetes.io/projected/6071e116-5a36-4d4f-800a-a2b52ef342c8-kube-api-access-8pcll\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.414849 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-credential-keys\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.419171 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-credential-keys\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.419607 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-scripts\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.420701 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-fernet-keys\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.422064 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-combined-ca-bundle\") pod \"keystone-bootstrap-nrzfl\" (UID: 
\"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.426650 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-config-data\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.438537 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pcll\" (UniqueName: \"kubernetes.io/projected/6071e116-5a36-4d4f-800a-a2b52ef342c8-kube-api-access-8pcll\") pod \"keystone-bootstrap-nrzfl\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:10 crc kubenswrapper[4756]: I1205 16:37:10.483425 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:11 crc kubenswrapper[4756]: I1205 16:37:11.024932 4756 generic.go:334] "Generic (PLEG): container finished" podID="dbf5eba8-7c13-43e6-9777-978152dff744" containerID="ec5c3ff55792fd4b79770c090cc741ae3724e644f600d45e694289adcf92843e" exitCode=0 Dec 05 16:37:11 crc kubenswrapper[4756]: I1205 16:37:11.024987 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mvn92" event={"ID":"dbf5eba8-7c13-43e6-9777-978152dff744","Type":"ContainerDied","Data":"ec5c3ff55792fd4b79770c090cc741ae3724e644f600d45e694289adcf92843e"} Dec 05 16:37:12 crc kubenswrapper[4756]: I1205 16:37:12.061708 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b01b863-c788-45c9-8e5a-d26d0c5d8f26" path="/var/lib/kubelet/pods/4b01b863-c788-45c9-8e5a-d26d0c5d8f26/volumes" Dec 05 16:37:17 crc kubenswrapper[4756]: I1205 16:37:17.843846 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Dec 05 16:37:17 crc kubenswrapper[4756]: I1205 16:37:17.845186 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:37:17 crc kubenswrapper[4756]: I1205 16:37:17.884064 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.066124 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-sb\") pod \"bee23650-0d6d-495a-af65-7e818117b368\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.066462 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-swift-storage-0\") pod \"bee23650-0d6d-495a-af65-7e818117b368\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.066507 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcdnw\" (UniqueName: \"kubernetes.io/projected/bee23650-0d6d-495a-af65-7e818117b368-kube-api-access-bcdnw\") pod \"bee23650-0d6d-495a-af65-7e818117b368\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.066561 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-nb\") pod \"bee23650-0d6d-495a-af65-7e818117b368\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.066589 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-svc\") pod \"bee23650-0d6d-495a-af65-7e818117b368\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.066717 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-config\") pod \"bee23650-0d6d-495a-af65-7e818117b368\" (UID: \"bee23650-0d6d-495a-af65-7e818117b368\") " Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.073964 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee23650-0d6d-495a-af65-7e818117b368-kube-api-access-bcdnw" (OuterVolumeSpecName: "kube-api-access-bcdnw") pod "bee23650-0d6d-495a-af65-7e818117b368" (UID: "bee23650-0d6d-495a-af65-7e818117b368"). InnerVolumeSpecName "kube-api-access-bcdnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.083306 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" event={"ID":"bee23650-0d6d-495a-af65-7e818117b368","Type":"ContainerDied","Data":"dab48f7c5441103ae23dd8f63c211847faa8a3d516fa2015e57049f5e971de1d"} Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.083363 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.083788 4756 scope.go:117] "RemoveContainer" containerID="3363871f09f0407a249583935183f2d2f082f2738b6a7c6ff9ff1414a1479a2b" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.116197 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bee23650-0d6d-495a-af65-7e818117b368" (UID: "bee23650-0d6d-495a-af65-7e818117b368"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.121885 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bee23650-0d6d-495a-af65-7e818117b368" (UID: "bee23650-0d6d-495a-af65-7e818117b368"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.122851 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bee23650-0d6d-495a-af65-7e818117b368" (UID: "bee23650-0d6d-495a-af65-7e818117b368"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.129036 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-config" (OuterVolumeSpecName: "config") pod "bee23650-0d6d-495a-af65-7e818117b368" (UID: "bee23650-0d6d-495a-af65-7e818117b368"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.133436 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bee23650-0d6d-495a-af65-7e818117b368" (UID: "bee23650-0d6d-495a-af65-7e818117b368"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.169068 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.169099 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.169110 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.169119 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcdnw\" (UniqueName: \"kubernetes.io/projected/bee23650-0d6d-495a-af65-7e818117b368-kube-api-access-bcdnw\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.169127 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.169136 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee23650-0d6d-495a-af65-7e818117b368-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.419093 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779c5847bc-x2p9h"] Dec 05 16:37:18 crc kubenswrapper[4756]: I1205 16:37:18.428124 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-779c5847bc-x2p9h"] Dec 05 16:37:19 crc kubenswrapper[4756]: E1205 16:37:19.049097 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2" Dec 05 16:37:19 crc kubenswrapper[4756]: E1205 16:37:19.049409 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcjxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ffbjc_openstack(b35c2088-a11f-4514-8390-1a1f5ff0ccb5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:37:19 crc kubenswrapper[4756]: E1205 16:37:19.050797 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ffbjc" podUID="b35c2088-a11f-4514-8390-1a1f5ff0ccb5" Dec 05 16:37:19 crc kubenswrapper[4756]: E1205 16:37:19.094992 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2\\\"\"" pod="openstack/cinder-db-sync-ffbjc" podUID="b35c2088-a11f-4514-8390-1a1f5ff0ccb5" Dec 05 16:37:19 crc kubenswrapper[4756]: E1205 16:37:19.391304 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:43a24796dabde68270dbfefa107205e173fdd6a0dc701502858cadbede69da31" Dec 05 16:37:19 crc 
kubenswrapper[4756]: E1205 16:37:19.391958 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:43a24796dabde68270dbfefa107205e173fdd6a0dc701502858cadbede69da31,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697hc8hb5h66fh6h5b5h59h57ch548h589h99h78hcfh8dh584h8fhbchch656hb6h654h87hf4h549h65h5cdh5dh698h5cbh56fh674h5fdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k5dz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(d7550fa7-075b-42e9-be97-52e7ed4f552c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.404872 4756 scope.go:117] "RemoveContainer" containerID="0c07de68969b170afeb462d0f1067ae5e922dca93fdb5eb23018f7ae6b73caef" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.616293 4756 util.go:48] "No ready sandbox for pod can be found. 
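The two "PullImage ... context canceled" failures above surface as ErrImagePull on the first sync attempt, and the very next sync of cinder-db-sync-ffbjc is rejected with ImagePullBackOff: instead of retrying immediately, the kubelet waits a growing delay between pull attempts. The sketch below shows the commonly documented doubling-with-a-cap shape of that backoff; the concrete start and cap values are illustrative assumptions, not values taken from this log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Doubling backoff with a cap, the usual shape of image pull
        // retries behind ImagePullBackOff. Values are assumptions.
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for attempt := 1; attempt <= 6; attempt++ {
            fmt.Printf("attempt %d: back off %v before the next pull\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }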
Need to start a new one" pod="openstack/glance-db-sync-mvn92" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.799063 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-db-sync-config-data\") pod \"dbf5eba8-7c13-43e6-9777-978152dff744\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.799180 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-config-data\") pod \"dbf5eba8-7c13-43e6-9777-978152dff744\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.799247 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m55h9\" (UniqueName: \"kubernetes.io/projected/dbf5eba8-7c13-43e6-9777-978152dff744-kube-api-access-m55h9\") pod \"dbf5eba8-7c13-43e6-9777-978152dff744\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.799315 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-combined-ca-bundle\") pod \"dbf5eba8-7c13-43e6-9777-978152dff744\" (UID: \"dbf5eba8-7c13-43e6-9777-978152dff744\") " Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.804137 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dbf5eba8-7c13-43e6-9777-978152dff744" (UID: "dbf5eba8-7c13-43e6-9777-978152dff744"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.807613 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf5eba8-7c13-43e6-9777-978152dff744-kube-api-access-m55h9" (OuterVolumeSpecName: "kube-api-access-m55h9") pod "dbf5eba8-7c13-43e6-9777-978152dff744" (UID: "dbf5eba8-7c13-43e6-9777-978152dff744"). InnerVolumeSpecName "kube-api-access-m55h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.846218 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbf5eba8-7c13-43e6-9777-978152dff744" (UID: "dbf5eba8-7c13-43e6-9777-978152dff744"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.858582 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-config-data" (OuterVolumeSpecName: "config-data") pod "dbf5eba8-7c13-43e6-9777-978152dff744" (UID: "dbf5eba8-7c13-43e6-9777-978152dff744"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.880503 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nrzfl"] Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.926391 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.927032 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.927054 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m55h9\" (UniqueName: \"kubernetes.io/projected/dbf5eba8-7c13-43e6-9777-978152dff744-kube-api-access-m55h9\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:19 crc kubenswrapper[4756]: I1205 16:37:19.927068 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf5eba8-7c13-43e6-9777-978152dff744-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.062014 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee23650-0d6d-495a-af65-7e818117b368" path="/var/lib/kubelet/pods/bee23650-0d6d-495a-af65-7e818117b368/volumes" Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.102917 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-w47gl" event={"ID":"9f0a57ed-b792-42fb-95df-8a88b426b3b7","Type":"ContainerStarted","Data":"03919cc24c81be7e09a5b0d4242745d52ccadb053c029fd639abe9d7ba68719a"} Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.105129 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nrzfl" event={"ID":"6071e116-5a36-4d4f-800a-a2b52ef342c8","Type":"ContainerStarted","Data":"2e051bfca4c328a46627298d5d4b9c579dfd75f24de4f9f86c2754e3d192c184"} Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.105197 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nrzfl" event={"ID":"6071e116-5a36-4d4f-800a-a2b52ef342c8","Type":"ContainerStarted","Data":"5d5304d178c426335c960a8cdb4a1d9d094d6a61161cf315273d341a662272a9"} Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.106972 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mvn92" event={"ID":"dbf5eba8-7c13-43e6-9777-978152dff744","Type":"ContainerDied","Data":"7852cd504a8ea5683d52125399c3ee70e331ae56ebb9e8c15f17434e55fcade5"} Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.107027 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7852cd504a8ea5683d52125399c3ee70e331ae56ebb9e8c15f17434e55fcade5" Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.106989 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mvn92" Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.111453 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-64jk8" event={"ID":"01800831-fa81-4f58-9da9-66dcbfe86580","Type":"ContainerStarted","Data":"871db04e55a28c57dd3290df7567c594c5863df0a7288c54f7126b691f5b411f"} Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.121298 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-w47gl" podStartSLOduration=2.3844362439999998 podStartE2EDuration="29.121278955s" podCreationTimestamp="2025-12-05 16:36:51 +0000 UTC" firstStartedPulling="2025-12-05 16:36:52.682267695 +0000 UTC m=+1272.920941207" lastFinishedPulling="2025-12-05 16:37:19.419110406 +0000 UTC m=+1299.657783918" observedRunningTime="2025-12-05 16:37:20.118830608 +0000 UTC m=+1300.357504120" watchObservedRunningTime="2025-12-05 16:37:20.121278955 +0000 UTC m=+1300.359952487" Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.138061 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-64jk8" podStartSLOduration=2.599861237 podStartE2EDuration="29.138039399s" podCreationTimestamp="2025-12-05 16:36:51 +0000 UTC" firstStartedPulling="2025-12-05 16:36:52.876565016 +0000 UTC m=+1273.115238528" lastFinishedPulling="2025-12-05 16:37:19.414743158 +0000 UTC m=+1299.653416690" observedRunningTime="2025-12-05 16:37:20.135211423 +0000 UTC m=+1300.373884955" watchObservedRunningTime="2025-12-05 16:37:20.138039399 +0000 UTC m=+1300.376712911" Dec 05 16:37:20 crc kubenswrapper[4756]: I1205 16:37:20.160579 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nrzfl" podStartSLOduration=10.1605382 podStartE2EDuration="10.1605382s" podCreationTimestamp="2025-12-05 16:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:20.154088035 +0000 UTC m=+1300.392761557" watchObservedRunningTime="2025-12-05 16:37:20.1605382 +0000 UTC m=+1300.399211712" Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.219440 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7884648fd9-9kzg6"] Dec 05 16:37:21 crc kubenswrapper[4756]: E1205 16:37:21.220884 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="dnsmasq-dns" Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.220950 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="dnsmasq-dns" Dec 05 16:37:21 crc kubenswrapper[4756]: E1205 16:37:21.221016 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="init" Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.221062 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="init" Dec 05 16:37:21 crc kubenswrapper[4756]: E1205 16:37:21.221120 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf5eba8-7c13-43e6-9777-978152dff744" containerName="glance-db-sync" Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.221182 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf5eba8-7c13-43e6-9777-978152dff744" containerName="glance-db-sync" Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.221463 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf5eba8-7c13-43e6-9777-978152dff744" containerName="glance-db-sync"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.222448 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.245161 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7884648fd9-9kzg6"]
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.357571 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-svc\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.357642 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cm9p\" (UniqueName: \"kubernetes.io/projected/b1cfcad6-0fd3-47ed-9d46-093d26580a35-kube-api-access-7cm9p\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.357673 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.357696 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-swift-storage-0\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.357725 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.357776 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-config\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.459791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cm9p\" (UniqueName: \"kubernetes.io/projected/b1cfcad6-0fd3-47ed-9d46-093d26580a35-kube-api-access-7cm9p\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.459848 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.459875 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-swift-storage-0\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.459904 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.459947 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-config\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.460016 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-svc\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.460950 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-swift-storage-0\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.460993 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.461258 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.461520 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-svc\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.461826 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-config\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6"
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-config\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.489323 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cm9p\" (UniqueName: \"kubernetes.io/projected/b1cfcad6-0fd3-47ed-9d46-093d26580a35-kube-api-access-7cm9p\") pod \"dnsmasq-dns-7884648fd9-9kzg6\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" Dec 05 16:37:21 crc kubenswrapper[4756]: I1205 16:37:21.540811 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" Dec 05 16:37:21 crc kubenswrapper[4756]: W1205 16:37:21.999213 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1cfcad6_0fd3_47ed_9d46_093d26580a35.slice/crio-6ac4d45ac530c7368e041bfaa15aeba1b3bcac6c4e169a5adb97873dfe45213f WatchSource:0}: Error finding container 6ac4d45ac530c7368e041bfaa15aeba1b3bcac6c4e169a5adb97873dfe45213f: Status 404 returned error can't find the container with id 6ac4d45ac530c7368e041bfaa15aeba1b3bcac6c4e169a5adb97873dfe45213f Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.010473 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7884648fd9-9kzg6"] Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.038743 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.040641 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.044602 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-r2jl8" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.044876 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.044917 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.068395 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.077273 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.077312 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.077373 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.077437 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.077522 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-logs\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.077588 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.077619 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtw8g\" (UniqueName: \"kubernetes.io/projected/4fa2f279-0add-448e-8e59-66555a041c99-kube-api-access-jtw8g\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " 
pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.131523 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" event={"ID":"b1cfcad6-0fd3-47ed-9d46-093d26580a35","Type":"ContainerStarted","Data":"6ac4d45ac530c7368e041bfaa15aeba1b3bcac6c4e169a5adb97873dfe45213f"} Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.132745 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7550fa7-075b-42e9-be97-52e7ed4f552c","Type":"ContainerStarted","Data":"bf81fbd405b146ccf77ccba5def407bc7fba16acdb849eec1f8de533e90ca946"} Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.178719 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.178783 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtw8g\" (UniqueName: \"kubernetes.io/projected/4fa2f279-0add-448e-8e59-66555a041c99-kube-api-access-jtw8g\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.178827 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.178852 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.178889 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.178922 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.179001 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-logs\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.179231 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.179442 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.179616 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-logs\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.182710 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.184340 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.192608 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.195135 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtw8g\" (UniqueName: \"kubernetes.io/projected/4fa2f279-0add-448e-8e59-66555a041c99-kube-api-access-jtw8g\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.201968 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.360809 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.366296 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.382402 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.400389 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.406462 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.486883 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.487525 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.487681 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgr6b\" (UniqueName: \"kubernetes.io/projected/c19f5ead-a871-4aab-885d-19a423a1ab5e-kube-api-access-dgr6b\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.487811 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.487913 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.487989 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.488069 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-logs\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.590191 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.590247 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.590286 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-logs\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.590308 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.590440 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.590477 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.590518 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgr6b\" (UniqueName: \"kubernetes.io/projected/c19f5ead-a871-4aab-885d-19a423a1ab5e-kube-api-access-dgr6b\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.590797 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.591215 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-logs\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.591479 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: 
I1205 16:37:22.598078 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.598641 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.608391 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.625691 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.628449 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgr6b\" (UniqueName: \"kubernetes.io/projected/c19f5ead-a871-4aab-885d-19a423a1ab5e-kube-api-access-dgr6b\") pod \"glance-default-internal-api-0\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.744135 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:22 crc kubenswrapper[4756]: I1205 16:37:22.845232 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-779c5847bc-x2p9h" podUID="bee23650-0d6d-495a-af65-7e818117b368" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Dec 05 16:37:23 crc kubenswrapper[4756]: I1205 16:37:23.352538 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:23 crc kubenswrapper[4756]: W1205 16:37:23.357825 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc19f5ead_a871_4aab_885d_19a423a1ab5e.slice/crio-7d6b454b7f1d45e28436b2a7fa23ff5b04fd1859c5313578502b03a6395ad3c7 WatchSource:0}: Error finding container 7d6b454b7f1d45e28436b2a7fa23ff5b04fd1859c5313578502b03a6395ad3c7: Status 404 returned error can't find the container with id 7d6b454b7f1d45e28436b2a7fa23ff5b04fd1859c5313578502b03a6395ad3c7 Dec 05 16:37:23 crc kubenswrapper[4756]: I1205 16:37:23.432776 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:23 crc kubenswrapper[4756]: W1205 16:37:23.441378 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fa2f279_0add_448e_8e59_66555a041c99.slice/crio-d9b6220e2c3ec96e6f77da3521bb1a359b4bd702aa84d891b083d2343f0f153e WatchSource:0}: Error finding container d9b6220e2c3ec96e6f77da3521bb1a359b4bd702aa84d891b083d2343f0f153e: Status 404 returned error can't find the container with id d9b6220e2c3ec96e6f77da3521bb1a359b4bd702aa84d891b083d2343f0f153e Dec 05 16:37:23 crc kubenswrapper[4756]: I1205 16:37:23.754692 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:23 crc kubenswrapper[4756]: I1205 16:37:23.858095 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:24 crc kubenswrapper[4756]: I1205 16:37:24.155126 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19f5ead-a871-4aab-885d-19a423a1ab5e","Type":"ContainerStarted","Data":"7d6b454b7f1d45e28436b2a7fa23ff5b04fd1859c5313578502b03a6395ad3c7"} Dec 05 16:37:24 crc kubenswrapper[4756]: I1205 16:37:24.159023 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" event={"ID":"b1cfcad6-0fd3-47ed-9d46-093d26580a35","Type":"ContainerStarted","Data":"b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3"} Dec 05 16:37:24 crc kubenswrapper[4756]: I1205 16:37:24.162281 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fa2f279-0add-448e-8e59-66555a041c99","Type":"ContainerStarted","Data":"d9b6220e2c3ec96e6f77da3521bb1a359b4bd702aa84d891b083d2343f0f153e"} Dec 05 16:37:28 crc kubenswrapper[4756]: I1205 16:37:28.195334 4756 generic.go:334] "Generic (PLEG): container finished" podID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" containerID="b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3" exitCode=0 Dec 05 16:37:28 crc kubenswrapper[4756]: I1205 16:37:28.329065 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" 
event={"ID":"b1cfcad6-0fd3-47ed-9d46-093d26580a35","Type":"ContainerDied","Data":"b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3"} Dec 05 16:37:30 crc kubenswrapper[4756]: I1205 16:37:30.221646 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fa2f279-0add-448e-8e59-66555a041c99","Type":"ContainerStarted","Data":"a73af2a4a51cf3cc30a8ba9c82f0692030878733b2d97c896ea685b515a60559"} Dec 05 16:37:30 crc kubenswrapper[4756]: I1205 16:37:30.224071 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19f5ead-a871-4aab-885d-19a423a1ab5e","Type":"ContainerStarted","Data":"f88bfa8d54450c4dd5111f49ad6a1976d0838dc22b10bb71fa0e60671a727adf"} Dec 05 16:37:30 crc kubenswrapper[4756]: I1205 16:37:30.226506 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" event={"ID":"b1cfcad6-0fd3-47ed-9d46-093d26580a35","Type":"ContainerStarted","Data":"8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35"} Dec 05 16:37:30 crc kubenswrapper[4756]: I1205 16:37:30.226706 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" Dec 05 16:37:30 crc kubenswrapper[4756]: I1205 16:37:30.248148 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" podStartSLOduration=9.248120743 podStartE2EDuration="9.248120743s" podCreationTimestamp="2025-12-05 16:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:30.244440763 +0000 UTC m=+1310.483114275" watchObservedRunningTime="2025-12-05 16:37:30.248120743 +0000 UTC m=+1310.486794255" Dec 05 16:37:31 crc kubenswrapper[4756]: I1205 16:37:31.248350 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fa2f279-0add-448e-8e59-66555a041c99","Type":"ContainerStarted","Data":"17291890af985389f331805af621c45fafd34a98aa6c95915a38f76d5aff3b77"} Dec 05 16:37:31 crc kubenswrapper[4756]: I1205 16:37:31.248401 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fa2f279-0add-448e-8e59-66555a041c99" containerName="glance-log" containerID="cri-o://a73af2a4a51cf3cc30a8ba9c82f0692030878733b2d97c896ea685b515a60559" gracePeriod=30 Dec 05 16:37:31 crc kubenswrapper[4756]: I1205 16:37:31.248532 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fa2f279-0add-448e-8e59-66555a041c99" containerName="glance-httpd" containerID="cri-o://17291890af985389f331805af621c45fafd34a98aa6c95915a38f76d5aff3b77" gracePeriod=30 Dec 05 16:37:31 crc kubenswrapper[4756]: I1205 16:37:31.258903 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19f5ead-a871-4aab-885d-19a423a1ab5e","Type":"ContainerStarted","Data":"36c8af0658c41bfba1f2a4bc1d0aea28e7e8c2879d906034b4ead3db3d7fd504"} Dec 05 16:37:31 crc kubenswrapper[4756]: I1205 16:37:31.259021 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerName="glance-log" containerID="cri-o://f88bfa8d54450c4dd5111f49ad6a1976d0838dc22b10bb71fa0e60671a727adf" gracePeriod=30 
Dec 05 16:37:31 crc kubenswrapper[4756]: I1205 16:37:31.259142 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerName="glance-httpd" containerID="cri-o://36c8af0658c41bfba1f2a4bc1d0aea28e7e8c2879d906034b4ead3db3d7fd504" gracePeriod=30 Dec 05 16:37:31 crc kubenswrapper[4756]: I1205 16:37:31.275648 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.275627227 podStartE2EDuration="11.275627227s" podCreationTimestamp="2025-12-05 16:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:31.271436863 +0000 UTC m=+1311.510110395" watchObservedRunningTime="2025-12-05 16:37:31.275627227 +0000 UTC m=+1311.514300739" Dec 05 16:37:31 crc kubenswrapper[4756]: I1205 16:37:31.309379 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.309361582 podStartE2EDuration="10.309361582s" podCreationTimestamp="2025-12-05 16:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:31.306037382 +0000 UTC m=+1311.544710904" watchObservedRunningTime="2025-12-05 16:37:31.309361582 +0000 UTC m=+1311.548035094" Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.273513 4756 generic.go:334] "Generic (PLEG): container finished" podID="4fa2f279-0add-448e-8e59-66555a041c99" containerID="17291890af985389f331805af621c45fafd34a98aa6c95915a38f76d5aff3b77" exitCode=0 Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.273887 4756 generic.go:334] "Generic (PLEG): container finished" podID="4fa2f279-0add-448e-8e59-66555a041c99" containerID="a73af2a4a51cf3cc30a8ba9c82f0692030878733b2d97c896ea685b515a60559" exitCode=143 Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.273644 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fa2f279-0add-448e-8e59-66555a041c99","Type":"ContainerDied","Data":"17291890af985389f331805af621c45fafd34a98aa6c95915a38f76d5aff3b77"} Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.273990 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fa2f279-0add-448e-8e59-66555a041c99","Type":"ContainerDied","Data":"a73af2a4a51cf3cc30a8ba9c82f0692030878733b2d97c896ea685b515a60559"} Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.276263 4756 generic.go:334] "Generic (PLEG): container finished" podID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerID="36c8af0658c41bfba1f2a4bc1d0aea28e7e8c2879d906034b4ead3db3d7fd504" exitCode=0 Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.276313 4756 generic.go:334] "Generic (PLEG): container finished" podID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerID="f88bfa8d54450c4dd5111f49ad6a1976d0838dc22b10bb71fa0e60671a727adf" exitCode=143 Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.276372 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19f5ead-a871-4aab-885d-19a423a1ab5e","Type":"ContainerDied","Data":"36c8af0658c41bfba1f2a4bc1d0aea28e7e8c2879d906034b4ead3db3d7fd504"} Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.276455 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19f5ead-a871-4aab-885d-19a423a1ab5e","Type":"ContainerDied","Data":"f88bfa8d54450c4dd5111f49ad6a1976d0838dc22b10bb71fa0e60671a727adf"} Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.277916 4756 generic.go:334] "Generic (PLEG): container finished" podID="6071e116-5a36-4d4f-800a-a2b52ef342c8" containerID="2e051bfca4c328a46627298d5d4b9c579dfd75f24de4f9f86c2754e3d192c184" exitCode=0 Dec 05 16:37:32 crc kubenswrapper[4756]: I1205 16:37:32.277954 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nrzfl" event={"ID":"6071e116-5a36-4d4f-800a-a2b52ef342c8","Type":"ContainerDied","Data":"2e051bfca4c328a46627298d5d4b9c579dfd75f24de4f9f86c2754e3d192c184"} Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.154616 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.160806 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225083 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-config-data\") pod \"6071e116-5a36-4d4f-800a-a2b52ef342c8\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225158 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-logs\") pod \"c19f5ead-a871-4aab-885d-19a423a1ab5e\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225217 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-fernet-keys\") pod \"6071e116-5a36-4d4f-800a-a2b52ef342c8\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225249 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-config-data\") pod \"c19f5ead-a871-4aab-885d-19a423a1ab5e\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225299 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"c19f5ead-a871-4aab-885d-19a423a1ab5e\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225343 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgr6b\" (UniqueName: \"kubernetes.io/projected/c19f5ead-a871-4aab-885d-19a423a1ab5e-kube-api-access-dgr6b\") pod \"c19f5ead-a871-4aab-885d-19a423a1ab5e\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225369 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-httpd-run\") pod \"c19f5ead-a871-4aab-885d-19a423a1ab5e\" (UID: 
\"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225393 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pcll\" (UniqueName: \"kubernetes.io/projected/6071e116-5a36-4d4f-800a-a2b52ef342c8-kube-api-access-8pcll\") pod \"6071e116-5a36-4d4f-800a-a2b52ef342c8\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225609 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-scripts\") pod \"c19f5ead-a871-4aab-885d-19a423a1ab5e\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225640 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-combined-ca-bundle\") pod \"c19f5ead-a871-4aab-885d-19a423a1ab5e\" (UID: \"c19f5ead-a871-4aab-885d-19a423a1ab5e\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225671 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-credential-keys\") pod \"6071e116-5a36-4d4f-800a-a2b52ef342c8\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225669 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-logs" (OuterVolumeSpecName: "logs") pod "c19f5ead-a871-4aab-885d-19a423a1ab5e" (UID: "c19f5ead-a871-4aab-885d-19a423a1ab5e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225744 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-combined-ca-bundle\") pod \"6071e116-5a36-4d4f-800a-a2b52ef342c8\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.225785 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-scripts\") pod \"6071e116-5a36-4d4f-800a-a2b52ef342c8\" (UID: \"6071e116-5a36-4d4f-800a-a2b52ef342c8\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.226436 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.231705 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6071e116-5a36-4d4f-800a-a2b52ef342c8" (UID: "6071e116-5a36-4d4f-800a-a2b52ef342c8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.231712 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "c19f5ead-a871-4aab-885d-19a423a1ab5e" (UID: "c19f5ead-a871-4aab-885d-19a423a1ab5e"). 
InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.233591 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6071e116-5a36-4d4f-800a-a2b52ef342c8-kube-api-access-8pcll" (OuterVolumeSpecName: "kube-api-access-8pcll") pod "6071e116-5a36-4d4f-800a-a2b52ef342c8" (UID: "6071e116-5a36-4d4f-800a-a2b52ef342c8"). InnerVolumeSpecName "kube-api-access-8pcll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.234917 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6071e116-5a36-4d4f-800a-a2b52ef342c8" (UID: "6071e116-5a36-4d4f-800a-a2b52ef342c8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.235214 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-scripts" (OuterVolumeSpecName: "scripts") pod "c19f5ead-a871-4aab-885d-19a423a1ab5e" (UID: "c19f5ead-a871-4aab-885d-19a423a1ab5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.236591 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c19f5ead-a871-4aab-885d-19a423a1ab5e" (UID: "c19f5ead-a871-4aab-885d-19a423a1ab5e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.240537 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19f5ead-a871-4aab-885d-19a423a1ab5e-kube-api-access-dgr6b" (OuterVolumeSpecName: "kube-api-access-dgr6b") pod "c19f5ead-a871-4aab-885d-19a423a1ab5e" (UID: "c19f5ead-a871-4aab-885d-19a423a1ab5e"). InnerVolumeSpecName "kube-api-access-dgr6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.258736 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-scripts" (OuterVolumeSpecName: "scripts") pod "6071e116-5a36-4d4f-800a-a2b52ef342c8" (UID: "6071e116-5a36-4d4f-800a-a2b52ef342c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.264615 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-config-data" (OuterVolumeSpecName: "config-data") pod "6071e116-5a36-4d4f-800a-a2b52ef342c8" (UID: "6071e116-5a36-4d4f-800a-a2b52ef342c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.264675 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6071e116-5a36-4d4f-800a-a2b52ef342c8" (UID: "6071e116-5a36-4d4f-800a-a2b52ef342c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.283165 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c19f5ead-a871-4aab-885d-19a423a1ab5e" (UID: "c19f5ead-a871-4aab-885d-19a423a1ab5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.314212 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c19f5ead-a871-4aab-885d-19a423a1ab5e","Type":"ContainerDied","Data":"7d6b454b7f1d45e28436b2a7fa23ff5b04fd1859c5313578502b03a6395ad3c7"} Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.314255 4756 scope.go:117] "RemoveContainer" containerID="36c8af0658c41bfba1f2a4bc1d0aea28e7e8c2879d906034b4ead3db3d7fd504" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.314357 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.321854 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-config-data" (OuterVolumeSpecName: "config-data") pod "c19f5ead-a871-4aab-885d-19a423a1ab5e" (UID: "c19f5ead-a871-4aab-885d-19a423a1ab5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.324609 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nrzfl" event={"ID":"6071e116-5a36-4d4f-800a-a2b52ef342c8","Type":"ContainerDied","Data":"5d5304d178c426335c960a8cdb4a1d9d094d6a61161cf315273d341a662272a9"} Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.324648 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d5304d178c426335c960a8cdb4a1d9d094d6a61161cf315273d341a662272a9" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.324709 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nrzfl" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328172 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328211 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328245 4756 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328258 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328270 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328283 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328296 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6071e116-5a36-4d4f-800a-a2b52ef342c8-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328328 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19f5ead-a871-4aab-885d-19a423a1ab5e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328376 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328389 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgr6b\" (UniqueName: \"kubernetes.io/projected/c19f5ead-a871-4aab-885d-19a423a1ab5e-kube-api-access-dgr6b\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328402 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19f5ead-a871-4aab-885d-19a423a1ab5e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.328450 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pcll\" (UniqueName: \"kubernetes.io/projected/6071e116-5a36-4d4f-800a-a2b52ef342c8-kube-api-access-8pcll\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.358840 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.407961 4756 scope.go:117] 
"RemoveContainer" containerID="f88bfa8d54450c4dd5111f49ad6a1976d0838dc22b10bb71fa0e60671a727adf" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.430341 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.481312 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.632996 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtw8g\" (UniqueName: \"kubernetes.io/projected/4fa2f279-0add-448e-8e59-66555a041c99-kube-api-access-jtw8g\") pod \"4fa2f279-0add-448e-8e59-66555a041c99\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.633135 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-config-data\") pod \"4fa2f279-0add-448e-8e59-66555a041c99\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.633268 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-combined-ca-bundle\") pod \"4fa2f279-0add-448e-8e59-66555a041c99\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.633335 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-scripts\") pod \"4fa2f279-0add-448e-8e59-66555a041c99\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.633369 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"4fa2f279-0add-448e-8e59-66555a041c99\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.633445 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-httpd-run\") pod \"4fa2f279-0add-448e-8e59-66555a041c99\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.633528 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-logs\") pod \"4fa2f279-0add-448e-8e59-66555a041c99\" (UID: \"4fa2f279-0add-448e-8e59-66555a041c99\") " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.633889 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4fa2f279-0add-448e-8e59-66555a041c99" (UID: "4fa2f279-0add-448e-8e59-66555a041c99"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.634215 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-logs" (OuterVolumeSpecName: "logs") pod "4fa2f279-0add-448e-8e59-66555a041c99" (UID: "4fa2f279-0add-448e-8e59-66555a041c99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.643622 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-scripts" (OuterVolumeSpecName: "scripts") pod "4fa2f279-0add-448e-8e59-66555a041c99" (UID: "4fa2f279-0add-448e-8e59-66555a041c99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.643809 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa2f279-0add-448e-8e59-66555a041c99-kube-api-access-jtw8g" (OuterVolumeSpecName: "kube-api-access-jtw8g") pod "4fa2f279-0add-448e-8e59-66555a041c99" (UID: "4fa2f279-0add-448e-8e59-66555a041c99"). InnerVolumeSpecName "kube-api-access-jtw8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.645702 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "4fa2f279-0add-448e-8e59-66555a041c99" (UID: "4fa2f279-0add-448e-8e59-66555a041c99"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.652762 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.674672 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fa2f279-0add-448e-8e59-66555a041c99" (UID: "4fa2f279-0add-448e-8e59-66555a041c99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.686086 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.696453 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:35 crc kubenswrapper[4756]: E1205 16:37:35.696790 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa2f279-0add-448e-8e59-66555a041c99" containerName="glance-log" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.696806 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa2f279-0add-448e-8e59-66555a041c99" containerName="glance-log" Dec 05 16:37:35 crc kubenswrapper[4756]: E1205 16:37:35.696816 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa2f279-0add-448e-8e59-66555a041c99" containerName="glance-httpd" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.696822 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa2f279-0add-448e-8e59-66555a041c99" containerName="glance-httpd" Dec 05 16:37:35 crc kubenswrapper[4756]: E1205 16:37:35.696838 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerName="glance-httpd" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.696844 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerName="glance-httpd" Dec 05 16:37:35 crc kubenswrapper[4756]: E1205 16:37:35.696857 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerName="glance-log" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.696862 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerName="glance-log" Dec 05 16:37:35 crc kubenswrapper[4756]: E1205 16:37:35.696876 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6071e116-5a36-4d4f-800a-a2b52ef342c8" containerName="keystone-bootstrap" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.696882 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6071e116-5a36-4d4f-800a-a2b52ef342c8" containerName="keystone-bootstrap" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.697101 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerName="glance-log" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.697115 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" containerName="glance-httpd" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.697128 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa2f279-0add-448e-8e59-66555a041c99" containerName="glance-httpd" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.697136 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa2f279-0add-448e-8e59-66555a041c99" containerName="glance-log" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.697147 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6071e116-5a36-4d4f-800a-a2b52ef342c8" containerName="keystone-bootstrap" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.698085 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.701403 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.703292 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.724614 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-config-data" (OuterVolumeSpecName: "config-data") pod "4fa2f279-0add-448e-8e59-66555a041c99" (UID: "4fa2f279-0add-448e-8e59-66555a041c99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.731788 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.735351 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.735389 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.735442 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.735456 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.735468 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fa2f279-0add-448e-8e59-66555a041c99-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.735510 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtw8g\" (UniqueName: \"kubernetes.io/projected/4fa2f279-0add-448e-8e59-66555a041c99-kube-api-access-jtw8g\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.735526 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa2f279-0add-448e-8e59-66555a041c99-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.755939 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.836595 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.836721 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.836771 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-logs\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.836802 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcvsp\" (UniqueName: \"kubernetes.io/projected/d20d841d-b516-42f8-9d69-bee6cab38d04-kube-api-access-mcvsp\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.836828 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.836977 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.837031 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.837353 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.837614 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.939359 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.939427 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-logs\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.939458 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcvsp\" (UniqueName: \"kubernetes.io/projected/d20d841d-b516-42f8-9d69-bee6cab38d04-kube-api-access-mcvsp\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.939478 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.939519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.939572 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.939626 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.939654 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.940087 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.940167 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-logs\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.940345 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"d20d841d-b516-42f8-9d69-bee6cab38d04\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.944082 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.944085 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.945249 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.950127 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.961282 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcvsp\" (UniqueName: \"kubernetes.io/projected/d20d841d-b516-42f8-9d69-bee6cab38d04-kube-api-access-mcvsp\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:35 crc kubenswrapper[4756]: I1205 16:37:35.976239 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.015882 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.077973 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c19f5ead-a871-4aab-885d-19a423a1ab5e" path="/var/lib/kubelet/pods/c19f5ead-a871-4aab-885d-19a423a1ab5e/volumes" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.314065 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-857c4c465c-j99gg"] Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.317485 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.320663 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.320819 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rdq64" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.320921 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.321169 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.321306 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.321395 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.356950 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7550fa7-075b-42e9-be97-52e7ed4f552c","Type":"ContainerStarted","Data":"1dc41eca912836854eb899bcf0eed311a49e7644a686a7350102564960f9d9db"} Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.361166 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-857c4c465c-j99gg"] Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.362263 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ffbjc" event={"ID":"b35c2088-a11f-4514-8390-1a1f5ff0ccb5","Type":"ContainerStarted","Data":"4da7d739364a7736ae954751f02c3f5f707192b39710f85ec3c47bc1b3a59586"} Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.369092 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fa2f279-0add-448e-8e59-66555a041c99","Type":"ContainerDied","Data":"d9b6220e2c3ec96e6f77da3521bb1a359b4bd702aa84d891b083d2343f0f153e"} Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.369142 4756 scope.go:117] "RemoveContainer" containerID="17291890af985389f331805af621c45fafd34a98aa6c95915a38f76d5aff3b77" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.369276 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.393985 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ffbjc" podStartSLOduration=2.640019067 podStartE2EDuration="45.393965526s" podCreationTimestamp="2025-12-05 16:36:51 +0000 UTC" firstStartedPulling="2025-12-05 16:36:52.527724003 +0000 UTC m=+1272.766397515" lastFinishedPulling="2025-12-05 16:37:35.281670462 +0000 UTC m=+1315.520343974" observedRunningTime="2025-12-05 16:37:36.386663318 +0000 UTC m=+1316.625336830" watchObservedRunningTime="2025-12-05 16:37:36.393965526 +0000 UTC m=+1316.632639038" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.418452 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.431544 4756 scope.go:117] "RemoveContainer" containerID="a73af2a4a51cf3cc30a8ba9c82f0692030878733b2d97c896ea685b515a60559" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.445163 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.450326 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-config-data\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.450439 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-internal-tls-certs\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.450496 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-scripts\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.450542 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-public-tls-certs\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.450580 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-combined-ca-bundle\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.450625 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vct9x\" (UniqueName: \"kubernetes.io/projected/87da04a1-7ee1-47e4-9bc7-1808ea506457-kube-api-access-vct9x\") pod \"keystone-857c4c465c-j99gg\" (UID: 
\"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.450643 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-fernet-keys\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.450681 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-credential-keys\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.471809 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.473228 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.473911 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.479763 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.482254 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.485981 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.542549 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.553302 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-combined-ca-bundle\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.553408 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vct9x\" (UniqueName: \"kubernetes.io/projected/87da04a1-7ee1-47e4-9bc7-1808ea506457-kube-api-access-vct9x\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.553465 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-fernet-keys\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.553517 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-credential-keys\") pod \"keystone-857c4c465c-j99gg\" (UID: 
\"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.553564 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-config-data\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.553636 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-internal-tls-certs\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.553712 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-scripts\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.553766 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-public-tls-certs\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.574435 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-scripts\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.574957 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-credential-keys\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.577148 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-combined-ca-bundle\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.577531 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-config-data\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.585076 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-fernet-keys\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.588625 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-public-tls-certs\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.598993 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vct9x\" (UniqueName: \"kubernetes.io/projected/87da04a1-7ee1-47e4-9bc7-1808ea506457-kube-api-access-vct9x\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.655439 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.655486 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6hr\" (UniqueName: \"kubernetes.io/projected/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-kube-api-access-lc6hr\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.655546 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.655587 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.655601 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.655634 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.655691 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 
16:37:36.655712 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-logs\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.668045 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794c6877f7-rhjmw"] Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.668276 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" podUID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerName="dnsmasq-dns" containerID="cri-o://fe04514fb7986a3917fe465e5c61beebbacdaf0ec3432b12272d4701aa5b6d79" gracePeriod=10 Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.700255 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-internal-tls-certs\") pod \"keystone-857c4c465c-j99gg\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.771462 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.771519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6hr\" (UniqueName: \"kubernetes.io/projected/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-kube-api-access-lc6hr\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.771584 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.771610 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.771627 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.771665 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 
05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.771739 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-logs\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.771755 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.772532 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.772752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-logs\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.773166 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.780959 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.782095 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.784799 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.787342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.816223 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lc6hr\" (UniqueName: \"kubernetes.io/projected/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-kube-api-access-lc6hr\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.878792 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " pod="openstack/glance-default-external-api-0" Dec 05 16:37:36 crc kubenswrapper[4756]: I1205 16:37:36.953191 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.101951 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.438962 4756 generic.go:334] "Generic (PLEG): container finished" podID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerID="fe04514fb7986a3917fe465e5c61beebbacdaf0ec3432b12272d4701aa5b6d79" exitCode=0 Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.439360 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" event={"ID":"0b2c1f93-ed28-4323-bb50-e8c143e01f51","Type":"ContainerDied","Data":"fe04514fb7986a3917fe465e5c61beebbacdaf0ec3432b12272d4701aa5b6d79"} Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.439396 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" event={"ID":"0b2c1f93-ed28-4323-bb50-e8c143e01f51","Type":"ContainerDied","Data":"959dc7bc41aae2c9f926443387c9bd7fa1f1cb9a6c0704ed7213840d20f0898b"} Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.439410 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959dc7bc41aae2c9f926443387c9bd7fa1f1cb9a6c0704ed7213840d20f0898b" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.442258 4756 generic.go:334] "Generic (PLEG): container finished" podID="01800831-fa81-4f58-9da9-66dcbfe86580" containerID="871db04e55a28c57dd3290df7567c594c5863df0a7288c54f7126b691f5b411f" exitCode=0 Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.442309 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-64jk8" event={"ID":"01800831-fa81-4f58-9da9-66dcbfe86580","Type":"ContainerDied","Data":"871db04e55a28c57dd3290df7567c594c5863df0a7288c54f7126b691f5b411f"} Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.451051 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d20d841d-b516-42f8-9d69-bee6cab38d04","Type":"ContainerStarted","Data":"d55ecbfd1516e2c5f14e5a447accc73007dfe4d73d0ba03239b133644c89115c"} Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.485193 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.591207 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdxjj\" (UniqueName: \"kubernetes.io/projected/0b2c1f93-ed28-4323-bb50-e8c143e01f51-kube-api-access-kdxjj\") pod \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.591540 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-svc\") pod \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.591652 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-sb\") pod \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.591672 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-nb\") pod \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.591780 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-config\") pod \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.591819 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-swift-storage-0\") pod \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\" (UID: \"0b2c1f93-ed28-4323-bb50-e8c143e01f51\") " Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.598955 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2c1f93-ed28-4323-bb50-e8c143e01f51-kube-api-access-kdxjj" (OuterVolumeSpecName: "kube-api-access-kdxjj") pod "0b2c1f93-ed28-4323-bb50-e8c143e01f51" (UID: "0b2c1f93-ed28-4323-bb50-e8c143e01f51"). InnerVolumeSpecName "kube-api-access-kdxjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.667551 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b2c1f93-ed28-4323-bb50-e8c143e01f51" (UID: "0b2c1f93-ed28-4323-bb50-e8c143e01f51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.667570 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0b2c1f93-ed28-4323-bb50-e8c143e01f51" (UID: "0b2c1f93-ed28-4323-bb50-e8c143e01f51"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.694345 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdxjj\" (UniqueName: \"kubernetes.io/projected/0b2c1f93-ed28-4323-bb50-e8c143e01f51-kube-api-access-kdxjj\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.694385 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.694397 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.696671 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0b2c1f93-ed28-4323-bb50-e8c143e01f51" (UID: "0b2c1f93-ed28-4323-bb50-e8c143e01f51"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.720456 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0b2c1f93-ed28-4323-bb50-e8c143e01f51" (UID: "0b2c1f93-ed28-4323-bb50-e8c143e01f51"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.725525 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-config" (OuterVolumeSpecName: "config") pod "0b2c1f93-ed28-4323-bb50-e8c143e01f51" (UID: "0b2c1f93-ed28-4323-bb50-e8c143e01f51"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.796237 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.796272 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.796287 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b2c1f93-ed28-4323-bb50-e8c143e01f51-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.926946 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:37:37 crc kubenswrapper[4756]: W1205 16:37:37.949567 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6f2b24c_528f_4a3f_b3d0_d85d5b64fa83.slice/crio-6654c1e4f1ab92c605ef4191e6cb7fb61ce7e619e3c649fe103e9c2293dc9956 WatchSource:0}: Error finding container 6654c1e4f1ab92c605ef4191e6cb7fb61ce7e619e3c649fe103e9c2293dc9956: Status 404 returned error can't find the container with id 6654c1e4f1ab92c605ef4191e6cb7fb61ce7e619e3c649fe103e9c2293dc9956 Dec 05 16:37:37 crc kubenswrapper[4756]: I1205 16:37:37.977254 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-857c4c465c-j99gg"] Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.099129 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa2f279-0add-448e-8e59-66555a041c99" path="/var/lib/kubelet/pods/4fa2f279-0add-448e-8e59-66555a041c99/volumes" Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.481489 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d20d841d-b516-42f8-9d69-bee6cab38d04","Type":"ContainerStarted","Data":"9f05ebf9f76d9de462b8e6c8165c8a59b53a64d17ed04decde1d2699deefc1d9"} Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.485406 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-857c4c465c-j99gg" event={"ID":"87da04a1-7ee1-47e4-9bc7-1808ea506457","Type":"ContainerStarted","Data":"84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed"} Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.485475 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-857c4c465c-j99gg" event={"ID":"87da04a1-7ee1-47e4-9bc7-1808ea506457","Type":"ContainerStarted","Data":"8c4bc2a41badc74bb7d0d18e5fdbc0ea44112e88cc1b30de0ce6e857c2a33ef3"} Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.486866 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.495576 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83","Type":"ContainerStarted","Data":"6654c1e4f1ab92c605ef4191e6cb7fb61ce7e619e3c649fe103e9c2293dc9956"} Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.499861 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="9f0a57ed-b792-42fb-95df-8a88b426b3b7" containerID="03919cc24c81be7e09a5b0d4242745d52ccadb053c029fd639abe9d7ba68719a" exitCode=0 Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.499953 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.500082 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-w47gl" event={"ID":"9f0a57ed-b792-42fb-95df-8a88b426b3b7","Type":"ContainerDied","Data":"03919cc24c81be7e09a5b0d4242745d52ccadb053c029fd639abe9d7ba68719a"} Dec 05 16:37:38 crc kubenswrapper[4756]: E1205 16:37:38.502735 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b2c1f93_ed28_4323_bb50_e8c143e01f51.slice\": RecentStats: unable to find data in memory cache]" Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.516676 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-857c4c465c-j99gg" podStartSLOduration=2.516654021 podStartE2EDuration="2.516654021s" podCreationTimestamp="2025-12-05 16:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:38.515068708 +0000 UTC m=+1318.753742220" watchObservedRunningTime="2025-12-05 16:37:38.516654021 +0000 UTC m=+1318.755327533" Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.561345 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794c6877f7-rhjmw"] Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.574490 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-794c6877f7-rhjmw"] Dec 05 16:37:38 crc kubenswrapper[4756]: I1205 16:37:38.929379 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-64jk8" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.024647 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-config-data\") pod \"01800831-fa81-4f58-9da9-66dcbfe86580\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.024772 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01800831-fa81-4f58-9da9-66dcbfe86580-logs\") pod \"01800831-fa81-4f58-9da9-66dcbfe86580\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.025342 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01800831-fa81-4f58-9da9-66dcbfe86580-logs" (OuterVolumeSpecName: "logs") pod "01800831-fa81-4f58-9da9-66dcbfe86580" (UID: "01800831-fa81-4f58-9da9-66dcbfe86580"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.025677 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-scripts\") pod \"01800831-fa81-4f58-9da9-66dcbfe86580\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.025767 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s76l\" (UniqueName: \"kubernetes.io/projected/01800831-fa81-4f58-9da9-66dcbfe86580-kube-api-access-6s76l\") pod \"01800831-fa81-4f58-9da9-66dcbfe86580\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.025809 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-combined-ca-bundle\") pod \"01800831-fa81-4f58-9da9-66dcbfe86580\" (UID: \"01800831-fa81-4f58-9da9-66dcbfe86580\") " Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.028321 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01800831-fa81-4f58-9da9-66dcbfe86580-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.033892 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01800831-fa81-4f58-9da9-66dcbfe86580-kube-api-access-6s76l" (OuterVolumeSpecName: "kube-api-access-6s76l") pod "01800831-fa81-4f58-9da9-66dcbfe86580" (UID: "01800831-fa81-4f58-9da9-66dcbfe86580"). InnerVolumeSpecName "kube-api-access-6s76l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.034266 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-scripts" (OuterVolumeSpecName: "scripts") pod "01800831-fa81-4f58-9da9-66dcbfe86580" (UID: "01800831-fa81-4f58-9da9-66dcbfe86580"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.059666 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01800831-fa81-4f58-9da9-66dcbfe86580" (UID: "01800831-fa81-4f58-9da9-66dcbfe86580"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.062898 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-config-data" (OuterVolumeSpecName: "config-data") pod "01800831-fa81-4f58-9da9-66dcbfe86580" (UID: "01800831-fa81-4f58-9da9-66dcbfe86580"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.129687 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.129730 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.129740 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s76l\" (UniqueName: \"kubernetes.io/projected/01800831-fa81-4f58-9da9-66dcbfe86580-kube-api-access-6s76l\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.129750 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01800831-fa81-4f58-9da9-66dcbfe86580-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.517969 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83","Type":"ContainerStarted","Data":"624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f"} Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.518244 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83","Type":"ContainerStarted","Data":"57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4"} Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.527818 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d20d841d-b516-42f8-9d69-bee6cab38d04","Type":"ContainerStarted","Data":"0a3f2fad23369eef6abdb198f5d8d248f85735284965758d4037155e9fa9a2ea"} Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.531223 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-64jk8" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.531220 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-64jk8" event={"ID":"01800831-fa81-4f58-9da9-66dcbfe86580","Type":"ContainerDied","Data":"8ecdafeb73e64adf7ef8d5dd110f049e479909be680134ce99b5838c3026f9de"} Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.531273 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ecdafeb73e64adf7ef8d5dd110f049e479909be680134ce99b5838c3026f9de" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.545035 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.545018427 podStartE2EDuration="3.545018427s" podCreationTimestamp="2025-12-05 16:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:39.538331246 +0000 UTC m=+1319.777004758" watchObservedRunningTime="2025-12-05 16:37:39.545018427 +0000 UTC m=+1319.783691939" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.574779 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.574753304 podStartE2EDuration="4.574753304s" podCreationTimestamp="2025-12-05 16:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:39.560537678 +0000 UTC m=+1319.799211210" watchObservedRunningTime="2025-12-05 16:37:39.574753304 +0000 UTC m=+1319.813426816" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.736578 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-668cbcb89b-dvw88"] Dec 05 16:37:39 crc kubenswrapper[4756]: E1205 16:37:39.737244 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01800831-fa81-4f58-9da9-66dcbfe86580" containerName="placement-db-sync" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.737352 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="01800831-fa81-4f58-9da9-66dcbfe86580" containerName="placement-db-sync" Dec 05 16:37:39 crc kubenswrapper[4756]: E1205 16:37:39.737442 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerName="dnsmasq-dns" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.737501 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerName="dnsmasq-dns" Dec 05 16:37:39 crc kubenswrapper[4756]: E1205 16:37:39.737566 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerName="init" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.737617 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerName="init" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.737856 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerName="dnsmasq-dns" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.737920 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="01800831-fa81-4f58-9da9-66dcbfe86580" containerName="placement-db-sync" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.738912 4756 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.748168 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gllhf" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.751797 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.754176 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.754600 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.754846 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.779921 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-668cbcb89b-dvw88"] Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.844498 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-combined-ca-bundle\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.844730 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84d62427-bd91-4330-be20-152bc3508447-logs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.844845 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-public-tls-certs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.844928 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-config-data\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.844997 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-scripts\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.845109 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrfjj\" (UniqueName: \"kubernetes.io/projected/84d62427-bd91-4330-be20-152bc3508447-kube-api-access-wrfjj\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 
16:37:39.845203 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-internal-tls-certs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.946390 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrfjj\" (UniqueName: \"kubernetes.io/projected/84d62427-bd91-4330-be20-152bc3508447-kube-api-access-wrfjj\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.946469 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-internal-tls-certs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.946494 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-combined-ca-bundle\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.946513 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84d62427-bd91-4330-be20-152bc3508447-logs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.946561 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-public-tls-certs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.946586 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-config-data\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.946604 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-scripts\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.947609 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84d62427-bd91-4330-be20-152bc3508447-logs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.956268 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-config-data\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.957563 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-internal-tls-certs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.968787 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-scripts\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.976178 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-public-tls-certs\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.979601 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrfjj\" (UniqueName: \"kubernetes.io/projected/84d62427-bd91-4330-be20-152bc3508447-kube-api-access-wrfjj\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:39 crc kubenswrapper[4756]: I1205 16:37:39.994839 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-combined-ca-bundle\") pod \"placement-668cbcb89b-dvw88\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.115232 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gllhf" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.121764 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" path="/var/lib/kubelet/pods/0b2c1f93-ed28-4323-bb50-e8c143e01f51/volumes" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.122755 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.214800 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-w47gl" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.373785 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-combined-ca-bundle\") pod \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.374124 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-db-sync-config-data\") pod \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.374175 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcf8z\" (UniqueName: \"kubernetes.io/projected/9f0a57ed-b792-42fb-95df-8a88b426b3b7-kube-api-access-fcf8z\") pod \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\" (UID: \"9f0a57ed-b792-42fb-95df-8a88b426b3b7\") " Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.390722 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0a57ed-b792-42fb-95df-8a88b426b3b7-kube-api-access-fcf8z" (OuterVolumeSpecName: "kube-api-access-fcf8z") pod "9f0a57ed-b792-42fb-95df-8a88b426b3b7" (UID: "9f0a57ed-b792-42fb-95df-8a88b426b3b7"). InnerVolumeSpecName "kube-api-access-fcf8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.409737 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9f0a57ed-b792-42fb-95df-8a88b426b3b7" (UID: "9f0a57ed-b792-42fb-95df-8a88b426b3b7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.412716 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f0a57ed-b792-42fb-95df-8a88b426b3b7" (UID: "9f0a57ed-b792-42fb-95df-8a88b426b3b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.476264 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.476308 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ed-b792-42fb-95df-8a88b426b3b7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.476322 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcf8z\" (UniqueName: \"kubernetes.io/projected/9f0a57ed-b792-42fb-95df-8a88b426b3b7-kube-api-access-fcf8z\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.550366 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-w47gl" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.550440 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-w47gl" event={"ID":"9f0a57ed-b792-42fb-95df-8a88b426b3b7","Type":"ContainerDied","Data":"379f9a9b7c9d3e3dc2f1fea7a80b69d968ead94df695d2e07bebf99878880ce2"} Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.550532 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="379f9a9b7c9d3e3dc2f1fea7a80b69d968ead94df695d2e07bebf99878880ce2" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.619150 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-668cbcb89b-dvw88"] Dec 05 16:37:40 crc kubenswrapper[4756]: W1205 16:37:40.623317 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d62427_bd91_4330_be20_152bc3508447.slice/crio-731818aad98a6a2a6899b20d4dca2d9aed160a17a7cdc3867438296fbe19a6a6 WatchSource:0}: Error finding container 731818aad98a6a2a6899b20d4dca2d9aed160a17a7cdc3867438296fbe19a6a6: Status 404 returned error can't find the container with id 731818aad98a6a2a6899b20d4dca2d9aed160a17a7cdc3867438296fbe19a6a6 Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.827346 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-9d6468757-b9ttg"] Dec 05 16:37:40 crc kubenswrapper[4756]: E1205 16:37:40.827746 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0a57ed-b792-42fb-95df-8a88b426b3b7" containerName="barbican-db-sync" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.827758 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0a57ed-b792-42fb-95df-8a88b426b3b7" containerName="barbican-db-sync" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.827952 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0a57ed-b792-42fb-95df-8a88b426b3b7" containerName="barbican-db-sync" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.828813 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.831862 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.832050 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2kdjx" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.838352 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.840164 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6fd4b854cb-rm9k8"] Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.841728 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.849619 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.868547 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-9d6468757-b9ttg"] Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.879190 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6fd4b854cb-rm9k8"] Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.896673 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0fc7cfb-9160-471d-8aec-457769d46602-logs\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.896717 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-combined-ca-bundle\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.896774 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67sp5\" (UniqueName: \"kubernetes.io/projected/f0fc7cfb-9160-471d-8aec-457769d46602-kube-api-access-67sp5\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.896810 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data-custom\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.896877 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.935861 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c64bcd78f-44lq8"] Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.937702 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.991001 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c64bcd78f-44lq8"] Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.997949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data-custom\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.998217 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13fa1e9e-f893-4641-a55f-d0c209d43917-logs\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.998333 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtmpg\" (UniqueName: \"kubernetes.io/projected/13fa1e9e-f893-4641-a55f-d0c209d43917-kube-api-access-wtmpg\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.998436 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.998518 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data-custom\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.998615 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.998745 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-combined-ca-bundle\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.998871 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0fc7cfb-9160-471d-8aec-457769d46602-logs\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " 
pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.998969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-combined-ca-bundle\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.999088 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67sp5\" (UniqueName: \"kubernetes.io/projected/f0fc7cfb-9160-471d-8aec-457769d46602-kube-api-access-67sp5\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:40 crc kubenswrapper[4756]: I1205 16:37:40.999910 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0fc7cfb-9160-471d-8aec-457769d46602-logs\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.003498 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data-custom\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.004854 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-combined-ca-bundle\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.005063 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.024960 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67sp5\" (UniqueName: \"kubernetes.io/projected/f0fc7cfb-9160-471d-8aec-457769d46602-kube-api-access-67sp5\") pod \"barbican-worker-9d6468757-b9ttg\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.094062 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-749876f46d-6zlh7"] Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.095496 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.102284 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-nb\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103483 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13fa1e9e-f893-4641-a55f-d0c209d43917-logs\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103519 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-svc\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103567 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtmpg\" (UniqueName: \"kubernetes.io/projected/13fa1e9e-f893-4641-a55f-d0c209d43917-kube-api-access-wtmpg\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103587 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-swift-storage-0\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103617 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103642 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data-custom\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj972\" (UniqueName: \"kubernetes.io/projected/ee4ccfb2-3d7a-4131-85cf-65e640617199-kube-api-access-fj972\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103751 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-config\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103774 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-combined-ca-bundle\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.103852 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-sb\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.105996 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13fa1e9e-f893-4641-a55f-d0c209d43917-logs\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.108489 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.108735 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data-custom\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.111013 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.112618 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-combined-ca-bundle\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.131990 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtmpg\" (UniqueName: \"kubernetes.io/projected/13fa1e9e-f893-4641-a55f-d0c209d43917-kube-api-access-wtmpg\") pod \"barbican-keystone-listener-6fd4b854cb-rm9k8\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.137737 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-749876f46d-6zlh7"] Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.154520 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.173033 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212268 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd0d557-c324-4da4-88e1-2073413d29de-logs\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212350 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-svc\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212433 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-swift-storage-0\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212490 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212537 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj972\" (UniqueName: \"kubernetes.io/projected/ee4ccfb2-3d7a-4131-85cf-65e640617199-kube-api-access-fj972\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212569 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-combined-ca-bundle\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212592 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8crd\" (UniqueName: \"kubernetes.io/projected/4bd0d557-c324-4da4-88e1-2073413d29de-kube-api-access-f8crd\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212620 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-config\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212693 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-sb\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212742 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data-custom\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.212798 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-nb\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.213410 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-svc\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.213643 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-nb\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.213807 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-swift-storage-0\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.214116 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-config\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.214184 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-sb\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.239110 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj972\" (UniqueName: \"kubernetes.io/projected/ee4ccfb2-3d7a-4131-85cf-65e640617199-kube-api-access-fj972\") pod \"dnsmasq-dns-c64bcd78f-44lq8\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") " pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.288387 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.314204 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd0d557-c324-4da4-88e1-2073413d29de-logs\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.314572 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.314619 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-combined-ca-bundle\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.314650 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8crd\" (UniqueName: \"kubernetes.io/projected/4bd0d557-c324-4da4-88e1-2073413d29de-kube-api-access-f8crd\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.314730 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data-custom\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.315536 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd0d557-c324-4da4-88e1-2073413d29de-logs\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.329653 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-combined-ca-bundle\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.335688 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data-custom\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.338733 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc 
kubenswrapper[4756]: I1205 16:37:41.360401 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8crd\" (UniqueName: \"kubernetes.io/projected/4bd0d557-c324-4da4-88e1-2073413d29de-kube-api-access-f8crd\") pod \"barbican-api-749876f46d-6zlh7\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.365021 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.568361 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668cbcb89b-dvw88" event={"ID":"84d62427-bd91-4330-be20-152bc3508447","Type":"ContainerStarted","Data":"a005af597ef26ea0b48f5743eb681caeba6662ee90634b36c5befe70f580e2a0"} Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.568700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668cbcb89b-dvw88" event={"ID":"84d62427-bd91-4330-be20-152bc3508447","Type":"ContainerStarted","Data":"a99c90cd22f1473b85f31a90f8f7912ef7f574ee4089fbd8fd4b2995f484d61c"} Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.568918 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668cbcb89b-dvw88" event={"ID":"84d62427-bd91-4330-be20-152bc3508447","Type":"ContainerStarted","Data":"731818aad98a6a2a6899b20d4dca2d9aed160a17a7cdc3867438296fbe19a6a6"} Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.569580 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.569629 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.597270 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-668cbcb89b-dvw88" podStartSLOduration=2.59725643 podStartE2EDuration="2.59725643s" podCreationTimestamp="2025-12-05 16:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:41.591924136 +0000 UTC m=+1321.830597648" watchObservedRunningTime="2025-12-05 16:37:41.59725643 +0000 UTC m=+1321.835929942" Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.791504 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-9d6468757-b9ttg"] Dec 05 16:37:41 crc kubenswrapper[4756]: I1205 16:37:41.840132 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6fd4b854cb-rm9k8"] Dec 05 16:37:42 crc kubenswrapper[4756]: I1205 16:37:42.092108 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-794c6877f7-rhjmw" podUID="0b2c1f93-ed28-4323-bb50-e8c143e01f51" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: i/o timeout" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.748435 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54c6d9b488-h9hmj"] Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.750291 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.752359 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.752536 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.769384 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54c6d9b488-h9hmj"] Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.872524 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2606acc0-f425-4595-8f5a-df1f009175ee-logs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.872598 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-internal-tls-certs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.872641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data-custom\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.872829 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-public-tls-certs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.872896 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.872963 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-combined-ca-bundle\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.873026 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhq9\" (UniqueName: \"kubernetes.io/projected/2606acc0-f425-4595-8f5a-df1f009175ee-kube-api-access-vxhq9\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.974811 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.974909 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-combined-ca-bundle\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.974958 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhq9\" (UniqueName: \"kubernetes.io/projected/2606acc0-f425-4595-8f5a-df1f009175ee-kube-api-access-vxhq9\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.974991 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2606acc0-f425-4595-8f5a-df1f009175ee-logs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.975021 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-internal-tls-certs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.975047 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data-custom\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.975089 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-public-tls-certs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.976510 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2606acc0-f425-4595-8f5a-df1f009175ee-logs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.982990 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-public-tls-certs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.983540 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.991144 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data-custom\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.992492 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-internal-tls-certs\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.994001 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-combined-ca-bundle\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:43 crc kubenswrapper[4756]: I1205 16:37:43.997725 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhq9\" (UniqueName: \"kubernetes.io/projected/2606acc0-f425-4595-8f5a-df1f009175ee-kube-api-access-vxhq9\") pod \"barbican-api-54c6d9b488-h9hmj\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:44 crc kubenswrapper[4756]: I1205 16:37:44.084097 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:46 crc kubenswrapper[4756]: I1205 16:37:46.016702 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:46 crc kubenswrapper[4756]: I1205 16:37:46.017147 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:46 crc kubenswrapper[4756]: I1205 16:37:46.077269 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:46 crc kubenswrapper[4756]: I1205 16:37:46.091059 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:46 crc kubenswrapper[4756]: I1205 16:37:46.615503 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:46 crc kubenswrapper[4756]: I1205 16:37:46.615580 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 16:37:47 crc kubenswrapper[4756]: I1205 16:37:47.102925 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 16:37:47 crc kubenswrapper[4756]: I1205 16:37:47.102977 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 16:37:47 crc kubenswrapper[4756]: I1205 16:37:47.142428 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 16:37:47 crc kubenswrapper[4756]: I1205 16:37:47.159661 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 16:37:47 crc kubenswrapper[4756]: W1205 16:37:47.274467 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0fc7cfb_9160_471d_8aec_457769d46602.slice/crio-db960e3dfedc47f213662aed65e9f25449ed394349e91d48c0df3cbb42c99bbd WatchSource:0}: Error finding container db960e3dfedc47f213662aed65e9f25449ed394349e91d48c0df3cbb42c99bbd: Status 404 returned error can't find the container with id db960e3dfedc47f213662aed65e9f25449ed394349e91d48c0df3cbb42c99bbd Dec 05 16:37:47 crc kubenswrapper[4756]: I1205 16:37:47.625530 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9d6468757-b9ttg" event={"ID":"f0fc7cfb-9160-471d-8aec-457769d46602","Type":"ContainerStarted","Data":"db960e3dfedc47f213662aed65e9f25449ed394349e91d48c0df3cbb42c99bbd"} Dec 05 16:37:47 crc kubenswrapper[4756]: I1205 16:37:47.625575 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 16:37:47 crc kubenswrapper[4756]: I1205 16:37:47.626453 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 16:37:47 crc kubenswrapper[4756]: W1205 16:37:47.740105 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13fa1e9e_f893_4641_a55f_d0c209d43917.slice/crio-066946f6bad4e2e1416b640fa9541ea8a8e415d35ee0bb8d2783c6616335c822 WatchSource:0}: Error finding container 066946f6bad4e2e1416b640fa9541ea8a8e415d35ee0bb8d2783c6616335c822: Status 404 returned 
Dec 05 16:37:48 crc kubenswrapper[4756]: E1205 16:37:48.240066 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c"
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.349366 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54c6d9b488-h9hmj"]
Dec 05 16:37:48 crc kubenswrapper[4756]: W1205 16:37:48.369977 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2606acc0_f425_4595_8f5a_df1f009175ee.slice/crio-9bf353f74a5196bea33d9de00268bbf27578984ef2ad72a57669263aca8c5dbb WatchSource:0}: Error finding container 9bf353f74a5196bea33d9de00268bbf27578984ef2ad72a57669263aca8c5dbb: Status 404 returned error can't find the container with id 9bf353f74a5196bea33d9de00268bbf27578984ef2ad72a57669263aca8c5dbb
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.474189 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-749876f46d-6zlh7"]
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.482125 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c64bcd78f-44lq8"]
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.681695 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" event={"ID":"13fa1e9e-f893-4641-a55f-d0c209d43917","Type":"ContainerStarted","Data":"066946f6bad4e2e1416b640fa9541ea8a8e415d35ee0bb8d2783c6616335c822"}
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.694558 4756 generic.go:334] "Generic (PLEG): container finished" podID="b35c2088-a11f-4514-8390-1a1f5ff0ccb5" containerID="4da7d739364a7736ae954751f02c3f5f707192b39710f85ec3c47bc1b3a59586" exitCode=0
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.694645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ffbjc" event={"ID":"b35c2088-a11f-4514-8390-1a1f5ff0ccb5","Type":"ContainerDied","Data":"4da7d739364a7736ae954751f02c3f5f707192b39710f85ec3c47bc1b3a59586"}
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.701191 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54c6d9b488-h9hmj" event={"ID":"2606acc0-f425-4595-8f5a-df1f009175ee","Type":"ContainerStarted","Data":"9bf353f74a5196bea33d9de00268bbf27578984ef2ad72a57669263aca8c5dbb"}
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.717549 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="ceilometer-notification-agent" containerID="cri-o://bf81fbd405b146ccf77ccba5def407bc7fba16acdb849eec1f8de533e90ca946" gracePeriod=30
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.717926 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7550fa7-075b-42e9-be97-52e7ed4f552c","Type":"ContainerStarted","Data":"e27ba8d07ae00908888de0ce90498b3d2a9dc789a238952ce19112b10a011ac0"}
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.717984 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.718033 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="proxy-httpd" containerID="cri-o://e27ba8d07ae00908888de0ce90498b3d2a9dc789a238952ce19112b10a011ac0" gracePeriod=30
Dec 05 16:37:48 crc kubenswrapper[4756]: I1205 16:37:48.718145 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="sg-core" containerID="cri-o://1dc41eca912836854eb899bcf0eed311a49e7644a686a7350102564960f9d9db" gracePeriod=30
Dec 05 16:37:48 crc kubenswrapper[4756]: W1205 16:37:48.958611 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bd0d557_c324_4da4_88e1_2073413d29de.slice/crio-f0cd005521a828a97c8124ea9e0692e3c85e99bb0e304b2579dbdb10c8958018 WatchSource:0}: Error finding container f0cd005521a828a97c8124ea9e0692e3c85e99bb0e304b2579dbdb10c8958018: Status 404 returned error can't find the container with id f0cd005521a828a97c8124ea9e0692e3c85e99bb0e304b2579dbdb10c8958018
Dec 05 16:37:48 crc kubenswrapper[4756]: E1205 16:37:48.973213 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7550fa7_075b_42e9_be97_52e7ed4f552c.slice/crio-1dc41eca912836854eb899bcf0eed311a49e7644a686a7350102564960f9d9db.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.001199 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.001297 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.149053 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.727220 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9d6468757-b9ttg" event={"ID":"f0fc7cfb-9160-471d-8aec-457769d46602","Type":"ContainerStarted","Data":"b4e4e8a043bf4f88e1edd5b6dd41710a7149c49c31bb40ac6714307c37009357"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.727277 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9d6468757-b9ttg" event={"ID":"f0fc7cfb-9160-471d-8aec-457769d46602","Type":"ContainerStarted","Data":"481bb590f2466b1ec99a35fcf296aa1fc3174e0cc5b933c289f817ad96130df5"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.731518 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54c6d9b488-h9hmj" event={"ID":"2606acc0-f425-4595-8f5a-df1f009175ee","Type":"ContainerStarted","Data":"bfa23ea411e7c12b52777bbf724439990757136fd7d85b535a806320b41f695b"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.731900 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54c6d9b488-h9hmj" event={"ID":"2606acc0-f425-4595-8f5a-df1f009175ee","Type":"ContainerStarted","Data":"25d37ab5592969027f78161ad8c13da0276bef9030f7a390f77c1e2d2d113f8c"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.731925 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54c6d9b488-h9hmj"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.731939 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54c6d9b488-h9hmj"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.735287 4756 generic.go:334] "Generic (PLEG): container finished" podID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerID="e27ba8d07ae00908888de0ce90498b3d2a9dc789a238952ce19112b10a011ac0" exitCode=0
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.735331 4756 generic.go:334] "Generic (PLEG): container finished" podID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerID="1dc41eca912836854eb899bcf0eed311a49e7644a686a7350102564960f9d9db" exitCode=2
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.735390 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7550fa7-075b-42e9-be97-52e7ed4f552c","Type":"ContainerDied","Data":"e27ba8d07ae00908888de0ce90498b3d2a9dc789a238952ce19112b10a011ac0"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.735461 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7550fa7-075b-42e9-be97-52e7ed4f552c","Type":"ContainerDied","Data":"1dc41eca912836854eb899bcf0eed311a49e7644a686a7350102564960f9d9db"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.742708 4756 generic.go:334] "Generic (PLEG): container finished" podID="ee4ccfb2-3d7a-4131-85cf-65e640617199" containerID="2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f" exitCode=0
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.742762 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" event={"ID":"ee4ccfb2-3d7a-4131-85cf-65e640617199","Type":"ContainerDied","Data":"2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.742823 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" event={"ID":"ee4ccfb2-3d7a-4131-85cf-65e640617199","Type":"ContainerStarted","Data":"b9ba8b80997b8085403d78a792fd795686aaedb45e5a6399f90693fbee3b5999"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.744932 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-749876f46d-6zlh7" event={"ID":"4bd0d557-c324-4da4-88e1-2073413d29de","Type":"ContainerStarted","Data":"dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.744979 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-749876f46d-6zlh7" event={"ID":"4bd0d557-c324-4da4-88e1-2073413d29de","Type":"ContainerStarted","Data":"88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.744997 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-749876f46d-6zlh7" event={"ID":"4bd0d557-c324-4da4-88e1-2073413d29de","Type":"ContainerStarted","Data":"f0cd005521a828a97c8124ea9e0692e3c85e99bb0e304b2579dbdb10c8958018"}
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.745138 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.745156 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.752689 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-9d6468757-b9ttg" podStartSLOduration=8.021496766 podStartE2EDuration="9.752673249s" podCreationTimestamp="2025-12-05 16:37:40 +0000 UTC" firstStartedPulling="2025-12-05 16:37:47.276212568 +0000 UTC m=+1327.514886080" lastFinishedPulling="2025-12-05 16:37:49.007389051 +0000 UTC m=+1329.246062563" observedRunningTime="2025-12-05 16:37:49.746548133 +0000 UTC m=+1329.985221645" watchObservedRunningTime="2025-12-05 16:37:49.752673249 +0000 UTC m=+1329.991346761"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.769024 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-749876f46d-6zlh7" podStartSLOduration=8.768987022 podStartE2EDuration="8.768987022s" podCreationTimestamp="2025-12-05 16:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:49.765182039 +0000 UTC m=+1330.003855551" watchObservedRunningTime="2025-12-05 16:37:49.768987022 +0000 UTC m=+1330.007660534"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.793208 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54c6d9b488-h9hmj" podStartSLOduration=6.793188388 podStartE2EDuration="6.793188388s" podCreationTimestamp="2025-12-05 16:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:49.791008949 +0000 UTC m=+1330.029682461" watchObservedRunningTime="2025-12-05 16:37:49.793188388 +0000 UTC m=+1330.031861910"
Dec 05 16:37:49 crc kubenswrapper[4756]: I1205 16:37:49.996218 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.004588 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.525846 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ffbjc"
Need to start a new one" pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.637941 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcjxl\" (UniqueName: \"kubernetes.io/projected/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-kube-api-access-bcjxl\") pod \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.638073 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-combined-ca-bundle\") pod \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.638128 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-scripts\") pod \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.638147 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-etc-machine-id\") pod \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.638208 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-config-data\") pod \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.638227 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-db-sync-config-data\") pod \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\" (UID: \"b35c2088-a11f-4514-8390-1a1f5ff0ccb5\") " Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.638483 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b35c2088-a11f-4514-8390-1a1f5ff0ccb5" (UID: "b35c2088-a11f-4514-8390-1a1f5ff0ccb5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.638845 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.647558 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b35c2088-a11f-4514-8390-1a1f5ff0ccb5" (UID: "b35c2088-a11f-4514-8390-1a1f5ff0ccb5"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.649317 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-kube-api-access-bcjxl" (OuterVolumeSpecName: "kube-api-access-bcjxl") pod "b35c2088-a11f-4514-8390-1a1f5ff0ccb5" (UID: "b35c2088-a11f-4514-8390-1a1f5ff0ccb5"). InnerVolumeSpecName "kube-api-access-bcjxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.702709 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-scripts" (OuterVolumeSpecName: "scripts") pod "b35c2088-a11f-4514-8390-1a1f5ff0ccb5" (UID: "b35c2088-a11f-4514-8390-1a1f5ff0ccb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.731601 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b35c2088-a11f-4514-8390-1a1f5ff0ccb5" (UID: "b35c2088-a11f-4514-8390-1a1f5ff0ccb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.741725 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcjxl\" (UniqueName: \"kubernetes.io/projected/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-kube-api-access-bcjxl\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.741829 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.741845 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.741861 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.755810 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-config-data" (OuterVolumeSpecName: "config-data") pod "b35c2088-a11f-4514-8390-1a1f5ff0ccb5" (UID: "b35c2088-a11f-4514-8390-1a1f5ff0ccb5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.761947 4756 generic.go:334] "Generic (PLEG): container finished" podID="343931b4-805d-4457-a27f-7c680e29a475" containerID="7c27b557cb4a78e84801969e7ecd598a80c88010e59d4e2e8a5bcae978626017" exitCode=0 Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.762026 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4rtgn" event={"ID":"343931b4-805d-4457-a27f-7c680e29a475","Type":"ContainerDied","Data":"7c27b557cb4a78e84801969e7ecd598a80c88010e59d4e2e8a5bcae978626017"} Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.764658 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ffbjc" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.767738 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ffbjc" event={"ID":"b35c2088-a11f-4514-8390-1a1f5ff0ccb5","Type":"ContainerDied","Data":"e635cd23ae7f2d107556903658fe456129ac3c804641ccfbd935f9f01c996c52"} Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.767780 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e635cd23ae7f2d107556903658fe456129ac3c804641ccfbd935f9f01c996c52" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.767859 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.768743 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:37:50 crc kubenswrapper[4756]: I1205 16:37:50.843365 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35c2088-a11f-4514-8390-1a1f5ff0ccb5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.020116 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:37:51 crc kubenswrapper[4756]: E1205 16:37:51.020573 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35c2088-a11f-4514-8390-1a1f5ff0ccb5" containerName="cinder-db-sync" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.020587 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35c2088-a11f-4514-8390-1a1f5ff0ccb5" containerName="cinder-db-sync" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.020811 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b35c2088-a11f-4514-8390-1a1f5ff0ccb5" containerName="cinder-db-sync" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.021766 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.035054 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.035789 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.036022 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-k2qx5" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.037704 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.051038 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.097082 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c64bcd78f-44lq8"] Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.140619 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64bc447477-8cfhg"] Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.142006 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.160731 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.160835 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-scripts\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.160882 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.160944 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wtk\" (UniqueName: \"kubernetes.io/projected/99744cdb-b38c-4576-925d-3b8bb6404a01-kube-api-access-j2wtk\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.161049 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99744cdb-b38c-4576-925d-3b8bb6404a01-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.161092 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.175543 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bc447477-8cfhg"] Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265091 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265234 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-scripts\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265480 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-swift-storage-0\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-sb\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265530 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265589 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-svc\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265620 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wtk\" (UniqueName: \"kubernetes.io/projected/99744cdb-b38c-4576-925d-3b8bb6404a01-kube-api-access-j2wtk\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265635 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmrmm\" (UniqueName: \"kubernetes.io/projected/e0186554-9d53-4bef-874e-6cef6df1a0a5-kube-api-access-zmrmm\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265673 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-config\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.265744 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-nb\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.266090 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99744cdb-b38c-4576-925d-3b8bb6404a01-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.266119 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.268092 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99744cdb-b38c-4576-925d-3b8bb6404a01-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.270855 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.270945 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-scripts\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.271678 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.271936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.289177 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wtk\" (UniqueName: \"kubernetes.io/projected/99744cdb-b38c-4576-925d-3b8bb6404a01-kube-api-access-j2wtk\") pod \"cinder-scheduler-0\" 
(UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.338594 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.339942 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.346262 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.367339 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-swift-storage-0\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.367383 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-sb\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.367436 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-svc\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.367458 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrmm\" (UniqueName: \"kubernetes.io/projected/e0186554-9d53-4bef-874e-6cef6df1a0a5-kube-api-access-zmrmm\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.367483 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-config\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.367516 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-nb\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.368343 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-nb\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.368864 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-swift-storage-0\") pod 
\"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.369220 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-svc\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.369620 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-config\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.369758 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-sb\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.372269 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.405859 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmrmm\" (UniqueName: \"kubernetes.io/projected/e0186554-9d53-4bef-874e-6cef6df1a0a5-kube-api-access-zmrmm\") pod \"dnsmasq-dns-64bc447477-8cfhg\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.472444 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.473584 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-scripts\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.473642 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cbf5b74-1863-441f-bdf3-570f64a7f50e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.473677 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbf5b74-1863-441f-bdf3-570f64a7f50e-logs\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.473693 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.473726 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.473745 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.473776 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kndn\" (UniqueName: \"kubernetes.io/projected/1cbf5b74-1863-441f-bdf3-570f64a7f50e-kube-api-access-6kndn\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.479774 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.581341 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-scripts\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.581408 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cbf5b74-1863-441f-bdf3-570f64a7f50e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.581453 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbf5b74-1863-441f-bdf3-570f64a7f50e-logs\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.581469 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.581498 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.581519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.581553 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kndn\" (UniqueName: 
\"kubernetes.io/projected/1cbf5b74-1863-441f-bdf3-570f64a7f50e-kube-api-access-6kndn\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.592276 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-scripts\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.593499 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cbf5b74-1863-441f-bdf3-570f64a7f50e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.597830 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.598394 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbf5b74-1863-441f-bdf3-570f64a7f50e-logs\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.602768 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.612268 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kndn\" (UniqueName: \"kubernetes.io/projected/1cbf5b74-1863-441f-bdf3-570f64a7f50e-kube-api-access-6kndn\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.621814 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " pod="openstack/cinder-api-0" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.785509 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" event={"ID":"ee4ccfb2-3d7a-4131-85cf-65e640617199","Type":"ContainerStarted","Data":"1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117"} Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.786285 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.821923 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" podStartSLOduration=11.821905082 podStartE2EDuration="11.821905082s" podCreationTimestamp="2025-12-05 16:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:51.81295904 
+0000 UTC m=+1332.051632572" watchObservedRunningTime="2025-12-05 16:37:51.821905082 +0000 UTC m=+1332.060578604" Dec 05 16:37:51 crc kubenswrapper[4756]: I1205 16:37:51.861061 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.013746 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.162907 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bc447477-8cfhg"] Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.166970 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.301496 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxsg\" (UniqueName: \"kubernetes.io/projected/343931b4-805d-4457-a27f-7c680e29a475-kube-api-access-kmxsg\") pod \"343931b4-805d-4457-a27f-7c680e29a475\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.301595 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-combined-ca-bundle\") pod \"343931b4-805d-4457-a27f-7c680e29a475\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.301749 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-config\") pod \"343931b4-805d-4457-a27f-7c680e29a475\" (UID: \"343931b4-805d-4457-a27f-7c680e29a475\") " Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.307624 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343931b4-805d-4457-a27f-7c680e29a475-kube-api-access-kmxsg" (OuterVolumeSpecName: "kube-api-access-kmxsg") pod "343931b4-805d-4457-a27f-7c680e29a475" (UID: "343931b4-805d-4457-a27f-7c680e29a475"). InnerVolumeSpecName "kube-api-access-kmxsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.339641 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "343931b4-805d-4457-a27f-7c680e29a475" (UID: "343931b4-805d-4457-a27f-7c680e29a475"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.349479 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-config" (OuterVolumeSpecName: "config") pod "343931b4-805d-4457-a27f-7c680e29a475" (UID: "343931b4-805d-4457-a27f-7c680e29a475"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.403956 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.404273 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/343931b4-805d-4457-a27f-7c680e29a475-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.404285 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmxsg\" (UniqueName: \"kubernetes.io/projected/343931b4-805d-4457-a27f-7c680e29a475-kube-api-access-kmxsg\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:52 crc kubenswrapper[4756]: W1205 16:37:52.653581 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cbf5b74_1863_441f_bdf3_570f64a7f50e.slice/crio-e33851172f487b5ced7fa81c63d174dd4b49b12754e02f6e0b2c7c17c35fe870 WatchSource:0}: Error finding container e33851172f487b5ced7fa81c63d174dd4b49b12754e02f6e0b2c7c17c35fe870: Status 404 returned error can't find the container with id e33851172f487b5ced7fa81c63d174dd4b49b12754e02f6e0b2c7c17c35fe870 Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.656039 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.861283 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"99744cdb-b38c-4576-925d-3b8bb6404a01","Type":"ContainerStarted","Data":"d9e37c563c058287ef14b53a7b9bf0cc71f0f8f9b11bcd7cd7c664d74e180efe"} Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.867938 4756 generic.go:334] "Generic (PLEG): container finished" podID="e0186554-9d53-4bef-874e-6cef6df1a0a5" containerID="ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13" exitCode=0 Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.868014 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" event={"ID":"e0186554-9d53-4bef-874e-6cef6df1a0a5","Type":"ContainerDied","Data":"ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13"} Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.868051 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" event={"ID":"e0186554-9d53-4bef-874e-6cef6df1a0a5","Type":"ContainerStarted","Data":"9078e464723eed70428cb589a572fe38e0649d0b83f77aa00aadddefe8ef2d08"} Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.870692 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4rtgn" event={"ID":"343931b4-805d-4457-a27f-7c680e29a475","Type":"ContainerDied","Data":"4ca0b76afdbc174758110ef84fcb830144be6ec2acfc42019850a54b61c8b74e"} Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.870726 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca0b76afdbc174758110ef84fcb830144be6ec2acfc42019850a54b61c8b74e" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.870770 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4rtgn" Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.878875 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" event={"ID":"13fa1e9e-f893-4641-a55f-d0c209d43917","Type":"ContainerStarted","Data":"f08e0d462ffd45dfa7950e9164ec47234418bfc47d5218d2a85f0732c1d92391"} Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.878928 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" event={"ID":"13fa1e9e-f893-4641-a55f-d0c209d43917","Type":"ContainerStarted","Data":"04daa8a7964206ec50253f25a3104d1d3037a3b4b32afa5015cf06cab5b9fd7d"} Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.886044 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cbf5b74-1863-441f-bdf3-570f64a7f50e","Type":"ContainerStarted","Data":"e33851172f487b5ced7fa81c63d174dd4b49b12754e02f6e0b2c7c17c35fe870"} Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.886125 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" podUID="ee4ccfb2-3d7a-4131-85cf-65e640617199" containerName="dnsmasq-dns" containerID="cri-o://1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117" gracePeriod=10 Dec 05 16:37:52 crc kubenswrapper[4756]: I1205 16:37:52.932447 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" podStartSLOduration=8.615695085 podStartE2EDuration="12.932405728s" podCreationTimestamp="2025-12-05 16:37:40 +0000 UTC" firstStartedPulling="2025-12-05 16:37:47.797210211 +0000 UTC m=+1328.035883743" lastFinishedPulling="2025-12-05 16:37:52.113920874 +0000 UTC m=+1332.352594386" observedRunningTime="2025-12-05 16:37:52.93210799 +0000 UTC m=+1333.170781522" watchObservedRunningTime="2025-12-05 16:37:52.932405728 +0000 UTC m=+1333.171079240" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.088591 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64bc447477-8cfhg"] Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.141496 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77d8c9c7-tmvh4"] Dec 05 16:37:53 crc kubenswrapper[4756]: E1205 16:37:53.141881 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343931b4-805d-4457-a27f-7c680e29a475" containerName="neutron-db-sync" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.141930 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="343931b4-805d-4457-a27f-7c680e29a475" containerName="neutron-db-sync" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.142108 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="343931b4-805d-4457-a27f-7c680e29a475" containerName="neutron-db-sync" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.143048 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.217251 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77d8c9c7-tmvh4"] Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.242747 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzf8n\" (UniqueName: \"kubernetes.io/projected/f060a6c1-48e8-49b5-9283-172dbef53241-kube-api-access-wzf8n\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.243130 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-svc\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.243197 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-swift-storage-0\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.243244 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-nb\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.243272 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-sb\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.243311 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-config\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.244040 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84874cff96-vj9jp"] Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.245908 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84874cff96-vj9jp" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.251899 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84874cff96-vj9jp"] Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.254178 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vg2jk" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.254397 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.254526 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.254660 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.357357 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-nb\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.357428 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-sb\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.357456 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-httpd-config\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.357479 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-config\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.358517 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-config\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.358555 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-nb\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.358590 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-sb\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:53 
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.358782 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-ovndb-tls-certs\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.358857 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-svc\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.358921 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd4tc\" (UniqueName: \"kubernetes.io/projected/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-kube-api-access-bd4tc\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.358982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-config\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.359023 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-swift-storage-0\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.359050 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-combined-ca-bundle\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.359750 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-svc\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.360342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-swift-storage-0\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.389060 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzf8n\" (UniqueName: \"kubernetes.io/projected/f060a6c1-48e8-49b5-9283-172dbef53241-kube-api-access-wzf8n\") pod \"dnsmasq-dns-77d8c9c7-tmvh4\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.460325 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-ovndb-tls-certs\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.460387 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd4tc\" (UniqueName: \"kubernetes.io/projected/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-kube-api-access-bd4tc\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.461093 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-config\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.461132 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-combined-ca-bundle\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.461170 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-httpd-config\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.468410 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-httpd-config\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.468963 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-ovndb-tls-certs\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.469713 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-config\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.478751 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.507145 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-combined-ca-bundle\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.518837 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd4tc\" (UniqueName: \"kubernetes.io/projected/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-kube-api-access-bd4tc\") pod \"neutron-84874cff96-vj9jp\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.589317 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84874cff96-vj9jp"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.800592 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.879162 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-nb\") pod \"ee4ccfb2-3d7a-4131-85cf-65e640617199\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") "
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.879217 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-swift-storage-0\") pod \"ee4ccfb2-3d7a-4131-85cf-65e640617199\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") "
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.879263 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj972\" (UniqueName: \"kubernetes.io/projected/ee4ccfb2-3d7a-4131-85cf-65e640617199-kube-api-access-fj972\") pod \"ee4ccfb2-3d7a-4131-85cf-65e640617199\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") "
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.879286 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-sb\") pod \"ee4ccfb2-3d7a-4131-85cf-65e640617199\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") "
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.890840 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4ccfb2-3d7a-4131-85cf-65e640617199-kube-api-access-fj972" (OuterVolumeSpecName: "kube-api-access-fj972") pod "ee4ccfb2-3d7a-4131-85cf-65e640617199" (UID: "ee4ccfb2-3d7a-4131-85cf-65e640617199"). InnerVolumeSpecName "kube-api-access-fj972". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.904832 4756 generic.go:334] "Generic (PLEG): container finished" podID="ee4ccfb2-3d7a-4131-85cf-65e640617199" containerID="1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117" exitCode=0
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.904895 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" event={"ID":"ee4ccfb2-3d7a-4131-85cf-65e640617199","Type":"ContainerDied","Data":"1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117"}
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.904923 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8" event={"ID":"ee4ccfb2-3d7a-4131-85cf-65e640617199","Type":"ContainerDied","Data":"b9ba8b80997b8085403d78a792fd795686aaedb45e5a6399f90693fbee3b5999"}
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.904948 4756 scope.go:117] "RemoveContainer" containerID="1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.905073 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c64bcd78f-44lq8"
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.918235 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cbf5b74-1863-441f-bdf3-570f64a7f50e","Type":"ContainerStarted","Data":"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c"}
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.949240 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.965292 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee4ccfb2-3d7a-4131-85cf-65e640617199" (UID: "ee4ccfb2-3d7a-4131-85cf-65e640617199"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.980667 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-config\") pod \"ee4ccfb2-3d7a-4131-85cf-65e640617199\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") "
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.980719 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-svc\") pod \"ee4ccfb2-3d7a-4131-85cf-65e640617199\" (UID: \"ee4ccfb2-3d7a-4131-85cf-65e640617199\") "
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.981003 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.981021 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj972\" (UniqueName: \"kubernetes.io/projected/ee4ccfb2-3d7a-4131-85cf-65e640617199-kube-api-access-fj972\") on node \"crc\" DevicePath \"\""
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.994721 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" event={"ID":"e0186554-9d53-4bef-874e-6cef6df1a0a5","Type":"ContainerStarted","Data":"2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba"}
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.994944 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" podUID="e0186554-9d53-4bef-874e-6cef6df1a0a5" containerName="dnsmasq-dns" containerID="cri-o://2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba" gracePeriod=10
Dec 05 16:37:53 crc kubenswrapper[4756]: I1205 16:37:53.995066 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64bc447477-8cfhg"
Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.003821 4756 scope.go:117] "RemoveContainer" containerID="2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f"
Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.022149 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee4ccfb2-3d7a-4131-85cf-65e640617199" (UID: "ee4ccfb2-3d7a-4131-85cf-65e640617199"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.093226 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" podStartSLOduration=3.093211477 podStartE2EDuration="3.093211477s" podCreationTimestamp="2025-12-05 16:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:54.092851538 +0000 UTC m=+1334.331525050" watchObservedRunningTime="2025-12-05 16:37:54.093211477 +0000 UTC m=+1334.331884979" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.103355 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.128088 4756 generic.go:334] "Generic (PLEG): container finished" podID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerID="bf81fbd405b146ccf77ccba5def407bc7fba16acdb849eec1f8de533e90ca946" exitCode=0 Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.146204 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7550fa7-075b-42e9-be97-52e7ed4f552c","Type":"ContainerDied","Data":"bf81fbd405b146ccf77ccba5def407bc7fba16acdb849eec1f8de533e90ca946"} Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.191390 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77d8c9c7-tmvh4"] Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.278235 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-config" (OuterVolumeSpecName: "config") pod "ee4ccfb2-3d7a-4131-85cf-65e640617199" (UID: "ee4ccfb2-3d7a-4131-85cf-65e640617199"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.281990 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee4ccfb2-3d7a-4131-85cf-65e640617199" (UID: "ee4ccfb2-3d7a-4131-85cf-65e640617199"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.329262 4756 scope.go:117] "RemoveContainer" containerID="1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117" Dec 05 16:37:54 crc kubenswrapper[4756]: E1205 16:37:54.362268 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117\": container with ID starting with 1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117 not found: ID does not exist" containerID="1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.362314 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117"} err="failed to get container status \"1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117\": rpc error: code = NotFound desc = could not find container \"1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117\": container with ID starting with 1f51dbee348e935b8bdde38aaeb57d0434cd80f1ed2a54d00f1edf10c28ba117 not found: ID does not exist" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.362346 4756 scope.go:117] "RemoveContainer" containerID="2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.379862 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee4ccfb2-3d7a-4131-85cf-65e640617199" (UID: "ee4ccfb2-3d7a-4131-85cf-65e640617199"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: E1205 16:37:54.383306 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f\": container with ID starting with 2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f not found: ID does not exist" containerID="2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.383359 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f"} err="failed to get container status \"2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f\": rpc error: code = NotFound desc = could not find container \"2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f\": container with ID starting with 2b7553f35ef2c1fe61db93f73c16184f3497a662ce4194c4af8b6ad84fac110f not found: ID does not exist" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.384483 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.384507 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.486954 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee4ccfb2-3d7a-4131-85cf-65e640617199-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.543373 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588032 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-log-httpd\") pod \"d7550fa7-075b-42e9-be97-52e7ed4f552c\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588113 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-config-data\") pod \"d7550fa7-075b-42e9-be97-52e7ed4f552c\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588172 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-sg-core-conf-yaml\") pod \"d7550fa7-075b-42e9-be97-52e7ed4f552c\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588214 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5dz4\" (UniqueName: \"kubernetes.io/projected/d7550fa7-075b-42e9-be97-52e7ed4f552c-kube-api-access-k5dz4\") pod \"d7550fa7-075b-42e9-be97-52e7ed4f552c\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588312 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-run-httpd\") pod \"d7550fa7-075b-42e9-be97-52e7ed4f552c\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588465 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d7550fa7-075b-42e9-be97-52e7ed4f552c" (UID: "d7550fa7-075b-42e9-be97-52e7ed4f552c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588481 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-scripts\") pod \"d7550fa7-075b-42e9-be97-52e7ed4f552c\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588513 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-combined-ca-bundle\") pod \"d7550fa7-075b-42e9-be97-52e7ed4f552c\" (UID: \"d7550fa7-075b-42e9-be97-52e7ed4f552c\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.588937 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.589001 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d7550fa7-075b-42e9-be97-52e7ed4f552c" (UID: "d7550fa7-075b-42e9-be97-52e7ed4f552c"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.596034 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-scripts" (OuterVolumeSpecName: "scripts") pod "d7550fa7-075b-42e9-be97-52e7ed4f552c" (UID: "d7550fa7-075b-42e9-be97-52e7ed4f552c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.603966 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84874cff96-vj9jp"] Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.610565 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7550fa7-075b-42e9-be97-52e7ed4f552c-kube-api-access-k5dz4" (OuterVolumeSpecName: "kube-api-access-k5dz4") pod "d7550fa7-075b-42e9-be97-52e7ed4f552c" (UID: "d7550fa7-075b-42e9-be97-52e7ed4f552c"). InnerVolumeSpecName "kube-api-access-k5dz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.636208 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d7550fa7-075b-42e9-be97-52e7ed4f552c" (UID: "d7550fa7-075b-42e9-be97-52e7ed4f552c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.663857 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7550fa7-075b-42e9-be97-52e7ed4f552c" (UID: "d7550fa7-075b-42e9-be97-52e7ed4f552c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.690874 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5dz4\" (UniqueName: \"kubernetes.io/projected/d7550fa7-075b-42e9-be97-52e7ed4f552c-kube-api-access-k5dz4\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.690918 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7550fa7-075b-42e9-be97-52e7ed4f552c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.690930 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.690941 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.690951 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.699984 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-config-data" (OuterVolumeSpecName: "config-data") pod "d7550fa7-075b-42e9-be97-52e7ed4f552c" (UID: "d7550fa7-075b-42e9-be97-52e7ed4f552c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.735754 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.770189 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c64bcd78f-44lq8"] Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.795115 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-swift-storage-0\") pod \"e0186554-9d53-4bef-874e-6cef6df1a0a5\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.795345 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-config\") pod \"e0186554-9d53-4bef-874e-6cef6df1a0a5\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.795383 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-nb\") pod \"e0186554-9d53-4bef-874e-6cef6df1a0a5\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.795444 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-svc\") pod \"e0186554-9d53-4bef-874e-6cef6df1a0a5\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.795519 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmrmm\" (UniqueName: \"kubernetes.io/projected/e0186554-9d53-4bef-874e-6cef6df1a0a5-kube-api-access-zmrmm\") pod \"e0186554-9d53-4bef-874e-6cef6df1a0a5\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.795565 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-sb\") pod \"e0186554-9d53-4bef-874e-6cef6df1a0a5\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.795985 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7550fa7-075b-42e9-be97-52e7ed4f552c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.801591 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0186554-9d53-4bef-874e-6cef6df1a0a5-kube-api-access-zmrmm" (OuterVolumeSpecName: "kube-api-access-zmrmm") pod "e0186554-9d53-4bef-874e-6cef6df1a0a5" (UID: "e0186554-9d53-4bef-874e-6cef6df1a0a5"). InnerVolumeSpecName "kube-api-access-zmrmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.819197 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c64bcd78f-44lq8"] Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.898219 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmrmm\" (UniqueName: \"kubernetes.io/projected/e0186554-9d53-4bef-874e-6cef6df1a0a5-kube-api-access-zmrmm\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:54 crc kubenswrapper[4756]: I1205 16:37:54.984966 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e0186554-9d53-4bef-874e-6cef6df1a0a5" (UID: "e0186554-9d53-4bef-874e-6cef6df1a0a5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:54.999978 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.000794 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0186554-9d53-4bef-874e-6cef6df1a0a5" (UID: "e0186554-9d53-4bef-874e-6cef6df1a0a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.014900 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0186554-9d53-4bef-874e-6cef6df1a0a5" (UID: "e0186554-9d53-4bef-874e-6cef6df1a0a5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.101650 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.101683 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.155301 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" event={"ID":"f060a6c1-48e8-49b5-9283-172dbef53241","Type":"ContainerStarted","Data":"402fb96553c11e8b81aee95a1c0cfb627cc7be118fc8785777d07f7aae8377d2"} Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.155342 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" event={"ID":"f060a6c1-48e8-49b5-9283-172dbef53241","Type":"ContainerStarted","Data":"9d750af7cb3eff27dd2d4042b0f17934b060ae4876835db3d8ab1793fcc13d25"} Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.170718 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"99744cdb-b38c-4576-925d-3b8bb6404a01","Type":"ContainerStarted","Data":"c5d41d4df7c1c913413b4e2752a10c340ab18ef296781903eb8d89de4db61791"} Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.174905 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84874cff96-vj9jp" event={"ID":"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a","Type":"ContainerStarted","Data":"cc992d269d209eedc89a6f0ff4e26b6086ae6ba5c0f6c630a01d73c38ec639b2"} Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.179246 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0186554-9d53-4bef-874e-6cef6df1a0a5" (UID: "e0186554-9d53-4bef-874e-6cef6df1a0a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.180226 4756 generic.go:334] "Generic (PLEG): container finished" podID="e0186554-9d53-4bef-874e-6cef6df1a0a5" containerID="2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba" exitCode=0 Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.180287 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" event={"ID":"e0186554-9d53-4bef-874e-6cef6df1a0a5","Type":"ContainerDied","Data":"2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba"} Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.180314 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" event={"ID":"e0186554-9d53-4bef-874e-6cef6df1a0a5","Type":"ContainerDied","Data":"9078e464723eed70428cb589a572fe38e0649d0b83f77aa00aadddefe8ef2d08"} Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.180332 4756 scope.go:117] "RemoveContainer" containerID="2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.180445 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64bc447477-8cfhg" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.200676 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7550fa7-075b-42e9-be97-52e7ed4f552c","Type":"ContainerDied","Data":"f764ca580f7aae85592ece8b51ec3bcd365a571a4b3543bfe145060e1e87bb24"} Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.200775 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.202839 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-config" (OuterVolumeSpecName: "config") pod "e0186554-9d53-4bef-874e-6cef6df1a0a5" (UID: "e0186554-9d53-4bef-874e-6cef6df1a0a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.202943 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-config\") pod \"e0186554-9d53-4bef-874e-6cef6df1a0a5\" (UID: \"e0186554-9d53-4bef-874e-6cef6df1a0a5\") " Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.204028 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:55 crc kubenswrapper[4756]: W1205 16:37:55.204981 4756 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e0186554-9d53-4bef-874e-6cef6df1a0a5/volumes/kubernetes.io~configmap/config Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.204993 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-config" (OuterVolumeSpecName: "config") pod "e0186554-9d53-4bef-874e-6cef6df1a0a5" (UID: "e0186554-9d53-4bef-874e-6cef6df1a0a5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.236592 4756 scope.go:117] "RemoveContainer" containerID="ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.299671 4756 scope.go:117] "RemoveContainer" containerID="2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.308454 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0186554-9d53-4bef-874e-6cef6df1a0a5-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.313244 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba\": container with ID starting with 2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba not found: ID does not exist" containerID="2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.313285 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba"} err="failed to get container status \"2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba\": rpc error: code = NotFound desc = could not find container \"2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba\": container with ID starting with 2dd00d84377408ead21fd27084132f194ccc25b9f63579b7e2fc9134ee5013ba not found: ID does not exist" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.313308 4756 scope.go:117] "RemoveContainer" containerID="ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.313702 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.314273 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13\": container with ID starting with ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13 not found: ID does not exist" containerID="ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.314296 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13"} err="failed to get container status \"ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13\": rpc error: code = NotFound desc = could not find container \"ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13\": container with ID starting with ecb0e8ccd0885dcf9bc2cd11fdda7288f174f8eab10ebdb1765bf9c5bdf6ab13 not found: ID does not exist" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.314310 4756 scope.go:117] "RemoveContainer" containerID="e27ba8d07ae00908888de0ce90498b3d2a9dc789a238952ce19112b10a011ac0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.327597 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337065 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.337407 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0186554-9d53-4bef-874e-6cef6df1a0a5" containerName="init" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337436 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0186554-9d53-4bef-874e-6cef6df1a0a5" containerName="init" Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.337452 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4ccfb2-3d7a-4131-85cf-65e640617199" containerName="init" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337458 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4ccfb2-3d7a-4131-85cf-65e640617199" containerName="init" Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.337472 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4ccfb2-3d7a-4131-85cf-65e640617199" containerName="dnsmasq-dns" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337477 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4ccfb2-3d7a-4131-85cf-65e640617199" containerName="dnsmasq-dns" Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.337522 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="proxy-httpd" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337529 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="proxy-httpd" Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.337562 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0186554-9d53-4bef-874e-6cef6df1a0a5" containerName="dnsmasq-dns" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337569 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0186554-9d53-4bef-874e-6cef6df1a0a5" containerName="dnsmasq-dns" Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.337578 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="sg-core" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337583 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="sg-core" Dec 05 16:37:55 crc kubenswrapper[4756]: E1205 16:37:55.337623 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="ceilometer-notification-agent" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337631 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="ceilometer-notification-agent" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337812 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="ceilometer-notification-agent" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337838 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="sg-core" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337864 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" containerName="proxy-httpd" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337904 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0186554-9d53-4bef-874e-6cef6df1a0a5" 
containerName="dnsmasq-dns" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.337913 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4ccfb2-3d7a-4131-85cf-65e640617199" containerName="dnsmasq-dns" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.339609 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.342265 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.342529 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.347823 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.411755 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfpd\" (UniqueName: \"kubernetes.io/projected/e22ff075-9534-4b98-9e00-04672da4fa74-kube-api-access-nwfpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.411810 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-scripts\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.411888 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-log-httpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.411904 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-config-data\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.411936 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.411954 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-run-httpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.412039 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.426609 4756 
scope.go:117] "RemoveContainer" containerID="1dc41eca912836854eb899bcf0eed311a49e7644a686a7350102564960f9d9db" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.513285 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-log-httpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.513659 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-config-data\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.513713 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.513735 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-run-httpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.513764 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.513836 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfpd\" (UniqueName: \"kubernetes.io/projected/e22ff075-9534-4b98-9e00-04672da4fa74-kube-api-access-nwfpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.513861 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-scripts\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.515035 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-run-httpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.515080 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-log-httpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.522898 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-scripts\") pod \"ceilometer-0\" (UID: 
\"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.523249 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.523928 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.524712 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-config-data\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.536253 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfpd\" (UniqueName: \"kubernetes.io/projected/e22ff075-9534-4b98-9e00-04672da4fa74-kube-api-access-nwfpd\") pod \"ceilometer-0\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.639105 4756 scope.go:117] "RemoveContainer" containerID="bf81fbd405b146ccf77ccba5def407bc7fba16acdb849eec1f8de533e90ca946" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.648319 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.676212 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64bc447477-8cfhg"] Dec 05 16:37:55 crc kubenswrapper[4756]: I1205 16:37:55.690375 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64bc447477-8cfhg"] Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.068069 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7550fa7-075b-42e9-be97-52e7ed4f552c" path="/var/lib/kubelet/pods/d7550fa7-075b-42e9-be97-52e7ed4f552c/volumes" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.069393 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0186554-9d53-4bef-874e-6cef6df1a0a5" path="/var/lib/kubelet/pods/e0186554-9d53-4bef-874e-6cef6df1a0a5/volumes" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.070001 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee4ccfb2-3d7a-4131-85cf-65e640617199" path="/var/lib/kubelet/pods/ee4ccfb2-3d7a-4131-85cf-65e640617199/volumes" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.218919 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"99744cdb-b38c-4576-925d-3b8bb6404a01","Type":"ContainerStarted","Data":"e8ac119fbc8359cc2ef2694311e407ca4ce5040fe856da58e0dd98b290825396"} Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.228077 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84874cff96-vj9jp" event={"ID":"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a","Type":"ContainerStarted","Data":"a0da29df90f63367bef1b3be43b14be690f24459c837cdc40e920dde2d3e744c"} Dec 05 16:37:56 crc 
kubenswrapper[4756]: I1205 16:37:56.228136 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84874cff96-vj9jp" event={"ID":"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a","Type":"ContainerStarted","Data":"d13d9a27b35d2845a09a8db034bdad1c6e20fddb4e0686170007fa4fe3692129"} Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.228259 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84874cff96-vj9jp" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.252151 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.203115837 podStartE2EDuration="6.252131825s" podCreationTimestamp="2025-12-05 16:37:50 +0000 UTC" firstStartedPulling="2025-12-05 16:37:52.075648126 +0000 UTC m=+1332.314321638" lastFinishedPulling="2025-12-05 16:37:53.124664114 +0000 UTC m=+1333.363337626" observedRunningTime="2025-12-05 16:37:56.243701966 +0000 UTC m=+1336.482375488" watchObservedRunningTime="2025-12-05 16:37:56.252131825 +0000 UTC m=+1336.490805337" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.252668 4756 generic.go:334] "Generic (PLEG): container finished" podID="f060a6c1-48e8-49b5-9283-172dbef53241" containerID="402fb96553c11e8b81aee95a1c0cfb627cc7be118fc8785777d07f7aae8377d2" exitCode=0 Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.252839 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" event={"ID":"f060a6c1-48e8-49b5-9283-172dbef53241","Type":"ContainerDied","Data":"402fb96553c11e8b81aee95a1c0cfb627cc7be118fc8785777d07f7aae8377d2"} Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.252953 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" event={"ID":"f060a6c1-48e8-49b5-9283-172dbef53241","Type":"ContainerStarted","Data":"9b565047b4f90b9ae68c5bdaf8d5c38a5e415dce6e03641e5c8dca325759a137"} Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.253907 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.286019 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cbf5b74-1863-441f-bdf3-570f64a7f50e","Type":"ContainerStarted","Data":"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5"} Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.286297 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerName="cinder-api-log" containerID="cri-o://f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c" gracePeriod=30 Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.286475 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.286541 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerName="cinder-api" containerID="cri-o://d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5" gracePeriod=30 Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.332199 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.358242 4756 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/neutron-84874cff96-vj9jp" podStartSLOduration=3.358213983 podStartE2EDuration="3.358213983s" podCreationTimestamp="2025-12-05 16:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:56.275364515 +0000 UTC m=+1336.514038027" watchObservedRunningTime="2025-12-05 16:37:56.358213983 +0000 UTC m=+1336.596887485" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.373169 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.373833 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" podStartSLOduration=3.373825306 podStartE2EDuration="3.373825306s" podCreationTimestamp="2025-12-05 16:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:56.30502574 +0000 UTC m=+1336.543699262" watchObservedRunningTime="2025-12-05 16:37:56.373825306 +0000 UTC m=+1336.612498818" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.398440 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.3984111630000005 podStartE2EDuration="5.398411163s" podCreationTimestamp="2025-12-05 16:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:37:56.329039491 +0000 UTC m=+1336.567713003" watchObservedRunningTime="2025-12-05 16:37:56.398411163 +0000 UTC m=+1336.637084675" Dec 05 16:37:56 crc kubenswrapper[4756]: I1205 16:37:56.672521 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.112086 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.146156 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.179983 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cbf5b74-1863-441f-bdf3-570f64a7f50e-etc-machine-id\") pod \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.180055 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-scripts\") pod \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.180125 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data\") pod \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.180142 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data-custom\") pod \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.180217 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbf5b74-1863-441f-bdf3-570f64a7f50e-logs\") pod \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.180257 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-combined-ca-bundle\") pod \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.180303 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kndn\" (UniqueName: \"kubernetes.io/projected/1cbf5b74-1863-441f-bdf3-570f64a7f50e-kube-api-access-6kndn\") pod \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\" (UID: \"1cbf5b74-1863-441f-bdf3-570f64a7f50e\") " Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.182174 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cbf5b74-1863-441f-bdf3-570f64a7f50e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1cbf5b74-1863-441f-bdf3-570f64a7f50e" (UID: "1cbf5b74-1863-441f-bdf3-570f64a7f50e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.185868 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cbf5b74-1863-441f-bdf3-570f64a7f50e-logs" (OuterVolumeSpecName: "logs") pod "1cbf5b74-1863-441f-bdf3-570f64a7f50e" (UID: "1cbf5b74-1863-441f-bdf3-570f64a7f50e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.199783 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1cbf5b74-1863-441f-bdf3-570f64a7f50e" (UID: "1cbf5b74-1863-441f-bdf3-570f64a7f50e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.209633 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-scripts" (OuterVolumeSpecName: "scripts") pod "1cbf5b74-1863-441f-bdf3-570f64a7f50e" (UID: "1cbf5b74-1863-441f-bdf3-570f64a7f50e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.216646 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbf5b74-1863-441f-bdf3-570f64a7f50e-kube-api-access-6kndn" (OuterVolumeSpecName: "kube-api-access-6kndn") pod "1cbf5b74-1863-441f-bdf3-570f64a7f50e" (UID: "1cbf5b74-1863-441f-bdf3-570f64a7f50e"). InnerVolumeSpecName "kube-api-access-6kndn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.221457 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-749876f46d-6zlh7"] Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.221694 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api-log" containerID="cri-o://88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240" gracePeriod=30 Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.222013 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api" containerID="cri-o://dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58" gracePeriod=30 Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.246685 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.247272 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.247338 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.247771 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api" probeResult="failure" output="Get 
\"http://10.217.0.154:9311/healthcheck\": EOF" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.295475 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kndn\" (UniqueName: \"kubernetes.io/projected/1cbf5b74-1863-441f-bdf3-570f64a7f50e-kube-api-access-6kndn\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.295507 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cbf5b74-1863-441f-bdf3-570f64a7f50e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.295517 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.295530 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.295538 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cbf5b74-1863-441f-bdf3-570f64a7f50e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.338078 4756 generic.go:334] "Generic (PLEG): container finished" podID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerID="d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5" exitCode=0 Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.338208 4756 generic.go:334] "Generic (PLEG): container finished" podID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerID="f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c" exitCode=143 Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.338340 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cbf5b74-1863-441f-bdf3-570f64a7f50e","Type":"ContainerDied","Data":"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5"} Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.338464 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cbf5b74-1863-441f-bdf3-570f64a7f50e","Type":"ContainerDied","Data":"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c"} Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.342975 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cbf5b74-1863-441f-bdf3-570f64a7f50e","Type":"ContainerDied","Data":"e33851172f487b5ced7fa81c63d174dd4b49b12754e02f6e0b2c7c17c35fe870"} Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.338737 4756 scope.go:117] "RemoveContainer" containerID="d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.338679 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.339843 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data" (OuterVolumeSpecName: "config-data") pod "1cbf5b74-1863-441f-bdf3-570f64a7f50e" (UID: "1cbf5b74-1863-441f-bdf3-570f64a7f50e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.348240 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cbf5b74-1863-441f-bdf3-570f64a7f50e" (UID: "1cbf5b74-1863-441f-bdf3-570f64a7f50e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.359711 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerStarted","Data":"e57cae3fd680c166873c24b8843123064d86a2276e2b5b9ac3369c5189072149"} Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.389618 4756 scope.go:117] "RemoveContainer" containerID="f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.405908 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.405948 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cbf5b74-1863-441f-bdf3-570f64a7f50e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.464564 4756 scope.go:117] "RemoveContainer" containerID="d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5" Dec 05 16:37:57 crc kubenswrapper[4756]: E1205 16:37:57.468765 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5\": container with ID starting with d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5 not found: ID does not exist" containerID="d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.468822 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5"} err="failed to get container status \"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5\": rpc error: code = NotFound desc = could not find container \"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5\": container with ID starting with d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5 not found: ID does not exist" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.468852 4756 scope.go:117] "RemoveContainer" containerID="f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c" Dec 05 16:37:57 crc kubenswrapper[4756]: E1205 16:37:57.475378 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c\": container with ID starting with f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c not found: ID does not exist" containerID="f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.475408 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c"} err="failed to get container status \"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c\": rpc error: code = NotFound desc = could not find container \"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c\": container with ID starting with f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c not found: ID does not exist" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.475439 4756 scope.go:117] "RemoveContainer" containerID="d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.476767 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5"} err="failed to get container status \"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5\": rpc error: code = NotFound desc = could not find container \"d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5\": container with ID starting with d078b3ef05a7d287ce00f737ceab7fb10cf2a69af45722fe1f9d5d1677ee9ea5 not found: ID does not exist" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.476789 4756 scope.go:117] "RemoveContainer" containerID="f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.486737 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c"} err="failed to get container status \"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c\": rpc error: code = NotFound desc = could not find container \"f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c\": container with ID starting with f70940c318d9ae4df8296dcaeadcfc53cd35133bb9a5496e509d90c5a9f7ec5c not found: ID does not exist" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.676846 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.696774 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.713495 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:37:57 crc kubenswrapper[4756]: E1205 16:37:57.713883 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerName="cinder-api-log" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.713900 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerName="cinder-api-log" Dec 05 16:37:57 crc kubenswrapper[4756]: E1205 16:37:57.713924 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerName="cinder-api" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.713930 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerName="cinder-api" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.714102 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerName="cinder-api" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.714125 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" containerName="cinder-api-log" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.719130 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.723304 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.723342 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.731128 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.731574 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.812627 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.812784 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.812817 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.812870 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.812901 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9df5252d-16cf-4612-910c-8f37d10b48a7-logs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.812937 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-scripts\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.812960 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9df5252d-16cf-4612-910c-8f37d10b48a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 
16:37:57.813005 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncmqs\" (UniqueName: \"kubernetes.io/projected/9df5252d-16cf-4612-910c-8f37d10b48a7-kube-api-access-ncmqs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.813029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.835064 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b545bc8c5-xcv4s"] Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.839146 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.843914 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.845107 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.853459 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b545bc8c5-xcv4s"] Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.915855 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-public-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.915896 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-ovndb-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.915941 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-httpd-config\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.915971 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.915997 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916016 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916055 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916089 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9df5252d-16cf-4612-910c-8f37d10b48a7-logs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916128 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-scripts\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916149 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9df5252d-16cf-4612-910c-8f37d10b48a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916179 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-internal-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916198 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-config\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916218 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-combined-ca-bundle\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916237 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgg9\" (UniqueName: \"kubernetes.io/projected/926e1895-6638-4861-be46-f06e94e4b4ae-kube-api-access-bfgg9\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916268 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncmqs\" (UniqueName: \"kubernetes.io/projected/9df5252d-16cf-4612-910c-8f37d10b48a7-kube-api-access-ncmqs\") pod 
\"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.916294 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.917815 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9df5252d-16cf-4612-910c-8f37d10b48a7-logs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.921507 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9df5252d-16cf-4612-910c-8f37d10b48a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.934092 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.934475 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.938405 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.940847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-scripts\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.942699 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.945314 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncmqs\" (UniqueName: \"kubernetes.io/projected/9df5252d-16cf-4612-910c-8f37d10b48a7-kube-api-access-ncmqs\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " pod="openstack/cinder-api-0" Dec 05 16:37:57 crc kubenswrapper[4756]: I1205 16:37:57.946052 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " 
pod="openstack/cinder-api-0" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.017574 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-httpd-config\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.017696 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-internal-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.017721 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-config\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.017742 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-combined-ca-bundle\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.017762 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfgg9\" (UniqueName: \"kubernetes.io/projected/926e1895-6638-4861-be46-f06e94e4b4ae-kube-api-access-bfgg9\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.017801 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-public-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.017817 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-ovndb-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.021457 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-ovndb-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.023298 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-combined-ca-bundle\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.023544 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-public-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.024095 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-httpd-config\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.026337 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-config\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.031199 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-internal-tls-certs\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.034013 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.039805 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgg9\" (UniqueName: \"kubernetes.io/projected/926e1895-6638-4861-be46-f06e94e4b4ae-kube-api-access-bfgg9\") pod \"neutron-5b545bc8c5-xcv4s\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.093639 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cbf5b74-1863-441f-bdf3-570f64a7f50e" path="/var/lib/kubelet/pods/1cbf5b74-1863-441f-bdf3-570f64a7f50e/volumes" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.155060 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.432590 4756 generic.go:334] "Generic (PLEG): container finished" podID="4bd0d557-c324-4da4-88e1-2073413d29de" containerID="88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240" exitCode=143 Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.432980 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-749876f46d-6zlh7" event={"ID":"4bd0d557-c324-4da4-88e1-2073413d29de","Type":"ContainerDied","Data":"88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240"} Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.439950 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerStarted","Data":"0613a7ab5a963b46b6c293bb87644a995d68eed3117fa15ec6230d11c8e9c821"} Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.439998 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerStarted","Data":"278c6bf1ce5c05a26a66ac4857f514a3ecfce374d4e7e0ce0f0f871d34afdb5e"} Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.629775 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:37:58 crc kubenswrapper[4756]: I1205 16:37:58.869962 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b545bc8c5-xcv4s"] Dec 05 16:37:59 crc kubenswrapper[4756]: I1205 16:37:59.462645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9df5252d-16cf-4612-910c-8f37d10b48a7","Type":"ContainerStarted","Data":"ef0f39ff2baef02e17cf770a2dbd86eae23197baa277532416645111fc401104"} Dec 05 16:37:59 crc kubenswrapper[4756]: I1205 16:37:59.476803 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerStarted","Data":"2924cfbec2bcd7342ee1c2869c60f831538193dcd182c8f17f207d9200a1fc4f"} Dec 05 16:37:59 crc kubenswrapper[4756]: I1205 16:37:59.479087 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b545bc8c5-xcv4s" event={"ID":"926e1895-6638-4861-be46-f06e94e4b4ae","Type":"ContainerStarted","Data":"f64d7df36b795e3694f688c94ab0720253fe8c0c08bc8d7f803bb4019f348b81"} Dec 05 16:37:59 crc kubenswrapper[4756]: I1205 16:37:59.479252 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b545bc8c5-xcv4s" event={"ID":"926e1895-6638-4861-be46-f06e94e4b4ae","Type":"ContainerStarted","Data":"841e19590514a1169a3fd1e5b25f0fa69ff80f78a27e877a3961e22fc23c67a1"} Dec 05 16:38:00 crc kubenswrapper[4756]: I1205 16:38:00.489971 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b545bc8c5-xcv4s" event={"ID":"926e1895-6638-4861-be46-f06e94e4b4ae","Type":"ContainerStarted","Data":"c7af4d58a9c44e723d1f4b250fb4689fe9457696570e5e229b0f57ec7ad64a29"} Dec 05 16:38:00 crc kubenswrapper[4756]: I1205 16:38:00.490534 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:38:00 crc kubenswrapper[4756]: I1205 16:38:00.493968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9df5252d-16cf-4612-910c-8f37d10b48a7","Type":"ContainerStarted","Data":"6d0787e75ae992dd2b3dc29096ca4f2b71204bd07d887523f0de467c4e4688d4"} Dec 05 
16:38:00 crc kubenswrapper[4756]: I1205 16:38:00.493999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9df5252d-16cf-4612-910c-8f37d10b48a7","Type":"ContainerStarted","Data":"b2292cd9b3a9b3b5abd290889833f8a1801c60128dcd06bb289b998128e70716"} Dec 05 16:38:00 crc kubenswrapper[4756]: I1205 16:38:00.494776 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 16:38:00 crc kubenswrapper[4756]: I1205 16:38:00.535117 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b545bc8c5-xcv4s" podStartSLOduration=3.535097312 podStartE2EDuration="3.535097312s" podCreationTimestamp="2025-12-05 16:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:38:00.512584322 +0000 UTC m=+1340.751257844" watchObservedRunningTime="2025-12-05 16:38:00.535097312 +0000 UTC m=+1340.773770824" Dec 05 16:38:00 crc kubenswrapper[4756]: I1205 16:38:00.536356 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.536348246 podStartE2EDuration="3.536348246s" podCreationTimestamp="2025-12-05 16:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:38:00.531025902 +0000 UTC m=+1340.769699414" watchObservedRunningTime="2025-12-05 16:38:00.536348246 +0000 UTC m=+1340.775021758" Dec 05 16:38:01 crc kubenswrapper[4756]: I1205 16:38:01.504599 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerStarted","Data":"ce4700bb9c9d51e56f5b62af753192e92110be5ca1e9970ef1f21dbbdd1f826c"} Dec 05 16:38:01 crc kubenswrapper[4756]: I1205 16:38:01.505910 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 16:38:01 crc kubenswrapper[4756]: I1205 16:38:01.538303 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.064176194 podStartE2EDuration="6.538281156s" podCreationTimestamp="2025-12-05 16:37:55 +0000 UTC" firstStartedPulling="2025-12-05 16:37:56.285936262 +0000 UTC m=+1336.524609774" lastFinishedPulling="2025-12-05 16:38:00.760041224 +0000 UTC m=+1340.998714736" observedRunningTime="2025-12-05 16:38:01.529866897 +0000 UTC m=+1341.768540419" watchObservedRunningTime="2025-12-05 16:38:01.538281156 +0000 UTC m=+1341.776954678" Dec 05 16:38:01 crc kubenswrapper[4756]: I1205 16:38:01.625174 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 16:38:01 crc kubenswrapper[4756]: I1205 16:38:01.666830 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:38:02 crc kubenswrapper[4756]: I1205 16:38:02.328665 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 16:38:02 crc kubenswrapper[4756]: I1205 16:38:02.329041 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-749876f46d-6zlh7" 
podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 16:38:02 crc kubenswrapper[4756]: I1205 16:38:02.512118 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerName="cinder-scheduler" containerID="cri-o://c5d41d4df7c1c913413b4e2752a10c340ab18ef296781903eb8d89de4db61791" gracePeriod=30 Dec 05 16:38:02 crc kubenswrapper[4756]: I1205 16:38:02.512538 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerName="probe" containerID="cri-o://e8ac119fbc8359cc2ef2694311e407ca4ce5040fe856da58e0dd98b290825396" gracePeriod=30 Dec 05 16:38:03 crc kubenswrapper[4756]: I1205 16:38:03.480769 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:38:03 crc kubenswrapper[4756]: I1205 16:38:03.525145 4756 generic.go:334] "Generic (PLEG): container finished" podID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerID="e8ac119fbc8359cc2ef2694311e407ca4ce5040fe856da58e0dd98b290825396" exitCode=0 Dec 05 16:38:03 crc kubenswrapper[4756]: I1205 16:38:03.525488 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"99744cdb-b38c-4576-925d-3b8bb6404a01","Type":"ContainerDied","Data":"e8ac119fbc8359cc2ef2694311e407ca4ce5040fe856da58e0dd98b290825396"} Dec 05 16:38:03 crc kubenswrapper[4756]: I1205 16:38:03.554976 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7884648fd9-9kzg6"] Dec 05 16:38:03 crc kubenswrapper[4756]: I1205 16:38:03.555259 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" podUID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" containerName="dnsmasq-dns" containerID="cri-o://8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35" gracePeriod=10 Dec 05 16:38:03 crc kubenswrapper[4756]: I1205 16:38:03.656940 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:50686->10.217.0.154:9311: read: connection reset by peer" Dec 05 16:38:03 crc kubenswrapper[4756]: I1205 16:38:03.657284 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-749876f46d-6zlh7" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:50680->10.217.0.154:9311: read: connection reset by peer" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.116081 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.184726 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-svc\") pod \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.184818 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-swift-storage-0\") pod \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.184896 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-config\") pod \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.185001 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-nb\") pod \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.185099 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-sb\") pod \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.185139 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cm9p\" (UniqueName: \"kubernetes.io/projected/b1cfcad6-0fd3-47ed-9d46-093d26580a35-kube-api-access-7cm9p\") pod \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\" (UID: \"b1cfcad6-0fd3-47ed-9d46-093d26580a35\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.200106 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1cfcad6-0fd3-47ed-9d46-093d26580a35-kube-api-access-7cm9p" (OuterVolumeSpecName: "kube-api-access-7cm9p") pod "b1cfcad6-0fd3-47ed-9d46-093d26580a35" (UID: "b1cfcad6-0fd3-47ed-9d46-093d26580a35"). InnerVolumeSpecName "kube-api-access-7cm9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.240386 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1cfcad6-0fd3-47ed-9d46-093d26580a35" (UID: "b1cfcad6-0fd3-47ed-9d46-093d26580a35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.247184 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1cfcad6-0fd3-47ed-9d46-093d26580a35" (UID: "b1cfcad6-0fd3-47ed-9d46-093d26580a35"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.250870 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-config" (OuterVolumeSpecName: "config") pod "b1cfcad6-0fd3-47ed-9d46-093d26580a35" (UID: "b1cfcad6-0fd3-47ed-9d46-093d26580a35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.251484 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.254996 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b1cfcad6-0fd3-47ed-9d46-093d26580a35" (UID: "b1cfcad6-0fd3-47ed-9d46-093d26580a35"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.261558 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1cfcad6-0fd3-47ed-9d46-093d26580a35" (UID: "b1cfcad6-0fd3-47ed-9d46-093d26580a35"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.286522 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8crd\" (UniqueName: \"kubernetes.io/projected/4bd0d557-c324-4da4-88e1-2073413d29de-kube-api-access-f8crd\") pod \"4bd0d557-c324-4da4-88e1-2073413d29de\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.286653 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-combined-ca-bundle\") pod \"4bd0d557-c324-4da4-88e1-2073413d29de\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.286704 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data-custom\") pod \"4bd0d557-c324-4da4-88e1-2073413d29de\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.286730 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd0d557-c324-4da4-88e1-2073413d29de-logs\") pod \"4bd0d557-c324-4da4-88e1-2073413d29de\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.286863 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data\") pod \"4bd0d557-c324-4da4-88e1-2073413d29de\" (UID: \"4bd0d557-c324-4da4-88e1-2073413d29de\") " Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.287311 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.287329 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.287339 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cm9p\" (UniqueName: \"kubernetes.io/projected/b1cfcad6-0fd3-47ed-9d46-093d26580a35-kube-api-access-7cm9p\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.287351 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.287359 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.287369 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cfcad6-0fd3-47ed-9d46-093d26580a35-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.287859 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd0d557-c324-4da4-88e1-2073413d29de-logs" (OuterVolumeSpecName: "logs") pod "4bd0d557-c324-4da4-88e1-2073413d29de" (UID: "4bd0d557-c324-4da4-88e1-2073413d29de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.289665 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd0d557-c324-4da4-88e1-2073413d29de-kube-api-access-f8crd" (OuterVolumeSpecName: "kube-api-access-f8crd") pod "4bd0d557-c324-4da4-88e1-2073413d29de" (UID: "4bd0d557-c324-4da4-88e1-2073413d29de"). InnerVolumeSpecName "kube-api-access-f8crd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.290165 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4bd0d557-c324-4da4-88e1-2073413d29de" (UID: "4bd0d557-c324-4da4-88e1-2073413d29de"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.315815 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bd0d557-c324-4da4-88e1-2073413d29de" (UID: "4bd0d557-c324-4da4-88e1-2073413d29de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.341967 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data" (OuterVolumeSpecName: "config-data") pod "4bd0d557-c324-4da4-88e1-2073413d29de" (UID: "4bd0d557-c324-4da4-88e1-2073413d29de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.388983 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8crd\" (UniqueName: \"kubernetes.io/projected/4bd0d557-c324-4da4-88e1-2073413d29de-kube-api-access-f8crd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.389028 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.389038 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.389046 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd0d557-c324-4da4-88e1-2073413d29de-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.389056 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd0d557-c324-4da4-88e1-2073413d29de-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.575298 4756 generic.go:334] "Generic (PLEG): container finished" podID="4bd0d557-c324-4da4-88e1-2073413d29de" containerID="dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58" exitCode=0 Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.575558 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-749876f46d-6zlh7" event={"ID":"4bd0d557-c324-4da4-88e1-2073413d29de","Type":"ContainerDied","Data":"dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58"} Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.575610 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-749876f46d-6zlh7" event={"ID":"4bd0d557-c324-4da4-88e1-2073413d29de","Type":"ContainerDied","Data":"f0cd005521a828a97c8124ea9e0692e3c85e99bb0e304b2579dbdb10c8958018"} Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.575648 4756 scope.go:117] "RemoveContainer" containerID="dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.577972 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-749876f46d-6zlh7" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.636669 4756 scope.go:117] "RemoveContainer" containerID="88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.636792 4756 generic.go:334] "Generic (PLEG): container finished" podID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerID="c5d41d4df7c1c913413b4e2752a10c340ab18ef296781903eb8d89de4db61791" exitCode=0 Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.636895 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"99744cdb-b38c-4576-925d-3b8bb6404a01","Type":"ContainerDied","Data":"c5d41d4df7c1c913413b4e2752a10c340ab18ef296781903eb8d89de4db61791"} Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.681473 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-749876f46d-6zlh7"] Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.692862 4756 scope.go:117] "RemoveContainer" containerID="dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.693480 4756 generic.go:334] "Generic (PLEG): container finished" podID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" containerID="8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35" exitCode=0 Dec 05 16:38:04 crc kubenswrapper[4756]: E1205 16:38:04.693500 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58\": container with ID starting with dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58 not found: ID does not exist" containerID="dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.693524 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" event={"ID":"b1cfcad6-0fd3-47ed-9d46-093d26580a35","Type":"ContainerDied","Data":"8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35"} Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.693534 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58"} err="failed to get container status \"dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58\": rpc error: code = NotFound desc = could not find container \"dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58\": container with ID starting with dfb79049cb89a71b4607b123544b14f9225ac8a88e66e094a43ee841d722ee58 not found: ID does not exist" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.693557 4756 scope.go:117] "RemoveContainer" containerID="88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.693572 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" event={"ID":"b1cfcad6-0fd3-47ed-9d46-093d26580a35","Type":"ContainerDied","Data":"6ac4d45ac530c7368e041bfaa15aeba1b3bcac6c4e169a5adb97873dfe45213f"} Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.693651 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7884648fd9-9kzg6" Dec 05 16:38:04 crc kubenswrapper[4756]: E1205 16:38:04.700612 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240\": container with ID starting with 88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240 not found: ID does not exist" containerID="88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.700671 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240"} err="failed to get container status \"88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240\": rpc error: code = NotFound desc = could not find container \"88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240\": container with ID starting with 88ac134226b0d54d94601d18418b3b961516451f134b0f57dee1347033088240 not found: ID does not exist" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.700701 4756 scope.go:117] "RemoveContainer" containerID="8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.746995 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-749876f46d-6zlh7"] Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.762464 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7884648fd9-9kzg6"] Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.766821 4756 scope.go:117] "RemoveContainer" containerID="b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.788430 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7884648fd9-9kzg6"] Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.824665 4756 scope.go:117] "RemoveContainer" containerID="8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35" Dec 05 16:38:04 crc kubenswrapper[4756]: E1205 16:38:04.825078 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35\": container with ID starting with 8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35 not found: ID does not exist" containerID="8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.825109 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35"} err="failed to get container status \"8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35\": rpc error: code = NotFound desc = could not find container \"8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35\": container with ID starting with 8f9916c198d47920f4ad0ddb09e6240388183e0700fe9be2cd2d8ef11766df35 not found: ID does not exist" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.825133 4756 scope.go:117] "RemoveContainer" containerID="b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3" Dec 05 16:38:04 crc kubenswrapper[4756]: E1205 16:38:04.825315 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3\": container with ID starting with b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3 not found: ID does not exist" containerID="b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3" Dec 05 16:38:04 crc kubenswrapper[4756]: I1205 16:38:04.825335 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3"} err="failed to get container status \"b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3\": rpc error: code = NotFound desc = could not find container \"b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3\": container with ID starting with b4741c568cbaacb180537c018676ea148fa077b61ffa287514c164814cc1b3b3 not found: ID does not exist" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.169033 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.230091 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99744cdb-b38c-4576-925d-3b8bb6404a01-etc-machine-id\") pod \"99744cdb-b38c-4576-925d-3b8bb6404a01\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.230174 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data-custom\") pod \"99744cdb-b38c-4576-925d-3b8bb6404a01\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.230325 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data\") pod \"99744cdb-b38c-4576-925d-3b8bb6404a01\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.230365 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-scripts\") pod \"99744cdb-b38c-4576-925d-3b8bb6404a01\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.230443 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-combined-ca-bundle\") pod \"99744cdb-b38c-4576-925d-3b8bb6404a01\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.230528 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2wtk\" (UniqueName: \"kubernetes.io/projected/99744cdb-b38c-4576-925d-3b8bb6404a01-kube-api-access-j2wtk\") pod \"99744cdb-b38c-4576-925d-3b8bb6404a01\" (UID: \"99744cdb-b38c-4576-925d-3b8bb6404a01\") " Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.231247 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99744cdb-b38c-4576-925d-3b8bb6404a01-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "99744cdb-b38c-4576-925d-3b8bb6404a01" (UID: "99744cdb-b38c-4576-925d-3b8bb6404a01"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.235498 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "99744cdb-b38c-4576-925d-3b8bb6404a01" (UID: "99744cdb-b38c-4576-925d-3b8bb6404a01"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.236332 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99744cdb-b38c-4576-925d-3b8bb6404a01-kube-api-access-j2wtk" (OuterVolumeSpecName: "kube-api-access-j2wtk") pod "99744cdb-b38c-4576-925d-3b8bb6404a01" (UID: "99744cdb-b38c-4576-925d-3b8bb6404a01"). InnerVolumeSpecName "kube-api-access-j2wtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.238555 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-scripts" (OuterVolumeSpecName: "scripts") pod "99744cdb-b38c-4576-925d-3b8bb6404a01" (UID: "99744cdb-b38c-4576-925d-3b8bb6404a01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.286606 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99744cdb-b38c-4576-925d-3b8bb6404a01" (UID: "99744cdb-b38c-4576-925d-3b8bb6404a01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.328537 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data" (OuterVolumeSpecName: "config-data") pod "99744cdb-b38c-4576-925d-3b8bb6404a01" (UID: "99744cdb-b38c-4576-925d-3b8bb6404a01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.333227 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.333263 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.333272 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.333283 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2wtk\" (UniqueName: \"kubernetes.io/projected/99744cdb-b38c-4576-925d-3b8bb6404a01-kube-api-access-j2wtk\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.333292 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99744cdb-b38c-4576-925d-3b8bb6404a01-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.333300 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99744cdb-b38c-4576-925d-3b8bb6404a01-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.707875 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"99744cdb-b38c-4576-925d-3b8bb6404a01","Type":"ContainerDied","Data":"d9e37c563c058287ef14b53a7b9bf0cc71f0f8f9b11bcd7cd7c664d74e180efe"} Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.708310 4756 scope.go:117] "RemoveContainer" containerID="e8ac119fbc8359cc2ef2694311e407ca4ce5040fe856da58e0dd98b290825396" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.707895 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.732497 4756 scope.go:117] "RemoveContainer" containerID="c5d41d4df7c1c913413b4e2752a10c340ab18ef296781903eb8d89de4db61791" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.746887 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.754919 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.774785 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:38:05 crc kubenswrapper[4756]: E1205 16:38:05.775201 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api-log" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775227 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api-log" Dec 05 16:38:05 crc kubenswrapper[4756]: E1205 16:38:05.775241 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" containerName="init" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775249 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" containerName="init" Dec 05 16:38:05 crc kubenswrapper[4756]: E1205 16:38:05.775261 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" containerName="dnsmasq-dns" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775268 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" containerName="dnsmasq-dns" Dec 05 16:38:05 crc kubenswrapper[4756]: E1205 16:38:05.775293 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775301 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api" Dec 05 16:38:05 crc kubenswrapper[4756]: E1205 16:38:05.775322 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerName="probe" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775329 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerName="probe" Dec 05 16:38:05 crc kubenswrapper[4756]: E1205 16:38:05.775347 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerName="cinder-scheduler" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775357 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerName="cinder-scheduler" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775682 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api-log" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775705 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" containerName="barbican-api" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775729 4756 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerName="cinder-scheduler" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775741 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" containerName="probe" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.775757 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" containerName="dnsmasq-dns" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.776926 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.779958 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.800593 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.841235 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.841299 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.841327 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/509573b5-33f4-432c-bb2e-1fc9186ed072-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.841556 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-scripts\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.841694 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sksn6\" (UniqueName: \"kubernetes.io/projected/509573b5-33f4-432c-bb2e-1fc9186ed072-kube-api-access-sksn6\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.841724 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.942793 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data\") 
pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.942841 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.942868 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/509573b5-33f4-432c-bb2e-1fc9186ed072-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.942941 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-scripts\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.943027 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sksn6\" (UniqueName: \"kubernetes.io/projected/509573b5-33f4-432c-bb2e-1fc9186ed072-kube-api-access-sksn6\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.943045 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.943756 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/509573b5-33f4-432c-bb2e-1fc9186ed072-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.947995 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-scripts\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.948005 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.948147 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.948942 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:05 crc kubenswrapper[4756]: I1205 16:38:05.961435 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sksn6\" (UniqueName: \"kubernetes.io/projected/509573b5-33f4-432c-bb2e-1fc9186ed072-kube-api-access-sksn6\") pod \"cinder-scheduler-0\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") " pod="openstack/cinder-scheduler-0" Dec 05 16:38:06 crc kubenswrapper[4756]: I1205 16:38:06.062380 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd0d557-c324-4da4-88e1-2073413d29de" path="/var/lib/kubelet/pods/4bd0d557-c324-4da4-88e1-2073413d29de/volumes" Dec 05 16:38:06 crc kubenswrapper[4756]: I1205 16:38:06.063203 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99744cdb-b38c-4576-925d-3b8bb6404a01" path="/var/lib/kubelet/pods/99744cdb-b38c-4576-925d-3b8bb6404a01/volumes" Dec 05 16:38:06 crc kubenswrapper[4756]: I1205 16:38:06.063832 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1cfcad6-0fd3-47ed-9d46-093d26580a35" path="/var/lib/kubelet/pods/b1cfcad6-0fd3-47ed-9d46-093d26580a35/volumes" Dec 05 16:38:06 crc kubenswrapper[4756]: I1205 16:38:06.101767 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 16:38:06 crc kubenswrapper[4756]: I1205 16:38:06.559816 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:38:06 crc kubenswrapper[4756]: I1205 16:38:06.731119 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"509573b5-33f4-432c-bb2e-1fc9186ed072","Type":"ContainerStarted","Data":"922472c149da7ca1519db8bd8b314ce74a0247f88ad115a5512bb0e4a2f0e9c5"} Dec 05 16:38:07 crc kubenswrapper[4756]: I1205 16:38:07.749254 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"509573b5-33f4-432c-bb2e-1fc9186ed072","Type":"ContainerStarted","Data":"1c001a9c870ca4bb31c57a0eaad2bee3efc8b47aef90bc4502a5337603659d8b"} Dec 05 16:38:08 crc kubenswrapper[4756]: I1205 16:38:08.640008 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:38:08 crc kubenswrapper[4756]: I1205 16:38:08.769387 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"509573b5-33f4-432c-bb2e-1fc9186ed072","Type":"ContainerStarted","Data":"a4403245fba51bc317b8d7897b375b1d195288a8e13b1478f5ed78542abf0293"} Dec 05 16:38:08 crc kubenswrapper[4756]: I1205 16:38:08.791135 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.791111599 podStartE2EDuration="3.791111599s" podCreationTimestamp="2025-12-05 16:38:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:38:08.787154501 +0000 UTC m=+1349.025828023" watchObservedRunningTime="2025-12-05 16:38:08.791111599 +0000 UTC m=+1349.029785111" Dec 05 16:38:10 crc kubenswrapper[4756]: I1205 16:38:10.377028 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 16:38:11 crc kubenswrapper[4756]: I1205 16:38:11.103312 4756 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 16:38:11 crc kubenswrapper[4756]: I1205 16:38:11.439965 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:38:11 crc kubenswrapper[4756]: I1205 16:38:11.447106 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-668cbcb89b-dvw88" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.302690 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-647b5b9577-j6wcs"] Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.307395 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.311663 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.313180 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.314399 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-647b5b9577-j6wcs"] Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.316803 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.384278 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjk8k\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-kube-api-access-pjk8k\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.384465 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-combined-ca-bundle\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.384566 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-internal-tls-certs\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.384671 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-run-httpd\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.384756 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-public-tls-certs\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc 
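The startup-probe transitions for cinder-scheduler-0 above ("status=started" for the old pod, then "status=unhealthy" for its replacement at 16:38:11) reflect documented Kubernetes behavior: until a container's startup probe succeeds, liveness and readiness probing are held off, and only exhausting the failure threshold triggers a restart. A compressed model of that gating, with an illustrative threshold:

```go
package main

import "fmt"

type container struct {
	started  bool // startup probe has succeeded at least once
	failures int
}

// observe feeds one startup-probe result and reports the resulting action:
// keep waiting, begin normal probing, or restart on exhaustion.
func (c *container) observe(ok bool, failureThreshold int) string {
	if c.started {
		return "startup complete; liveness/readiness probes active"
	}
	if ok {
		c.started = true
		return "startup probe succeeded"
	}
	c.failures++
	if c.failures >= failureThreshold {
		return "failureThreshold exceeded: restart container"
	}
	return "unhealthy, still within failureThreshold"
}

func main() {
	c := &container{}
	for _, ok := range []bool{false, false, true} {
		fmt.Println(c.observe(ok, 10))
	}
}
```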
kubenswrapper[4756]: I1205 16:38:13.384795 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-etc-swift\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.384942 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-log-httpd\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.385003 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-config-data\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.488538 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.489685 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.494636 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-public-tls-certs\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.494798 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-etc-swift\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.495042 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-log-httpd\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.495098 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-config-data\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.495265 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-combined-ca-bundle\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.495299 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-internal-tls-certs\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.495330 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjk8k\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-kube-api-access-pjk8k\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.495504 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-run-httpd\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.496490 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-run-httpd\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.499650 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-jkt57" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.500117 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.500441 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.501302 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-log-httpd\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.502898 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.506479 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-etc-swift\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.507246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-config-data\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.517852 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-public-tls-certs\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " 
pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.520853 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-combined-ca-bundle\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.525118 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjk8k\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-kube-api-access-pjk8k\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.540947 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-internal-tls-certs\") pod \"swift-proxy-647b5b9577-j6wcs\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.600431 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gg42\" (UniqueName: \"kubernetes.io/projected/6aa745f2-e987-4c61-84bd-f40224e4a4a0-kube-api-access-7gg42\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.600887 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.600911 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.600944 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.637021 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.702897 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.702957 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.703018 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.703103 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gg42\" (UniqueName: \"kubernetes.io/projected/6aa745f2-e987-4c61-84bd-f40224e4a4a0-kube-api-access-7gg42\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.704330 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.714363 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.714528 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.722680 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gg42\" (UniqueName: \"kubernetes.io/projected/6aa745f2-e987-4c61-84bd-f40224e4a4a0-kube-api-access-7gg42\") pod \"openstackclient\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " pod="openstack/openstackclient" Dec 05 16:38:13 crc kubenswrapper[4756]: I1205 16:38:13.933494 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 16:38:14 crc kubenswrapper[4756]: I1205 16:38:14.383644 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-647b5b9577-j6wcs"] Dec 05 16:38:14 crc kubenswrapper[4756]: I1205 16:38:14.426491 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 16:38:14 crc kubenswrapper[4756]: W1205 16:38:14.428630 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aa745f2_e987_4c61_84bd_f40224e4a4a0.slice/crio-41a72055d6a82e877c4c4a94e47fc5392f9cf9060cad8f0d559cd21415632141 WatchSource:0}: Error finding container 41a72055d6a82e877c4c4a94e47fc5392f9cf9060cad8f0d559cd21415632141: Status 404 returned error can't find the container with id 41a72055d6a82e877c4c4a94e47fc5392f9cf9060cad8f0d559cd21415632141 Dec 05 16:38:14 crc kubenswrapper[4756]: I1205 16:38:14.822922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-647b5b9577-j6wcs" event={"ID":"541f7de0-3773-4e28-9881-b552367f1c89","Type":"ContainerStarted","Data":"3745ded4e39a2011f659d17a30eb1ac6e00e0cafa518d9a8e7f02ca39ea34a5d"} Dec 05 16:38:14 crc kubenswrapper[4756]: I1205 16:38:14.823899 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6aa745f2-e987-4c61-84bd-f40224e4a4a0","Type":"ContainerStarted","Data":"41a72055d6a82e877c4c4a94e47fc5392f9cf9060cad8f0d559cd21415632141"} Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.185334 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.185878 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="ceilometer-central-agent" containerID="cri-o://0613a7ab5a963b46b6c293bb87644a995d68eed3117fa15ec6230d11c8e9c821" gracePeriod=30 Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.185950 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="sg-core" containerID="cri-o://2924cfbec2bcd7342ee1c2869c60f831538193dcd182c8f17f207d9200a1fc4f" gracePeriod=30 Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.185983 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="proxy-httpd" containerID="cri-o://ce4700bb9c9d51e56f5b62af753192e92110be5ca1e9970ef1f21dbbdd1f826c" gracePeriod=30 Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.185984 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="ceilometer-notification-agent" containerID="cri-o://278c6bf1ce5c05a26a66ac4857f514a3ecfce374d4e7e0ce0f0f871d34afdb5e" gracePeriod=30 Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.211563 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.161:3000/\": EOF" Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.835551 4756 generic.go:334] "Generic (PLEG): container finished" podID="e22ff075-9534-4b98-9e00-04672da4fa74" 
containerID="ce4700bb9c9d51e56f5b62af753192e92110be5ca1e9970ef1f21dbbdd1f826c" exitCode=0 Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.835848 4756 generic.go:334] "Generic (PLEG): container finished" podID="e22ff075-9534-4b98-9e00-04672da4fa74" containerID="2924cfbec2bcd7342ee1c2869c60f831538193dcd182c8f17f207d9200a1fc4f" exitCode=2 Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.835647 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerDied","Data":"ce4700bb9c9d51e56f5b62af753192e92110be5ca1e9970ef1f21dbbdd1f826c"} Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.835898 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerDied","Data":"2924cfbec2bcd7342ee1c2869c60f831538193dcd182c8f17f207d9200a1fc4f"} Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.841480 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-647b5b9577-j6wcs" event={"ID":"541f7de0-3773-4e28-9881-b552367f1c89","Type":"ContainerStarted","Data":"40fa47072fd0322fd7094036691019fba18613d3d6531f5dd1a0acf458462514"} Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.841547 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-647b5b9577-j6wcs" event={"ID":"541f7de0-3773-4e28-9881-b552367f1c89","Type":"ContainerStarted","Data":"1b4671c01e0fecf9173531aafb787e57c3c0f19f93680f966bff65b5e243cf8c"} Dec 05 16:38:15 crc kubenswrapper[4756]: I1205 16:38:15.841656 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:16 crc kubenswrapper[4756]: I1205 16:38:16.401611 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 16:38:16 crc kubenswrapper[4756]: I1205 16:38:16.433806 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-647b5b9577-j6wcs" podStartSLOduration=3.433788667 podStartE2EDuration="3.433788667s" podCreationTimestamp="2025-12-05 16:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:38:15.864434422 +0000 UTC m=+1356.103107954" watchObservedRunningTime="2025-12-05 16:38:16.433788667 +0000 UTC m=+1356.672462179" Dec 05 16:38:16 crc kubenswrapper[4756]: I1205 16:38:16.854440 4756 generic.go:334] "Generic (PLEG): container finished" podID="e22ff075-9534-4b98-9e00-04672da4fa74" containerID="0613a7ab5a963b46b6c293bb87644a995d68eed3117fa15ec6230d11c8e9c821" exitCode=0 Dec 05 16:38:16 crc kubenswrapper[4756]: I1205 16:38:16.854520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerDied","Data":"0613a7ab5a963b46b6c293bb87644a995d68eed3117fa15ec6230d11c8e9c821"} Dec 05 16:38:16 crc kubenswrapper[4756]: I1205 16:38:16.855463 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:20 crc kubenswrapper[4756]: I1205 16:38:20.894705 4756 generic.go:334] "Generic (PLEG): container finished" podID="e22ff075-9534-4b98-9e00-04672da4fa74" containerID="278c6bf1ce5c05a26a66ac4857f514a3ecfce374d4e7e0ce0f0f871d34afdb5e" exitCode=0 Dec 05 16:38:20 crc kubenswrapper[4756]: I1205 16:38:20.894782 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerDied","Data":"278c6bf1ce5c05a26a66ac4857f514a3ecfce374d4e7e0ce0f0f871d34afdb5e"} Dec 05 16:38:23 crc kubenswrapper[4756]: I1205 16:38:23.610611 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84874cff96-vj9jp" Dec 05 16:38:23 crc kubenswrapper[4756]: I1205 16:38:23.663840 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:23 crc kubenswrapper[4756]: I1205 16:38:23.670408 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:38:23 crc kubenswrapper[4756]: I1205 16:38:23.950845 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e22ff075-9534-4b98-9e00-04672da4fa74","Type":"ContainerDied","Data":"e57cae3fd680c166873c24b8843123064d86a2276e2b5b9ac3369c5189072149"} Dec 05 16:38:23 crc kubenswrapper[4756]: I1205 16:38:23.951091 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e57cae3fd680c166873c24b8843123064d86a2276e2b5b9ac3369c5189072149" Dec 05 16:38:23 crc kubenswrapper[4756]: I1205 16:38:23.991163 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.050264 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-log-httpd\") pod \"e22ff075-9534-4b98-9e00-04672da4fa74\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.050723 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-run-httpd\") pod \"e22ff075-9534-4b98-9e00-04672da4fa74\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.050849 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-config-data\") pod \"e22ff075-9534-4b98-9e00-04672da4fa74\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.050898 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-sg-core-conf-yaml\") pod \"e22ff075-9534-4b98-9e00-04672da4fa74\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.050923 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-scripts\") pod \"e22ff075-9534-4b98-9e00-04672da4fa74\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.051040 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-combined-ca-bundle\") pod \"e22ff075-9534-4b98-9e00-04672da4fa74\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " Dec 05 
16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.051126 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfpd\" (UniqueName: \"kubernetes.io/projected/e22ff075-9534-4b98-9e00-04672da4fa74-kube-api-access-nwfpd\") pod \"e22ff075-9534-4b98-9e00-04672da4fa74\" (UID: \"e22ff075-9534-4b98-9e00-04672da4fa74\") " Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.051402 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e22ff075-9534-4b98-9e00-04672da4fa74" (UID: "e22ff075-9534-4b98-9e00-04672da4fa74"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.051967 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.052988 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e22ff075-9534-4b98-9e00-04672da4fa74" (UID: "e22ff075-9534-4b98-9e00-04672da4fa74"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.059625 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22ff075-9534-4b98-9e00-04672da4fa74-kube-api-access-nwfpd" (OuterVolumeSpecName: "kube-api-access-nwfpd") pod "e22ff075-9534-4b98-9e00-04672da4fa74" (UID: "e22ff075-9534-4b98-9e00-04672da4fa74"). InnerVolumeSpecName "kube-api-access-nwfpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.059703 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-scripts" (OuterVolumeSpecName: "scripts") pod "e22ff075-9534-4b98-9e00-04672da4fa74" (UID: "e22ff075-9534-4b98-9e00-04672da4fa74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.098115 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e22ff075-9534-4b98-9e00-04672da4fa74" (UID: "e22ff075-9534-4b98-9e00-04672da4fa74"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.154893 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ff075-9534-4b98-9e00-04672da4fa74-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.154969 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.154980 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.154989 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfpd\" (UniqueName: \"kubernetes.io/projected/e22ff075-9534-4b98-9e00-04672da4fa74-kube-api-access-nwfpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.167910 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-config-data" (OuterVolumeSpecName: "config-data") pod "e22ff075-9534-4b98-9e00-04672da4fa74" (UID: "e22ff075-9534-4b98-9e00-04672da4fa74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.172522 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e22ff075-9534-4b98-9e00-04672da4fa74" (UID: "e22ff075-9534-4b98-9e00-04672da4fa74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.256363 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.256403 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ff075-9534-4b98-9e00-04672da4fa74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.960704 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.960708 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6aa745f2-e987-4c61-84bd-f40224e4a4a0","Type":"ContainerStarted","Data":"7be737e28ed352a4c08f3c99de7ce6f9d006ea35ab29b304739115ae17238871"} Dec 05 16:38:24 crc kubenswrapper[4756]: I1205 16:38:24.981957 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.725127459 podStartE2EDuration="11.981939564s" podCreationTimestamp="2025-12-05 16:38:13 +0000 UTC" firstStartedPulling="2025-12-05 16:38:14.433730771 +0000 UTC m=+1354.672404283" lastFinishedPulling="2025-12-05 16:38:23.690542876 +0000 UTC m=+1363.929216388" observedRunningTime="2025-12-05 16:38:24.978725447 +0000 UTC m=+1365.217398959" watchObservedRunningTime="2025-12-05 16:38:24.981939564 +0000 UTC m=+1365.220613076" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.000814 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.009790 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.027766 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:25 crc kubenswrapper[4756]: E1205 16:38:25.028151 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="ceilometer-central-agent" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.028172 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="ceilometer-central-agent" Dec 05 16:38:25 crc kubenswrapper[4756]: E1205 16:38:25.028198 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="ceilometer-notification-agent" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.028205 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="ceilometer-notification-agent" Dec 05 16:38:25 crc kubenswrapper[4756]: E1205 16:38:25.028223 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="sg-core" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.028229 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="sg-core" Dec 05 16:38:25 crc kubenswrapper[4756]: E1205 16:38:25.028242 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="proxy-httpd" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.028249 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="proxy-httpd" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.028507 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="proxy-httpd" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.028537 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="ceilometer-notification-agent" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.028554 4756 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="ceilometer-central-agent" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.028574 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" containerName="sg-core" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.030130 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.034166 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.036369 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.040707 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.170198 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-log-httpd\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.170244 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-scripts\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.170270 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.170299 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghd2p\" (UniqueName: \"kubernetes.io/projected/174c1954-3038-4744-b65f-0b714cf6691b-kube-api-access-ghd2p\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.170382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-config-data\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.170402 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-run-httpd\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.170448 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 
16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.191925 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:25 crc kubenswrapper[4756]: E1205 16:38:25.192643 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-ghd2p log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="174c1954-3038-4744-b65f-0b714cf6691b" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.271683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.271925 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghd2p\" (UniqueName: \"kubernetes.io/projected/174c1954-3038-4744-b65f-0b714cf6691b-kube-api-access-ghd2p\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.272091 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-config-data\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.272171 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-run-httpd\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.272286 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.272407 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-log-httpd\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.272516 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-scripts\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.273752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-run-httpd\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.273832 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-log-httpd\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.278706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.280893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-scripts\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.281727 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-config-data\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.294206 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.295303 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghd2p\" (UniqueName: \"kubernetes.io/projected/174c1954-3038-4744-b65f-0b714cf6691b-kube-api-access-ghd2p\") pod \"ceilometer-0\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.969775 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:25 crc kubenswrapper[4756]: I1205 16:38:25.982868 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.061837 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22ff075-9534-4b98-9e00-04672da4fa74" path="/var/lib/kubelet/pods/e22ff075-9534-4b98-9e00-04672da4fa74/volumes" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.087616 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-combined-ca-bundle\") pod \"174c1954-3038-4744-b65f-0b714cf6691b\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.087677 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-scripts\") pod \"174c1954-3038-4744-b65f-0b714cf6691b\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.087849 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-sg-core-conf-yaml\") pod \"174c1954-3038-4744-b65f-0b714cf6691b\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.087947 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghd2p\" (UniqueName: \"kubernetes.io/projected/174c1954-3038-4744-b65f-0b714cf6691b-kube-api-access-ghd2p\") pod \"174c1954-3038-4744-b65f-0b714cf6691b\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.087999 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-log-httpd\") pod \"174c1954-3038-4744-b65f-0b714cf6691b\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.088065 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-config-data\") pod \"174c1954-3038-4744-b65f-0b714cf6691b\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.088103 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-run-httpd\") pod \"174c1954-3038-4744-b65f-0b714cf6691b\" (UID: \"174c1954-3038-4744-b65f-0b714cf6691b\") " Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.088696 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "174c1954-3038-4744-b65f-0b714cf6691b" (UID: "174c1954-3038-4744-b65f-0b714cf6691b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.088851 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "174c1954-3038-4744-b65f-0b714cf6691b" (UID: "174c1954-3038-4744-b65f-0b714cf6691b"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.092486 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-config-data" (OuterVolumeSpecName: "config-data") pod "174c1954-3038-4744-b65f-0b714cf6691b" (UID: "174c1954-3038-4744-b65f-0b714cf6691b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.092504 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-scripts" (OuterVolumeSpecName: "scripts") pod "174c1954-3038-4744-b65f-0b714cf6691b" (UID: "174c1954-3038-4744-b65f-0b714cf6691b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.092960 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "174c1954-3038-4744-b65f-0b714cf6691b" (UID: "174c1954-3038-4744-b65f-0b714cf6691b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.098608 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "174c1954-3038-4744-b65f-0b714cf6691b" (UID: "174c1954-3038-4744-b65f-0b714cf6691b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.098669 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174c1954-3038-4744-b65f-0b714cf6691b-kube-api-access-ghd2p" (OuterVolumeSpecName: "kube-api-access-ghd2p") pod "174c1954-3038-4744-b65f-0b714cf6691b" (UID: "174c1954-3038-4744-b65f-0b714cf6691b"). InnerVolumeSpecName "kube-api-access-ghd2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.191025 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghd2p\" (UniqueName: \"kubernetes.io/projected/174c1954-3038-4744-b65f-0b714cf6691b-kube-api-access-ghd2p\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.191063 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.191076 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.191087 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/174c1954-3038-4744-b65f-0b714cf6691b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.191099 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.191110 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.191121 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/174c1954-3038-4744-b65f-0b714cf6691b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:26 crc kubenswrapper[4756]: I1205 16:38:26.979202 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.036137 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.044770 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.064727 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.069114 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.072642 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.074508 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.097502 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.209428 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-scripts\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.209582 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.209611 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.209646 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-config-data\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.209720 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-log-httpd\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.209735 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-run-httpd\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.209778 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgv9l\" (UniqueName: \"kubernetes.io/projected/59c21716-507e-46b6-ad7e-8aa53968d9a6-kube-api-access-jgv9l\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.310887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-log-httpd\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.311114 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-run-httpd\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.311223 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgv9l\" (UniqueName: \"kubernetes.io/projected/59c21716-507e-46b6-ad7e-8aa53968d9a6-kube-api-access-jgv9l\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.311336 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-scripts\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.311492 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.311605 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.311685 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-config-data\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.313052 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-log-httpd\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.313067 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-run-httpd\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.316065 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.316198 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.317143 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-scripts\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.317159 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-config-data\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.328087 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgv9l\" (UniqueName: \"kubernetes.io/projected/59c21716-507e-46b6-ad7e-8aa53968d9a6-kube-api-access-jgv9l\") pod \"ceilometer-0\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.385564 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.877757 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:27 crc kubenswrapper[4756]: I1205 16:38:27.989271 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerStarted","Data":"1cab81d5ee9fe8ea619ccdc7439f64212043995132c05a739a25b1e7f0870292"} Dec 05 16:38:28 crc kubenswrapper[4756]: I1205 16:38:28.070679 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="174c1954-3038-4744-b65f-0b714cf6691b" path="/var/lib/kubelet/pods/174c1954-3038-4744-b65f-0b714cf6691b/volumes" Dec 05 16:38:28 crc kubenswrapper[4756]: I1205 16:38:28.174578 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:38:28 crc kubenswrapper[4756]: I1205 16:38:28.251986 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84874cff96-vj9jp"] Dec 05 16:38:28 crc kubenswrapper[4756]: I1205 16:38:28.252200 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84874cff96-vj9jp" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerName="neutron-api" containerID="cri-o://d13d9a27b35d2845a09a8db034bdad1c6e20fddb4e0686170007fa4fe3692129" gracePeriod=30 Dec 05 16:38:28 crc kubenswrapper[4756]: I1205 16:38:28.254227 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84874cff96-vj9jp" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerName="neutron-httpd" containerID="cri-o://a0da29df90f63367bef1b3be43b14be690f24459c837cdc40e920dde2d3e744c" gracePeriod=30 Dec 05 16:38:29 crc kubenswrapper[4756]: I1205 16:38:29.000157 4756 generic.go:334] "Generic (PLEG): container finished" podID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerID="a0da29df90f63367bef1b3be43b14be690f24459c837cdc40e920dde2d3e744c" exitCode=0 Dec 05 16:38:29 crc kubenswrapper[4756]: I1205 16:38:29.000221 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84874cff96-vj9jp" event={"ID":"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a","Type":"ContainerDied","Data":"a0da29df90f63367bef1b3be43b14be690f24459c837cdc40e920dde2d3e744c"} Dec 05 16:38:29 crc kubenswrapper[4756]: I1205 16:38:29.001709 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerStarted","Data":"dcf9655a0ecf26d837db083e173823cb2b45d03a9e4b77039b4255ba071b5808"} Dec 05 16:38:30 crc kubenswrapper[4756]: I1205 16:38:30.012719 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerStarted","Data":"ca4eb31ee2f57553aac6d57afc83aa7cc485b7391c618d680f0bf310a49e95e5"} Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.030998 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerStarted","Data":"0739127c4b5a48cc0fea53e873c582ad740019f226779db51ead18e14872f2bf"} Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.444455 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5jhw5"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.445874 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.455001 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5jhw5"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.527845 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986034bd-a776-4242-b772-b0db284f6d4f-operator-scripts\") pod \"nova-api-db-create-5jhw5\" (UID: \"986034bd-a776-4242-b772-b0db284f6d4f\") " pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.527899 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqsw4\" (UniqueName: \"kubernetes.io/projected/986034bd-a776-4242-b772-b0db284f6d4f-kube-api-access-kqsw4\") pod \"nova-api-db-create-5jhw5\" (UID: \"986034bd-a776-4242-b772-b0db284f6d4f\") " pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.542006 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hjw2d"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.543469 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.561278 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hjw2d"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.629788 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e154b5-e680-4c49-b449-62b100aac6af-operator-scripts\") pod \"nova-cell0-db-create-hjw2d\" (UID: \"67e154b5-e680-4c49-b449-62b100aac6af\") " pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.629838 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986034bd-a776-4242-b772-b0db284f6d4f-operator-scripts\") pod \"nova-api-db-create-5jhw5\" (UID: \"986034bd-a776-4242-b772-b0db284f6d4f\") " pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.629864 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqsw4\" (UniqueName: \"kubernetes.io/projected/986034bd-a776-4242-b772-b0db284f6d4f-kube-api-access-kqsw4\") pod \"nova-api-db-create-5jhw5\" (UID: \"986034bd-a776-4242-b772-b0db284f6d4f\") " pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.629882 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vds2l\" (UniqueName: \"kubernetes.io/projected/67e154b5-e680-4c49-b449-62b100aac6af-kube-api-access-vds2l\") pod \"nova-cell0-db-create-hjw2d\" (UID: \"67e154b5-e680-4c49-b449-62b100aac6af\") " pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.630592 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986034bd-a776-4242-b772-b0db284f6d4f-operator-scripts\") pod \"nova-api-db-create-5jhw5\" (UID: \"986034bd-a776-4242-b772-b0db284f6d4f\") " pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.654395 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-58bc-account-create-update-r7d29"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.655571 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.659647 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqsw4\" (UniqueName: \"kubernetes.io/projected/986034bd-a776-4242-b772-b0db284f6d4f-kube-api-access-kqsw4\") pod \"nova-api-db-create-5jhw5\" (UID: \"986034bd-a776-4242-b772-b0db284f6d4f\") " pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.662640 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.664195 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-58bc-account-create-update-r7d29"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.732398 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5fs\" (UniqueName: \"kubernetes.io/projected/e884c151-d5ce-44a5-819d-9ee0c4328cbe-kube-api-access-7f5fs\") pod \"nova-api-58bc-account-create-update-r7d29\" (UID: \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\") " pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.732522 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e154b5-e680-4c49-b449-62b100aac6af-operator-scripts\") pod \"nova-cell0-db-create-hjw2d\" (UID: \"67e154b5-e680-4c49-b449-62b100aac6af\") " pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.732572 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vds2l\" (UniqueName: \"kubernetes.io/projected/67e154b5-e680-4c49-b449-62b100aac6af-kube-api-access-vds2l\") pod \"nova-cell0-db-create-hjw2d\" (UID: \"67e154b5-e680-4c49-b449-62b100aac6af\") " pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.732631 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e884c151-d5ce-44a5-819d-9ee0c4328cbe-operator-scripts\") pod \"nova-api-58bc-account-create-update-r7d29\" (UID: \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\") " pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.733543 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e154b5-e680-4c49-b449-62b100aac6af-operator-scripts\") pod \"nova-cell0-db-create-hjw2d\" (UID: \"67e154b5-e680-4c49-b449-62b100aac6af\") " pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.752307 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bsbtj"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.753436 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.766871 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bsbtj"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.773125 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vds2l\" (UniqueName: \"kubernetes.io/projected/67e154b5-e680-4c49-b449-62b100aac6af-kube-api-access-vds2l\") pod \"nova-cell0-db-create-hjw2d\" (UID: \"67e154b5-e680-4c49-b449-62b100aac6af\") " pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.780005 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.834613 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e884c151-d5ce-44a5-819d-9ee0c4328cbe-operator-scripts\") pod \"nova-api-58bc-account-create-update-r7d29\" (UID: \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\") " pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.834730 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-operator-scripts\") pod \"nova-cell1-db-create-bsbtj\" (UID: \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\") " pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.834821 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn9tq\" (UniqueName: \"kubernetes.io/projected/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-kube-api-access-tn9tq\") pod \"nova-cell1-db-create-bsbtj\" (UID: \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\") " pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.834865 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f5fs\" (UniqueName: \"kubernetes.io/projected/e884c151-d5ce-44a5-819d-9ee0c4328cbe-kube-api-access-7f5fs\") pod \"nova-api-58bc-account-create-update-r7d29\" (UID: \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\") " pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.836199 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e884c151-d5ce-44a5-819d-9ee0c4328cbe-operator-scripts\") pod \"nova-api-58bc-account-create-update-r7d29\" (UID: \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\") " pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.856546 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-bc19-account-create-update-rxqln"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.859965 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.861284 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.863794 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.868187 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f5fs\" (UniqueName: \"kubernetes.io/projected/e884c151-d5ce-44a5-819d-9ee0c4328cbe-kube-api-access-7f5fs\") pod \"nova-api-58bc-account-create-update-r7d29\" (UID: \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\") " pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.869616 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bc19-account-create-update-rxqln"] Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.936058 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbzrm\" (UniqueName: \"kubernetes.io/projected/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-kube-api-access-qbzrm\") pod \"nova-cell0-bc19-account-create-update-rxqln\" (UID: \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\") " pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.936160 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-operator-scripts\") pod \"nova-cell1-db-create-bsbtj\" (UID: \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\") " pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.936238 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn9tq\" (UniqueName: \"kubernetes.io/projected/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-kube-api-access-tn9tq\") pod \"nova-cell1-db-create-bsbtj\" (UID: \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\") " pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.936270 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-operator-scripts\") pod \"nova-cell0-bc19-account-create-update-rxqln\" (UID: \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\") " pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.937695 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-operator-scripts\") pod \"nova-cell1-db-create-bsbtj\" (UID: \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\") " pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:32 crc kubenswrapper[4756]: I1205 16:38:32.957652 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn9tq\" (UniqueName: \"kubernetes.io/projected/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-kube-api-access-tn9tq\") pod \"nova-cell1-db-create-bsbtj\" (UID: \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\") " pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.030123 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.038026 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-operator-scripts\") pod \"nova-cell0-bc19-account-create-update-rxqln\" (UID: \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\") " pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.038181 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbzrm\" (UniqueName: \"kubernetes.io/projected/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-kube-api-access-qbzrm\") pod \"nova-cell0-bc19-account-create-update-rxqln\" (UID: \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\") " pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.039351 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-operator-scripts\") pod \"nova-cell0-bc19-account-create-update-rxqln\" (UID: \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\") " pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.051823 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6b06-account-create-update-tjk9b"] Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.052916 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.056223 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.057260 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerStarted","Data":"f23aaf5b558f23193ee013236b2ffa73ec172d4225e0d322d8cb4eba4a54b080"} Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.057613 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.057906 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbzrm\" (UniqueName: \"kubernetes.io/projected/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-kube-api-access-qbzrm\") pod \"nova-cell0-bc19-account-create-update-rxqln\" (UID: \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\") " pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.072604 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.116353 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6b06-account-create-update-tjk9b"] Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.141711 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8031226409999999 podStartE2EDuration="6.141692439s" podCreationTimestamp="2025-12-05 16:38:27 +0000 UTC" firstStartedPulling="2025-12-05 16:38:27.889669086 +0000 UTC m=+1368.128342598" lastFinishedPulling="2025-12-05 16:38:32.228238884 +0000 UTC m=+1372.466912396" observedRunningTime="2025-12-05 16:38:33.135182132 +0000 UTC m=+1373.373855654" watchObservedRunningTime="2025-12-05 16:38:33.141692439 +0000 UTC m=+1373.380365961" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.155545 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwlm5\" (UniqueName: \"kubernetes.io/projected/4e3345b9-82e1-469d-8db0-0d5a8163c369-kube-api-access-jwlm5\") pod \"nova-cell1-6b06-account-create-update-tjk9b\" (UID: \"4e3345b9-82e1-469d-8db0-0d5a8163c369\") " pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.155598 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3345b9-82e1-469d-8db0-0d5a8163c369-operator-scripts\") pod \"nova-cell1-6b06-account-create-update-tjk9b\" (UID: \"4e3345b9-82e1-469d-8db0-0d5a8163c369\") " pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.183109 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.259471 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwlm5\" (UniqueName: \"kubernetes.io/projected/4e3345b9-82e1-469d-8db0-0d5a8163c369-kube-api-access-jwlm5\") pod \"nova-cell1-6b06-account-create-update-tjk9b\" (UID: \"4e3345b9-82e1-469d-8db0-0d5a8163c369\") " pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.259735 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3345b9-82e1-469d-8db0-0d5a8163c369-operator-scripts\") pod \"nova-cell1-6b06-account-create-update-tjk9b\" (UID: \"4e3345b9-82e1-469d-8db0-0d5a8163c369\") " pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.260367 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3345b9-82e1-469d-8db0-0d5a8163c369-operator-scripts\") pod \"nova-cell1-6b06-account-create-update-tjk9b\" (UID: \"4e3345b9-82e1-469d-8db0-0d5a8163c369\") " pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.291743 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwlm5\" (UniqueName: \"kubernetes.io/projected/4e3345b9-82e1-469d-8db0-0d5a8163c369-kube-api-access-jwlm5\") pod \"nova-cell1-6b06-account-create-update-tjk9b\" (UID: \"4e3345b9-82e1-469d-8db0-0d5a8163c369\") " pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.377098 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.433082 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5jhw5"] Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.549813 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hjw2d"] Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.747188 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bsbtj"] Dec 05 16:38:33 crc kubenswrapper[4756]: W1205 16:38:33.804016 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f76f909_d1f6_4f2d_a398_5791bb8a14cf.slice/crio-18fc9911a8c121b02971e9755cd2bdf76f4fd76a7bb22334ebab9b9e630436b8 WatchSource:0}: Error finding container 18fc9911a8c121b02971e9755cd2bdf76f4fd76a7bb22334ebab9b9e630436b8: Status 404 returned error can't find the container with id 18fc9911a8c121b02971e9755cd2bdf76f4fd76a7bb22334ebab9b9e630436b8 Dec 05 16:38:33 crc kubenswrapper[4756]: I1205 16:38:33.884030 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-58bc-account-create-update-r7d29"] Dec 05 16:38:33 crc kubenswrapper[4756]: W1205 16:38:33.896536 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode884c151_d5ce_44a5_819d_9ee0c4328cbe.slice/crio-46c1fd9005826c44d49be5ce43c7213203e72d45feaca6a8dc8e2ff6b637f8b6 WatchSource:0}: Error finding container 46c1fd9005826c44d49be5ce43c7213203e72d45feaca6a8dc8e2ff6b637f8b6: Status 404 returned error can't find the container with id 46c1fd9005826c44d49be5ce43c7213203e72d45feaca6a8dc8e2ff6b637f8b6 Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.014778 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bc19-account-create-update-rxqln"] Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.142113 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6b06-account-create-update-tjk9b"] Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.169077 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bsbtj" event={"ID":"6f76f909-d1f6-4f2d-a398-5791bb8a14cf","Type":"ContainerStarted","Data":"18fc9911a8c121b02971e9755cd2bdf76f4fd76a7bb22334ebab9b9e630436b8"} Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.172395 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-58bc-account-create-update-r7d29" event={"ID":"e884c151-d5ce-44a5-819d-9ee0c4328cbe","Type":"ContainerStarted","Data":"46c1fd9005826c44d49be5ce43c7213203e72d45feaca6a8dc8e2ff6b637f8b6"} Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.174207 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bc19-account-create-update-rxqln" event={"ID":"1702bfa9-c7bd-4e9b-996a-dbb838d454ac","Type":"ContainerStarted","Data":"af854cb8bf263195261bbd3b032eb84a6b5d6dfd33c53c48c73a89dca7fc5298"} Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.176995 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hjw2d" event={"ID":"67e154b5-e680-4c49-b449-62b100aac6af","Type":"ContainerStarted","Data":"9eb7c337bc0192f0bbdc6691c02794d365f0af44aebf7d0ed7d4e350deb56d23"} Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.177018 4756 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hjw2d" event={"ID":"67e154b5-e680-4c49-b449-62b100aac6af","Type":"ContainerStarted","Data":"870ec7bd7a1dc8e9a01b165ee4720048760eca3db18fa341f87ca60a7fffad43"} Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.185685 4756 generic.go:334] "Generic (PLEG): container finished" podID="986034bd-a776-4242-b772-b0db284f6d4f" containerID="d327f9403ce4568b62d72fe9980a0432ff8cd23fd8d7e8702c031cc920a379de" exitCode=0 Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.185762 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5jhw5" event={"ID":"986034bd-a776-4242-b772-b0db284f6d4f","Type":"ContainerDied","Data":"d327f9403ce4568b62d72fe9980a0432ff8cd23fd8d7e8702c031cc920a379de"} Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.185787 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5jhw5" event={"ID":"986034bd-a776-4242-b772-b0db284f6d4f","Type":"ContainerStarted","Data":"dc01071459bc18ba784addb43af0b68e23da963c1a73fa2ec571d0b9c8b52677"} Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.192920 4756 generic.go:334] "Generic (PLEG): container finished" podID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerID="d13d9a27b35d2845a09a8db034bdad1c6e20fddb4e0686170007fa4fe3692129" exitCode=0 Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.193238 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84874cff96-vj9jp" event={"ID":"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a","Type":"ContainerDied","Data":"d13d9a27b35d2845a09a8db034bdad1c6e20fddb4e0686170007fa4fe3692129"} Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.208744 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-58bc-account-create-update-r7d29" podStartSLOduration=2.208726096 podStartE2EDuration="2.208726096s" podCreationTimestamp="2025-12-05 16:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:38:34.201052408 +0000 UTC m=+1374.439725920" watchObservedRunningTime="2025-12-05 16:38:34.208726096 +0000 UTC m=+1374.447399618" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.373083 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84874cff96-vj9jp" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.444651 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.444927 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-log" containerID="cri-o://57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4" gracePeriod=30 Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.445462 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-httpd" containerID="cri-o://624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f" gracePeriod=30 Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.535536 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-combined-ca-bundle\") pod \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.535692 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-config\") pod \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.538653 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-ovndb-tls-certs\") pod \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.539529 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-httpd-config\") pod \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.539597 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd4tc\" (UniqueName: \"kubernetes.io/projected/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-kube-api-access-bd4tc\") pod \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\" (UID: \"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a\") " Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.547331 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-kube-api-access-bd4tc" (OuterVolumeSpecName: "kube-api-access-bd4tc") pod "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" (UID: "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a"). InnerVolumeSpecName "kube-api-access-bd4tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.550180 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" (UID: "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.616736 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-config" (OuterVolumeSpecName: "config") pod "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" (UID: "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.617184 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" (UID: "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.642541 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.642595 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd4tc\" (UniqueName: \"kubernetes.io/projected/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-kube-api-access-bd4tc\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.642611 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.642622 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.666652 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" (UID: "f55d3666-a13e-4d10-ba01-d13f5d0b1d7a"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:34 crc kubenswrapper[4756]: I1205 16:38:34.744617 4756 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.204993 4756 generic.go:334] "Generic (PLEG): container finished" podID="e884c151-d5ce-44a5-819d-9ee0c4328cbe" containerID="748d893aca593248b6065e871080d3dc2da1e7446554b7f0e8a379e4ea7892be" exitCode=0 Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.205041 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-58bc-account-create-update-r7d29" event={"ID":"e884c151-d5ce-44a5-819d-9ee0c4328cbe","Type":"ContainerDied","Data":"748d893aca593248b6065e871080d3dc2da1e7446554b7f0e8a379e4ea7892be"} Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.207156 4756 generic.go:334] "Generic (PLEG): container finished" podID="4e3345b9-82e1-469d-8db0-0d5a8163c369" containerID="e9cb97c846b6e2094cc7aac2cfe5bee6e90448725e33aa4f60ab7b3ad4926211" exitCode=0 Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.207206 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" event={"ID":"4e3345b9-82e1-469d-8db0-0d5a8163c369","Type":"ContainerDied","Data":"e9cb97c846b6e2094cc7aac2cfe5bee6e90448725e33aa4f60ab7b3ad4926211"} Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.207255 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" event={"ID":"4e3345b9-82e1-469d-8db0-0d5a8163c369","Type":"ContainerStarted","Data":"c387f299c1c83754279dc12b904f7bc08977d776a5ac7ff01a2010c7c7d121db"} Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.209410 4756 generic.go:334] "Generic (PLEG): container finished" podID="1702bfa9-c7bd-4e9b-996a-dbb838d454ac" containerID="a98732f8a9adea1e07b9136ead6f0a5c91c6e5e7c6bd79d6b9a321ff304a94b5" exitCode=0 Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.209484 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bc19-account-create-update-rxqln" event={"ID":"1702bfa9-c7bd-4e9b-996a-dbb838d454ac","Type":"ContainerDied","Data":"a98732f8a9adea1e07b9136ead6f0a5c91c6e5e7c6bd79d6b9a321ff304a94b5"} Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.211393 4756 generic.go:334] "Generic (PLEG): container finished" podID="67e154b5-e680-4c49-b449-62b100aac6af" containerID="9eb7c337bc0192f0bbdc6691c02794d365f0af44aebf7d0ed7d4e350deb56d23" exitCode=0 Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.211556 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hjw2d" event={"ID":"67e154b5-e680-4c49-b449-62b100aac6af","Type":"ContainerDied","Data":"9eb7c337bc0192f0bbdc6691c02794d365f0af44aebf7d0ed7d4e350deb56d23"} Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.220663 4756 generic.go:334] "Generic (PLEG): container finished" podID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerID="57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4" exitCode=143 Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.220766 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83","Type":"ContainerDied","Data":"57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4"} Dec 05 16:38:35 crc 
kubenswrapper[4756]: I1205 16:38:35.225468 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84874cff96-vj9jp" event={"ID":"f55d3666-a13e-4d10-ba01-d13f5d0b1d7a","Type":"ContainerDied","Data":"cc992d269d209eedc89a6f0ff4e26b6086ae6ba5c0f6c630a01d73c38ec639b2"} Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.225534 4756 scope.go:117] "RemoveContainer" containerID="a0da29df90f63367bef1b3be43b14be690f24459c837cdc40e920dde2d3e744c" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.225624 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84874cff96-vj9jp" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.239539 4756 generic.go:334] "Generic (PLEG): container finished" podID="6f76f909-d1f6-4f2d-a398-5791bb8a14cf" containerID="b1fdd4b0355b13fe70aff4ac0136e9ef9e527331dc56f1dbae9b13dfe8b9c5f1" exitCode=0 Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.239822 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bsbtj" event={"ID":"6f76f909-d1f6-4f2d-a398-5791bb8a14cf","Type":"ContainerDied","Data":"b1fdd4b0355b13fe70aff4ac0136e9ef9e527331dc56f1dbae9b13dfe8b9c5f1"} Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.264452 4756 scope.go:117] "RemoveContainer" containerID="d13d9a27b35d2845a09a8db034bdad1c6e20fddb4e0686170007fa4fe3692129" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.306325 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84874cff96-vj9jp"] Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.314234 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84874cff96-vj9jp"] Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.682137 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.703628 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.761862 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vds2l\" (UniqueName: \"kubernetes.io/projected/67e154b5-e680-4c49-b449-62b100aac6af-kube-api-access-vds2l\") pod \"67e154b5-e680-4c49-b449-62b100aac6af\" (UID: \"67e154b5-e680-4c49-b449-62b100aac6af\") " Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.761952 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e154b5-e680-4c49-b449-62b100aac6af-operator-scripts\") pod \"67e154b5-e680-4c49-b449-62b100aac6af\" (UID: \"67e154b5-e680-4c49-b449-62b100aac6af\") " Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.762830 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e154b5-e680-4c49-b449-62b100aac6af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67e154b5-e680-4c49-b449-62b100aac6af" (UID: "67e154b5-e680-4c49-b449-62b100aac6af"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.785629 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e154b5-e680-4c49-b449-62b100aac6af-kube-api-access-vds2l" (OuterVolumeSpecName: "kube-api-access-vds2l") pod "67e154b5-e680-4c49-b449-62b100aac6af" (UID: "67e154b5-e680-4c49-b449-62b100aac6af"). InnerVolumeSpecName "kube-api-access-vds2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.863627 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqsw4\" (UniqueName: \"kubernetes.io/projected/986034bd-a776-4242-b772-b0db284f6d4f-kube-api-access-kqsw4\") pod \"986034bd-a776-4242-b772-b0db284f6d4f\" (UID: \"986034bd-a776-4242-b772-b0db284f6d4f\") " Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.863803 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986034bd-a776-4242-b772-b0db284f6d4f-operator-scripts\") pod \"986034bd-a776-4242-b772-b0db284f6d4f\" (UID: \"986034bd-a776-4242-b772-b0db284f6d4f\") " Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.864278 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vds2l\" (UniqueName: \"kubernetes.io/projected/67e154b5-e680-4c49-b449-62b100aac6af-kube-api-access-vds2l\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.864297 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e154b5-e680-4c49-b449-62b100aac6af-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.864264 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/986034bd-a776-4242-b772-b0db284f6d4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "986034bd-a776-4242-b772-b0db284f6d4f" (UID: "986034bd-a776-4242-b772-b0db284f6d4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.876697 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986034bd-a776-4242-b772-b0db284f6d4f-kube-api-access-kqsw4" (OuterVolumeSpecName: "kube-api-access-kqsw4") pod "986034bd-a776-4242-b772-b0db284f6d4f" (UID: "986034bd-a776-4242-b772-b0db284f6d4f"). InnerVolumeSpecName "kube-api-access-kqsw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.966249 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986034bd-a776-4242-b772-b0db284f6d4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:35 crc kubenswrapper[4756]: I1205 16:38:35.966296 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqsw4\" (UniqueName: \"kubernetes.io/projected/986034bd-a776-4242-b772-b0db284f6d4f-kube-api-access-kqsw4\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.061018 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" path="/var/lib/kubelet/pods/f55d3666-a13e-4d10-ba01-d13f5d0b1d7a/volumes" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.251599 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hjw2d" event={"ID":"67e154b5-e680-4c49-b449-62b100aac6af","Type":"ContainerDied","Data":"870ec7bd7a1dc8e9a01b165ee4720048760eca3db18fa341f87ca60a7fffad43"} Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.251638 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="870ec7bd7a1dc8e9a01b165ee4720048760eca3db18fa341f87ca60a7fffad43" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.251689 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hjw2d" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.254485 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5jhw5" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.254859 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5jhw5" event={"ID":"986034bd-a776-4242-b772-b0db284f6d4f","Type":"ContainerDied","Data":"dc01071459bc18ba784addb43af0b68e23da963c1a73fa2ec571d0b9c8b52677"} Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.254885 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc01071459bc18ba784addb43af0b68e23da963c1a73fa2ec571d0b9c8b52677" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.525402 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.681018 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn9tq\" (UniqueName: \"kubernetes.io/projected/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-kube-api-access-tn9tq\") pod \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\" (UID: \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\") " Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.681176 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-operator-scripts\") pod \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\" (UID: \"6f76f909-d1f6-4f2d-a398-5791bb8a14cf\") " Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.681688 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f76f909-d1f6-4f2d-a398-5791bb8a14cf" (UID: "6f76f909-d1f6-4f2d-a398-5791bb8a14cf"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.681872 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.684437 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.684656 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerName="glance-log" containerID="cri-o://9f05ebf9f76d9de462b8e6c8165c8a59b53a64d17ed04decde1d2699deefc1d9" gracePeriod=30 Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.685052 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerName="glance-httpd" containerID="cri-o://0a3f2fad23369eef6abdb198f5d8d248f85735284965758d4037155e9fa9a2ea" gracePeriod=30 Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.687608 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-kube-api-access-tn9tq" (OuterVolumeSpecName: "kube-api-access-tn9tq") pod "6f76f909-d1f6-4f2d-a398-5791bb8a14cf" (UID: "6f76f909-d1f6-4f2d-a398-5791bb8a14cf"). InnerVolumeSpecName "kube-api-access-tn9tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:36 crc kubenswrapper[4756]: I1205 16:38:36.783778 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn9tq\" (UniqueName: \"kubernetes.io/projected/6f76f909-d1f6-4f2d-a398-5791bb8a14cf-kube-api-access-tn9tq\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.031999 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.043545 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.045844 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.197997 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-operator-scripts\") pod \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\" (UID: \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\") " Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.198060 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwlm5\" (UniqueName: \"kubernetes.io/projected/4e3345b9-82e1-469d-8db0-0d5a8163c369-kube-api-access-jwlm5\") pod \"4e3345b9-82e1-469d-8db0-0d5a8163c369\" (UID: \"4e3345b9-82e1-469d-8db0-0d5a8163c369\") " Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.198096 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3345b9-82e1-469d-8db0-0d5a8163c369-operator-scripts\") pod \"4e3345b9-82e1-469d-8db0-0d5a8163c369\" (UID: \"4e3345b9-82e1-469d-8db0-0d5a8163c369\") " Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.198115 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e884c151-d5ce-44a5-819d-9ee0c4328cbe-operator-scripts\") pod \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\" (UID: \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\") " Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.198297 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbzrm\" (UniqueName: \"kubernetes.io/projected/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-kube-api-access-qbzrm\") pod \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\" (UID: \"1702bfa9-c7bd-4e9b-996a-dbb838d454ac\") " Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.198340 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f5fs\" (UniqueName: \"kubernetes.io/projected/e884c151-d5ce-44a5-819d-9ee0c4328cbe-kube-api-access-7f5fs\") pod \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\" (UID: \"e884c151-d5ce-44a5-819d-9ee0c4328cbe\") " Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.198699 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1702bfa9-c7bd-4e9b-996a-dbb838d454ac" (UID: "1702bfa9-c7bd-4e9b-996a-dbb838d454ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.199034 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e884c151-d5ce-44a5-819d-9ee0c4328cbe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e884c151-d5ce-44a5-819d-9ee0c4328cbe" (UID: "e884c151-d5ce-44a5-819d-9ee0c4328cbe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.199092 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e3345b9-82e1-469d-8db0-0d5a8163c369-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e3345b9-82e1-469d-8db0-0d5a8163c369" (UID: "4e3345b9-82e1-469d-8db0-0d5a8163c369"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.199806 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.199830 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e884c151-d5ce-44a5-819d-9ee0c4328cbe-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.199842 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3345b9-82e1-469d-8db0-0d5a8163c369-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.205822 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e3345b9-82e1-469d-8db0-0d5a8163c369-kube-api-access-jwlm5" (OuterVolumeSpecName: "kube-api-access-jwlm5") pod "4e3345b9-82e1-469d-8db0-0d5a8163c369" (UID: "4e3345b9-82e1-469d-8db0-0d5a8163c369"). InnerVolumeSpecName "kube-api-access-jwlm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.205892 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-kube-api-access-qbzrm" (OuterVolumeSpecName: "kube-api-access-qbzrm") pod "1702bfa9-c7bd-4e9b-996a-dbb838d454ac" (UID: "1702bfa9-c7bd-4e9b-996a-dbb838d454ac"). InnerVolumeSpecName "kube-api-access-qbzrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.205950 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e884c151-d5ce-44a5-819d-9ee0c4328cbe-kube-api-access-7f5fs" (OuterVolumeSpecName: "kube-api-access-7f5fs") pod "e884c151-d5ce-44a5-819d-9ee0c4328cbe" (UID: "e884c151-d5ce-44a5-819d-9ee0c4328cbe"). InnerVolumeSpecName "kube-api-access-7f5fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.262405 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bsbtj" event={"ID":"6f76f909-d1f6-4f2d-a398-5791bb8a14cf","Type":"ContainerDied","Data":"18fc9911a8c121b02971e9755cd2bdf76f4fd76a7bb22334ebab9b9e630436b8"} Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.262444 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18fc9911a8c121b02971e9755cd2bdf76f4fd76a7bb22334ebab9b9e630436b8" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.262507 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bsbtj" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.270352 4756 generic.go:334] "Generic (PLEG): container finished" podID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerID="9f05ebf9f76d9de462b8e6c8165c8a59b53a64d17ed04decde1d2699deefc1d9" exitCode=143 Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.270422 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d20d841d-b516-42f8-9d69-bee6cab38d04","Type":"ContainerDied","Data":"9f05ebf9f76d9de462b8e6c8165c8a59b53a64d17ed04decde1d2699deefc1d9"} Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.273586 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-58bc-account-create-update-r7d29" event={"ID":"e884c151-d5ce-44a5-819d-9ee0c4328cbe","Type":"ContainerDied","Data":"46c1fd9005826c44d49be5ce43c7213203e72d45feaca6a8dc8e2ff6b637f8b6"} Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.273615 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c1fd9005826c44d49be5ce43c7213203e72d45feaca6a8dc8e2ff6b637f8b6" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.273637 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-58bc-account-create-update-r7d29" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.275731 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" event={"ID":"4e3345b9-82e1-469d-8db0-0d5a8163c369","Type":"ContainerDied","Data":"c387f299c1c83754279dc12b904f7bc08977d776a5ac7ff01a2010c7c7d121db"} Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.275753 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c387f299c1c83754279dc12b904f7bc08977d776a5ac7ff01a2010c7c7d121db" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.275797 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6b06-account-create-update-tjk9b" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.282083 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bc19-account-create-update-rxqln" event={"ID":"1702bfa9-c7bd-4e9b-996a-dbb838d454ac","Type":"ContainerDied","Data":"af854cb8bf263195261bbd3b032eb84a6b5d6dfd33c53c48c73a89dca7fc5298"} Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.282121 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af854cb8bf263195261bbd3b032eb84a6b5d6dfd33c53c48c73a89dca7fc5298" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.282176 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-bc19-account-create-update-rxqln" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.301537 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbzrm\" (UniqueName: \"kubernetes.io/projected/1702bfa9-c7bd-4e9b-996a-dbb838d454ac-kube-api-access-qbzrm\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.301559 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f5fs\" (UniqueName: \"kubernetes.io/projected/e884c151-d5ce-44a5-819d-9ee0c4328cbe-kube-api-access-7f5fs\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.301569 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwlm5\" (UniqueName: \"kubernetes.io/projected/4e3345b9-82e1-469d-8db0-0d5a8163c369-kube-api-access-jwlm5\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.611903 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:51434->10.217.0.149:9292: read: connection reset by peer" Dec 05 16:38:37 crc kubenswrapper[4756]: I1205 16:38:37.613674 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:51438->10.217.0.149:9292: read: connection reset by peer" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.151494 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.293395 4756 generic.go:334] "Generic (PLEG): container finished" podID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerID="624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f" exitCode=0 Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.293482 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83","Type":"ContainerDied","Data":"624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f"} Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.293510 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83","Type":"ContainerDied","Data":"6654c1e4f1ab92c605ef4191e6cb7fb61ce7e619e3c649fe103e9c2293dc9956"} Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.293525 4756 scope.go:117] "RemoveContainer" containerID="624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.293633 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.321607 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-logs\") pod \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.321993 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.322302 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-httpd-run\") pod \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.322361 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc6hr\" (UniqueName: \"kubernetes.io/projected/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-kube-api-access-lc6hr\") pod \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.322389 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-combined-ca-bundle\") pod \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.322998 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-logs" (OuterVolumeSpecName: "logs") pod "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" (UID: "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.323332 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" (UID: "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.328992 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-config-data\") pod \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.329063 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-scripts\") pod \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.329108 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-public-tls-certs\") pod \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\" (UID: \"e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83\") " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.329887 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.329915 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.337785 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-scripts" (OuterVolumeSpecName: "scripts") pod "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" (UID: "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.341397 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-kube-api-access-lc6hr" (OuterVolumeSpecName: "kube-api-access-lc6hr") pod "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" (UID: "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83"). InnerVolumeSpecName "kube-api-access-lc6hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.363438 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" (UID: "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.368312 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" (UID: "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.393820 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.394217 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="ceilometer-central-agent" containerID="cri-o://dcf9655a0ecf26d837db083e173823cb2b45d03a9e4b77039b4255ba071b5808" gracePeriod=30 Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.394706 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="proxy-httpd" containerID="cri-o://f23aaf5b558f23193ee013236b2ffa73ec172d4225e0d322d8cb4eba4a54b080" gracePeriod=30 Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.394769 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="sg-core" containerID="cri-o://0739127c4b5a48cc0fea53e873c582ad740019f226779db51ead18e14872f2bf" gracePeriod=30 Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.394809 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="ceilometer-notification-agent" containerID="cri-o://ca4eb31ee2f57553aac6d57afc83aa7cc485b7391c618d680f0bf310a49e95e5" gracePeriod=30 Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.433560 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc6hr\" (UniqueName: \"kubernetes.io/projected/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-kube-api-access-lc6hr\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.433590 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.433599 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.433626 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.452120 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" (UID: "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.454636 4756 scope.go:117] "RemoveContainer" containerID="57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.462286 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-config-data" (OuterVolumeSpecName: "config-data") pod "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" (UID: "e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.477421 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.479060 4756 scope.go:117] "RemoveContainer" containerID="624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.479879 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f\": container with ID starting with 624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f not found: ID does not exist" containerID="624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.479916 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f"} err="failed to get container status \"624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f\": rpc error: code = NotFound desc = could not find container \"624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f\": container with ID starting with 624c369df5ed61c6d384aab2a1be9626724b0b3a3157dc4f1752a10cd288684f not found: ID does not exist" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.479943 4756 scope.go:117] "RemoveContainer" containerID="57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.482383 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4\": container with ID starting with 57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4 not found: ID does not exist" containerID="57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.482439 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4"} err="failed to get container status \"57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4\": rpc error: code = NotFound desc = could not find container \"57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4\": container with ID starting with 57ab6a40cb3a904dcb8be55d30199cf169c467f5fb9e1537ab5a5413525e37a4 not found: ID does not exist" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.535018 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.535048 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.535057 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.632848 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.650477 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666198 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666700 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3345b9-82e1-469d-8db0-0d5a8163c369" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666725 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3345b9-82e1-469d-8db0-0d5a8163c369" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666750 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1702bfa9-c7bd-4e9b-996a-dbb838d454ac" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666758 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1702bfa9-c7bd-4e9b-996a-dbb838d454ac" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666771 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e154b5-e680-4c49-b449-62b100aac6af" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666779 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e154b5-e680-4c49-b449-62b100aac6af" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666792 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-log" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666800 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-log" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666817 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerName="neutron-api" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666825 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerName="neutron-api" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666843 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerName="neutron-httpd" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666850 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerName="neutron-httpd" Dec 05 
16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666864 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-httpd" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666872 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-httpd" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666883 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f76f909-d1f6-4f2d-a398-5791bb8a14cf" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666891 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f76f909-d1f6-4f2d-a398-5791bb8a14cf" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666906 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e884c151-d5ce-44a5-819d-9ee0c4328cbe" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666915 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e884c151-d5ce-44a5-819d-9ee0c4328cbe" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc kubenswrapper[4756]: E1205 16:38:38.666936 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986034bd-a776-4242-b772-b0db284f6d4f" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.666944 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="986034bd-a776-4242-b772-b0db284f6d4f" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667139 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="986034bd-a776-4242-b772-b0db284f6d4f" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667164 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e3345b9-82e1-469d-8db0-0d5a8163c369" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667176 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-log" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667196 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerName="neutron-api" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667210 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e154b5-e680-4c49-b449-62b100aac6af" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667224 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1702bfa9-c7bd-4e9b-996a-dbb838d454ac" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667238 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f76f909-d1f6-4f2d-a398-5791bb8a14cf" containerName="mariadb-database-create" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667246 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55d3666-a13e-4d10-ba01-d13f5d0b1d7a" containerName="neutron-httpd" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.667260 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e884c151-d5ce-44a5-819d-9ee0c4328cbe" containerName="mariadb-account-create-update" Dec 05 16:38:38 crc 
kubenswrapper[4756]: I1205 16:38:38.667274 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" containerName="glance-httpd" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.668376 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.671447 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.671621 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.701338 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.840307 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-config-data\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.840351 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.840389 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-scripts\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.840778 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.840857 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.840916 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt77c\" (UniqueName: \"kubernetes.io/projected/4beba48e-5ac4-4d41-864b-ecfd8b039988-kube-api-access-tt77c\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.841074 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.841149 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-logs\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942568 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942649 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-logs\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942696 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-config-data\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942728 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942762 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-scripts\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942863 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942892 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942925 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt77c\" (UniqueName: \"kubernetes.io/projected/4beba48e-5ac4-4d41-864b-ecfd8b039988-kube-api-access-tt77c\") pod \"glance-default-external-api-0\" (UID: 
\"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.942957 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.943996 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.944297 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-logs\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.947564 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-config-data\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.948204 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-scripts\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.948353 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.950980 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.960242 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt77c\" (UniqueName: \"kubernetes.io/projected/4beba48e-5ac4-4d41-864b-ecfd8b039988-kube-api-access-tt77c\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:38 crc kubenswrapper[4756]: I1205 16:38:38.988648 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " pod="openstack/glance-default-external-api-0" Dec 05 16:38:39 crc kubenswrapper[4756]: I1205 
16:38:38.998846 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 16:38:39 crc kubenswrapper[4756]: I1205 16:38:39.308960 4756 generic.go:334] "Generic (PLEG): container finished" podID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerID="f23aaf5b558f23193ee013236b2ffa73ec172d4225e0d322d8cb4eba4a54b080" exitCode=0 Dec 05 16:38:39 crc kubenswrapper[4756]: I1205 16:38:39.309241 4756 generic.go:334] "Generic (PLEG): container finished" podID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerID="0739127c4b5a48cc0fea53e873c582ad740019f226779db51ead18e14872f2bf" exitCode=2 Dec 05 16:38:39 crc kubenswrapper[4756]: I1205 16:38:39.309251 4756 generic.go:334] "Generic (PLEG): container finished" podID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerID="ca4eb31ee2f57553aac6d57afc83aa7cc485b7391c618d680f0bf310a49e95e5" exitCode=0 Dec 05 16:38:39 crc kubenswrapper[4756]: I1205 16:38:39.309294 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerDied","Data":"f23aaf5b558f23193ee013236b2ffa73ec172d4225e0d322d8cb4eba4a54b080"} Dec 05 16:38:39 crc kubenswrapper[4756]: I1205 16:38:39.309329 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerDied","Data":"0739127c4b5a48cc0fea53e873c582ad740019f226779db51ead18e14872f2bf"} Dec 05 16:38:39 crc kubenswrapper[4756]: I1205 16:38:39.309341 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerDied","Data":"ca4eb31ee2f57553aac6d57afc83aa7cc485b7391c618d680f0bf310a49e95e5"} Dec 05 16:38:39 crc kubenswrapper[4756]: I1205 16:38:39.569948 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 16:38:39 crc kubenswrapper[4756]: W1205 16:38:39.577663 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4beba48e_5ac4_4d41_864b_ecfd8b039988.slice/crio-297838fda664c6d36050dcab620e44ecebc97fe76faa2c79149e53d7073dd6ea WatchSource:0}: Error finding container 297838fda664c6d36050dcab620e44ecebc97fe76faa2c79149e53d7073dd6ea: Status 404 returned error can't find the container with id 297838fda664c6d36050dcab620e44ecebc97fe76faa2c79149e53d7073dd6ea Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.071495 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83" path="/var/lib/kubelet/pods/e6f2b24c-528f-4a3f-b3d0-d85d5b64fa83/volumes" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.379075 4756 generic.go:334] "Generic (PLEG): container finished" podID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerID="dcf9655a0ecf26d837db083e173823cb2b45d03a9e4b77039b4255ba071b5808" exitCode=0 Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.379478 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerDied","Data":"dcf9655a0ecf26d837db083e173823cb2b45d03a9e4b77039b4255ba071b5808"} Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.388899 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4beba48e-5ac4-4d41-864b-ecfd8b039988","Type":"ContainerStarted","Data":"ff50efe465d0847be81b8c502e77c5ea6e01d546de46512db17a70f058fa77b7"} Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.389277 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4beba48e-5ac4-4d41-864b-ecfd8b039988","Type":"ContainerStarted","Data":"297838fda664c6d36050dcab620e44ecebc97fe76faa2c79149e53d7073dd6ea"} Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.398683 4756 generic.go:334] "Generic (PLEG): container finished" podID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerID="0a3f2fad23369eef6abdb198f5d8d248f85735284965758d4037155e9fa9a2ea" exitCode=0 Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.398734 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d20d841d-b516-42f8-9d69-bee6cab38d04","Type":"ContainerDied","Data":"0a3f2fad23369eef6abdb198f5d8d248f85735284965758d4037155e9fa9a2ea"} Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.462209 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.581369 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-scripts\") pod \"59c21716-507e-46b6-ad7e-8aa53968d9a6\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.581506 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-run-httpd\") pod \"59c21716-507e-46b6-ad7e-8aa53968d9a6\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.581646 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-combined-ca-bundle\") pod \"59c21716-507e-46b6-ad7e-8aa53968d9a6\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.581710 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgv9l\" (UniqueName: \"kubernetes.io/projected/59c21716-507e-46b6-ad7e-8aa53968d9a6-kube-api-access-jgv9l\") pod \"59c21716-507e-46b6-ad7e-8aa53968d9a6\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.581754 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-config-data\") pod \"59c21716-507e-46b6-ad7e-8aa53968d9a6\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.581829 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-sg-core-conf-yaml\") pod \"59c21716-507e-46b6-ad7e-8aa53968d9a6\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.581888 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-log-httpd\") pod \"59c21716-507e-46b6-ad7e-8aa53968d9a6\" (UID: \"59c21716-507e-46b6-ad7e-8aa53968d9a6\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.582137 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "59c21716-507e-46b6-ad7e-8aa53968d9a6" (UID: "59c21716-507e-46b6-ad7e-8aa53968d9a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.584379 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.585094 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "59c21716-507e-46b6-ad7e-8aa53968d9a6" (UID: "59c21716-507e-46b6-ad7e-8aa53968d9a6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.591674 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-scripts" (OuterVolumeSpecName: "scripts") pod "59c21716-507e-46b6-ad7e-8aa53968d9a6" (UID: "59c21716-507e-46b6-ad7e-8aa53968d9a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.604005 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c21716-507e-46b6-ad7e-8aa53968d9a6-kube-api-access-jgv9l" (OuterVolumeSpecName: "kube-api-access-jgv9l") pod "59c21716-507e-46b6-ad7e-8aa53968d9a6" (UID: "59c21716-507e-46b6-ad7e-8aa53968d9a6"). InnerVolumeSpecName "kube-api-access-jgv9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.639578 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "59c21716-507e-46b6-ad7e-8aa53968d9a6" (UID: "59c21716-507e-46b6-ad7e-8aa53968d9a6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.687713 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgv9l\" (UniqueName: \"kubernetes.io/projected/59c21716-507e-46b6-ad7e-8aa53968d9a6-kube-api-access-jgv9l\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.687771 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.687786 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59c21716-507e-46b6-ad7e-8aa53968d9a6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.687799 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.729310 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59c21716-507e-46b6-ad7e-8aa53968d9a6" (UID: "59c21716-507e-46b6-ad7e-8aa53968d9a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.754382 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-config-data" (OuterVolumeSpecName: "config-data") pod "59c21716-507e-46b6-ad7e-8aa53968d9a6" (UID: "59c21716-507e-46b6-ad7e-8aa53968d9a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.790515 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.790556 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c21716-507e-46b6-ad7e-8aa53968d9a6-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.852026 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.993654 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-scripts\") pod \"d20d841d-b516-42f8-9d69-bee6cab38d04\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.994097 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"d20d841d-b516-42f8-9d69-bee6cab38d04\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.994212 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcvsp\" (UniqueName: \"kubernetes.io/projected/d20d841d-b516-42f8-9d69-bee6cab38d04-kube-api-access-mcvsp\") pod \"d20d841d-b516-42f8-9d69-bee6cab38d04\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.994250 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-combined-ca-bundle\") pod \"d20d841d-b516-42f8-9d69-bee6cab38d04\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.994311 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-httpd-run\") pod \"d20d841d-b516-42f8-9d69-bee6cab38d04\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.994337 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-logs\") pod \"d20d841d-b516-42f8-9d69-bee6cab38d04\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.994358 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-internal-tls-certs\") pod \"d20d841d-b516-42f8-9d69-bee6cab38d04\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.994377 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-config-data\") pod \"d20d841d-b516-42f8-9d69-bee6cab38d04\" (UID: \"d20d841d-b516-42f8-9d69-bee6cab38d04\") " Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.995840 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-logs" (OuterVolumeSpecName: "logs") pod "d20d841d-b516-42f8-9d69-bee6cab38d04" (UID: "d20d841d-b516-42f8-9d69-bee6cab38d04"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:40 crc kubenswrapper[4756]: I1205 16:38:40.996061 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d20d841d-b516-42f8-9d69-bee6cab38d04" (UID: "d20d841d-b516-42f8-9d69-bee6cab38d04"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.002907 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20d841d-b516-42f8-9d69-bee6cab38d04-kube-api-access-mcvsp" (OuterVolumeSpecName: "kube-api-access-mcvsp") pod "d20d841d-b516-42f8-9d69-bee6cab38d04" (UID: "d20d841d-b516-42f8-9d69-bee6cab38d04"). InnerVolumeSpecName "kube-api-access-mcvsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.003584 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "d20d841d-b516-42f8-9d69-bee6cab38d04" (UID: "d20d841d-b516-42f8-9d69-bee6cab38d04"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.005334 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-scripts" (OuterVolumeSpecName: "scripts") pod "d20d841d-b516-42f8-9d69-bee6cab38d04" (UID: "d20d841d-b516-42f8-9d69-bee6cab38d04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.054571 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d20d841d-b516-42f8-9d69-bee6cab38d04" (UID: "d20d841d-b516-42f8-9d69-bee6cab38d04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.070639 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-config-data" (OuterVolumeSpecName: "config-data") pod "d20d841d-b516-42f8-9d69-bee6cab38d04" (UID: "d20d841d-b516-42f8-9d69-bee6cab38d04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.096658 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.097666 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.097837 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d841d-b516-42f8-9d69-bee6cab38d04-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.100670 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.100700 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.100742 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.100760 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcvsp\" (UniqueName: \"kubernetes.io/projected/d20d841d-b516-42f8-9d69-bee6cab38d04-kube-api-access-mcvsp\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.100688 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d20d841d-b516-42f8-9d69-bee6cab38d04" (UID: "d20d841d-b516-42f8-9d69-bee6cab38d04"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.124857 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.202647 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20d841d-b516-42f8-9d69-bee6cab38d04-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.202701 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.411123 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4beba48e-5ac4-4d41-864b-ecfd8b039988","Type":"ContainerStarted","Data":"ee842d9afcdec61635c6c49c87079fe579d522f6aece80d8d0ea7d52afa84d53"} Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.416933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d20d841d-b516-42f8-9d69-bee6cab38d04","Type":"ContainerDied","Data":"d55ecbfd1516e2c5f14e5a447accc73007dfe4d73d0ba03239b133644c89115c"} Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.416973 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.416998 4756 scope.go:117] "RemoveContainer" containerID="0a3f2fad23369eef6abdb198f5d8d248f85735284965758d4037155e9fa9a2ea" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.424702 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59c21716-507e-46b6-ad7e-8aa53968d9a6","Type":"ContainerDied","Data":"1cab81d5ee9fe8ea619ccdc7439f64212043995132c05a739a25b1e7f0870292"} Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.424779 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.447190 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.447172253 podStartE2EDuration="3.447172253s" podCreationTimestamp="2025-12-05 16:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:38:41.444251854 +0000 UTC m=+1381.682925376" watchObservedRunningTime="2025-12-05 16:38:41.447172253 +0000 UTC m=+1381.685845765" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.462543 4756 scope.go:117] "RemoveContainer" containerID="9f05ebf9f76d9de462b8e6c8165c8a59b53a64d17ed04decde1d2699deefc1d9" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.472196 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.495022 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.499636 4756 scope.go:117] "RemoveContainer" containerID="f23aaf5b558f23193ee013236b2ffa73ec172d4225e0d322d8cb4eba4a54b080" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.510685 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.527731 4756 scope.go:117] "RemoveContainer" containerID="0739127c4b5a48cc0fea53e873c582ad740019f226779db51ead18e14872f2bf" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.529003 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.538411 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:41 crc kubenswrapper[4756]: E1205 16:38:41.538818 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="ceilometer-central-agent" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.538838 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="ceilometer-central-agent" Dec 05 16:38:41 crc kubenswrapper[4756]: E1205 16:38:41.538857 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="sg-core" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.538864 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="sg-core" Dec 05 16:38:41 crc kubenswrapper[4756]: E1205 16:38:41.538884 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="proxy-httpd" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.538891 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="proxy-httpd" Dec 05 16:38:41 crc kubenswrapper[4756]: E1205 16:38:41.538904 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="ceilometer-notification-agent" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.538911 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="ceilometer-notification-agent" Dec 05 
16:38:41 crc kubenswrapper[4756]: E1205 16:38:41.538921 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerName="glance-log" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.538927 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerName="glance-log" Dec 05 16:38:41 crc kubenswrapper[4756]: E1205 16:38:41.538939 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerName="glance-httpd" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.538945 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerName="glance-httpd" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.539099 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="ceilometer-notification-agent" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.539119 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerName="glance-log" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.539129 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="ceilometer-central-agent" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.539139 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" containerName="glance-httpd" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.539151 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="proxy-httpd" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.539159 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" containerName="sg-core" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.540923 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.544954 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.545190 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.548918 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.549764 4756 scope.go:117] "RemoveContainer" containerID="ca4eb31ee2f57553aac6d57afc83aa7cc485b7391c618d680f0bf310a49e95e5" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.550872 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.553946 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.554125 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.563527 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.589695 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610055 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610140 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-run-httpd\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610187 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6cc\" (UniqueName: \"kubernetes.io/projected/112a1a42-7019-477a-a6de-62bee8637d41-kube-api-access-8p6cc\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610210 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610236 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610332 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76sjv\" (UniqueName: \"kubernetes.io/projected/f664c2b4-e517-427a-bbdc-8fba6d548a21-kube-api-access-76sjv\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 
16:38:41.610409 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610598 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610701 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-logs\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610745 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-scripts\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610783 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610816 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-log-httpd\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610838 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-config-data\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.610874 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.623579 4756 scope.go:117] "RemoveContainer" containerID="dcf9655a0ecf26d837db083e173823cb2b45d03a9e4b77039b4255ba071b5808" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713175 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6cc\" (UniqueName: \"kubernetes.io/projected/112a1a42-7019-477a-a6de-62bee8637d41-kube-api-access-8p6cc\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " 
pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713228 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713254 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713331 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713378 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76sjv\" (UniqueName: \"kubernetes.io/projected/f664c2b4-e517-427a-bbdc-8fba6d548a21-kube-api-access-76sjv\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713408 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713462 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713490 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-logs\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713511 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-scripts\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713535 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713557 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-log-httpd\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713579 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-config-data\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713604 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713638 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.713685 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-run-httpd\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.714138 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-run-httpd\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.714227 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-log-httpd\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.714233 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-logs\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.714300 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.714526 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 
16:38:41.718916 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.725937 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.729176 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.730594 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-config-data\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.730769 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.731670 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.731685 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.737636 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6cc\" (UniqueName: \"kubernetes.io/projected/112a1a42-7019-477a-a6de-62bee8637d41-kube-api-access-8p6cc\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.743164 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-scripts\") pod \"ceilometer-0\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.744289 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76sjv\" (UniqueName: \"kubernetes.io/projected/f664c2b4-e517-427a-bbdc-8fba6d548a21-kube-api-access-76sjv\") pod \"glance-default-internal-api-0\" (UID: 
\"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.758059 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " pod="openstack/glance-default-internal-api-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.928068 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:41 crc kubenswrapper[4756]: I1205 16:38:41.933724 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:42 crc kubenswrapper[4756]: I1205 16:38:42.074380 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c21716-507e-46b6-ad7e-8aa53968d9a6" path="/var/lib/kubelet/pods/59c21716-507e-46b6-ad7e-8aa53968d9a6/volumes" Dec 05 16:38:42 crc kubenswrapper[4756]: I1205 16:38:42.075774 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d20d841d-b516-42f8-9d69-bee6cab38d04" path="/var/lib/kubelet/pods/d20d841d-b516-42f8-9d69-bee6cab38d04/volumes" Dec 05 16:38:42 crc kubenswrapper[4756]: W1205 16:38:42.472962 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod112a1a42_7019_477a_a6de_62bee8637d41.slice/crio-24ec49565106d282c3887e1e8ec1e4378941e9599949128c9db379d5bf563fa4 WatchSource:0}: Error finding container 24ec49565106d282c3887e1e8ec1e4378941e9599949128c9db379d5bf563fa4: Status 404 returned error can't find the container with id 24ec49565106d282c3887e1e8ec1e4378941e9599949128c9db379d5bf563fa4 Dec 05 16:38:42 crc kubenswrapper[4756]: I1205 16:38:42.475256 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 16:38:42 crc kubenswrapper[4756]: I1205 16:38:42.487230 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:42 crc kubenswrapper[4756]: I1205 16:38:42.714201 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:38:42 crc kubenswrapper[4756]: W1205 16:38:42.729176 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf664c2b4_e517_427a_bbdc_8fba6d548a21.slice/crio-67d89d49612dc132ef74f1a1238e776268aa657fe7cd1d9ef4982b04addfdd0a WatchSource:0}: Error finding container 67d89d49612dc132ef74f1a1238e776268aa657fe7cd1d9ef4982b04addfdd0a: Status 404 returned error can't find the container with id 67d89d49612dc132ef74f1a1238e776268aa657fe7cd1d9ef4982b04addfdd0a Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.378232 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrzn4"] Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.382053 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.394905 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.397155 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-d4fn5" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.397573 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.413120 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrzn4"] Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.448386 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerStarted","Data":"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c"} Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.448650 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerStarted","Data":"24ec49565106d282c3887e1e8ec1e4378941e9599949128c9db379d5bf563fa4"} Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.449256 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f664c2b4-e517-427a-bbdc-8fba6d548a21","Type":"ContainerStarted","Data":"67d89d49612dc132ef74f1a1238e776268aa657fe7cd1d9ef4982b04addfdd0a"} Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.552399 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-config-data\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.552866 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-scripts\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.553022 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.553177 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knbmn\" (UniqueName: \"kubernetes.io/projected/48ea8041-7117-4be6-9d19-d5d282c10364-kube-api-access-knbmn\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.655329 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-scripts\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.655386 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.655471 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knbmn\" (UniqueName: \"kubernetes.io/projected/48ea8041-7117-4be6-9d19-d5d282c10364-kube-api-access-knbmn\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.655538 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-config-data\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.661880 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-scripts\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.662081 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-config-data\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.664524 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.675250 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knbmn\" (UniqueName: \"kubernetes.io/projected/48ea8041-7117-4be6-9d19-d5d282c10364-kube-api-access-knbmn\") pod \"nova-cell0-conductor-db-sync-mrzn4\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:43 crc kubenswrapper[4756]: I1205 16:38:43.712111 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:38:44 crc kubenswrapper[4756]: I1205 16:38:44.287293 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrzn4"] Dec 05 16:38:44 crc kubenswrapper[4756]: I1205 16:38:44.460266 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" event={"ID":"48ea8041-7117-4be6-9d19-d5d282c10364","Type":"ContainerStarted","Data":"70bd63f385700dd6c77123a43d09c7ad68468917162fb55616365d86cf160913"} Dec 05 16:38:46 crc kubenswrapper[4756]: I1205 16:38:46.489750 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f664c2b4-e517-427a-bbdc-8fba6d548a21","Type":"ContainerStarted","Data":"25c6d91b70e3146a820e5c52b8f074d3f3bfefc49cb9b80d00c7a70aae08ff55"} Dec 05 16:38:46 crc kubenswrapper[4756]: I1205 16:38:46.490324 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f664c2b4-e517-427a-bbdc-8fba6d548a21","Type":"ContainerStarted","Data":"8e8b38530a288e65246e8e282b7de364102d9ead1db22ed0d2d5b5a3b7cad231"} Dec 05 16:38:46 crc kubenswrapper[4756]: I1205 16:38:46.492412 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerStarted","Data":"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a"} Dec 05 16:38:46 crc kubenswrapper[4756]: I1205 16:38:46.492464 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerStarted","Data":"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc"} Dec 05 16:38:48 crc kubenswrapper[4756]: I1205 16:38:48.514314 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerStarted","Data":"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade"} Dec 05 16:38:48 crc kubenswrapper[4756]: I1205 16:38:48.515508 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 16:38:48 crc kubenswrapper[4756]: I1205 16:38:48.542552 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.459466349 podStartE2EDuration="7.542518402s" podCreationTimestamp="2025-12-05 16:38:41 +0000 UTC" firstStartedPulling="2025-12-05 16:38:42.475008519 +0000 UTC m=+1382.713682031" lastFinishedPulling="2025-12-05 16:38:47.558060572 +0000 UTC m=+1387.796734084" observedRunningTime="2025-12-05 16:38:48.537020173 +0000 UTC m=+1388.775693695" watchObservedRunningTime="2025-12-05 16:38:48.542518402 +0000 UTC m=+1388.781191914" Dec 05 16:38:48 crc kubenswrapper[4756]: I1205 16:38:48.546792 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.546769837 podStartE2EDuration="7.546769837s" podCreationTimestamp="2025-12-05 16:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:38:46.518255404 +0000 UTC m=+1386.756928916" watchObservedRunningTime="2025-12-05 16:38:48.546769837 +0000 UTC m=+1388.785443349" Dec 05 16:38:48 crc kubenswrapper[4756]: I1205 16:38:48.999186 4756 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 16:38:48 crc kubenswrapper[4756]: I1205 16:38:48.999557 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 16:38:49 crc kubenswrapper[4756]: I1205 16:38:49.035042 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 16:38:49 crc kubenswrapper[4756]: I1205 16:38:49.047649 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 16:38:49 crc kubenswrapper[4756]: I1205 16:38:49.528324 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 16:38:49 crc kubenswrapper[4756]: I1205 16:38:49.528586 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.548975 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.549633 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.588005 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.598522 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.805350 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bgtmv"] Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.815252 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.832741 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgtmv"] Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.933896 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.940761 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.981447 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-catalog-content\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.981525 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-utilities\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:51 crc kubenswrapper[4756]: I1205 16:38:51.981575 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srfvc\" (UniqueName: \"kubernetes.io/projected/4586641a-c567-4535-a94a-61b78cdb4a57-kube-api-access-srfvc\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.025770 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.026655 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.084647 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-utilities\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.084778 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srfvc\" (UniqueName: \"kubernetes.io/projected/4586641a-c567-4535-a94a-61b78cdb4a57-kube-api-access-srfvc\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.084949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-catalog-content\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.087076 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-utilities\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.087183 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-catalog-content\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.110147 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srfvc\" (UniqueName: \"kubernetes.io/projected/4586641a-c567-4535-a94a-61b78cdb4a57-kube-api-access-srfvc\") pod \"redhat-operators-bgtmv\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.158325 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.295620 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.295827 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.558642 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:52 crc kubenswrapper[4756]: I1205 16:38:52.558906 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:53 crc kubenswrapper[4756]: I1205 16:38:53.733730 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:53 crc kubenswrapper[4756]: I1205 16:38:53.735430 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="proxy-httpd" containerID="cri-o://ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade" gracePeriod=30 Dec 05 16:38:53 crc kubenswrapper[4756]: I1205 16:38:53.735451 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="sg-core" containerID="cri-o://9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a" gracePeriod=30 Dec 05 16:38:53 crc kubenswrapper[4756]: I1205 16:38:53.735586 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="ceilometer-notification-agent" containerID="cri-o://ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc" gracePeriod=30 Dec 05 16:38:53 crc kubenswrapper[4756]: I1205 16:38:53.735688 4756 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="ceilometer-central-agent" containerID="cri-o://fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c" gracePeriod=30 Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.241166 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgtmv"] Dec 05 16:38:54 crc kubenswrapper[4756]: W1205 16:38:54.241774 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4586641a_c567_4535_a94a_61b78cdb4a57.slice/crio-ea8980bb636404b8f1e8d9d7d8d32374fc37cc56803a2a38d2f20f2822cd5463 WatchSource:0}: Error finding container ea8980bb636404b8f1e8d9d7d8d32374fc37cc56803a2a38d2f20f2822cd5463: Status 404 returned error can't find the container with id ea8980bb636404b8f1e8d9d7d8d32374fc37cc56803a2a38d2f20f2822cd5463 Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.504477 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.548319 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-combined-ca-bundle\") pod \"112a1a42-7019-477a-a6de-62bee8637d41\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.548370 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-config-data\") pod \"112a1a42-7019-477a-a6de-62bee8637d41\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.548433 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-log-httpd\") pod \"112a1a42-7019-477a-a6de-62bee8637d41\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.548467 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-scripts\") pod \"112a1a42-7019-477a-a6de-62bee8637d41\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.548508 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-sg-core-conf-yaml\") pod \"112a1a42-7019-477a-a6de-62bee8637d41\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.548549 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p6cc\" (UniqueName: \"kubernetes.io/projected/112a1a42-7019-477a-a6de-62bee8637d41-kube-api-access-8p6cc\") pod \"112a1a42-7019-477a-a6de-62bee8637d41\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.548616 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-run-httpd\") pod 
\"112a1a42-7019-477a-a6de-62bee8637d41\" (UID: \"112a1a42-7019-477a-a6de-62bee8637d41\") " Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.549342 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "112a1a42-7019-477a-a6de-62bee8637d41" (UID: "112a1a42-7019-477a-a6de-62bee8637d41"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.551464 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "112a1a42-7019-477a-a6de-62bee8637d41" (UID: "112a1a42-7019-477a-a6de-62bee8637d41"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.555645 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-scripts" (OuterVolumeSpecName: "scripts") pod "112a1a42-7019-477a-a6de-62bee8637d41" (UID: "112a1a42-7019-477a-a6de-62bee8637d41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.556056 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112a1a42-7019-477a-a6de-62bee8637d41-kube-api-access-8p6cc" (OuterVolumeSpecName: "kube-api-access-8p6cc") pod "112a1a42-7019-477a-a6de-62bee8637d41" (UID: "112a1a42-7019-477a-a6de-62bee8637d41"). InnerVolumeSpecName "kube-api-access-8p6cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.593356 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "112a1a42-7019-477a-a6de-62bee8637d41" (UID: "112a1a42-7019-477a-a6de-62bee8637d41"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.596858 4756 generic.go:334] "Generic (PLEG): container finished" podID="112a1a42-7019-477a-a6de-62bee8637d41" containerID="ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade" exitCode=0 Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.596896 4756 generic.go:334] "Generic (PLEG): container finished" podID="112a1a42-7019-477a-a6de-62bee8637d41" containerID="9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a" exitCode=2 Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.596906 4756 generic.go:334] "Generic (PLEG): container finished" podID="112a1a42-7019-477a-a6de-62bee8637d41" containerID="ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc" exitCode=0 Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.596916 4756 generic.go:334] "Generic (PLEG): container finished" podID="112a1a42-7019-477a-a6de-62bee8637d41" containerID="fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c" exitCode=0 Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.596931 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.596992 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerDied","Data":"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade"} Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.597026 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerDied","Data":"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a"} Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.597040 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerDied","Data":"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc"} Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.597052 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerDied","Data":"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c"} Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.597062 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"112a1a42-7019-477a-a6de-62bee8637d41","Type":"ContainerDied","Data":"24ec49565106d282c3887e1e8ec1e4378941e9599949128c9db379d5bf563fa4"} Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.597081 4756 scope.go:117] "RemoveContainer" containerID="ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.599849 4756 generic.go:334] "Generic (PLEG): container finished" podID="4586641a-c567-4535-a94a-61b78cdb4a57" containerID="b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad" exitCode=0 Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.599953 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgtmv" event={"ID":"4586641a-c567-4535-a94a-61b78cdb4a57","Type":"ContainerDied","Data":"b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad"} Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.599988 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgtmv" event={"ID":"4586641a-c567-4535-a94a-61b78cdb4a57","Type":"ContainerStarted","Data":"ea8980bb636404b8f1e8d9d7d8d32374fc37cc56803a2a38d2f20f2822cd5463"} Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.606059 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" event={"ID":"48ea8041-7117-4be6-9d19-d5d282c10364","Type":"ContainerStarted","Data":"2d0048989462b0409ebe8c9066cf5ebecd7f70bb52e8298125aa44f8f432d3f0"} Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.620492 4756 scope.go:117] "RemoveContainer" containerID="9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.643102 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" podStartSLOduration=2.135899229 podStartE2EDuration="11.643081192s" podCreationTimestamp="2025-12-05 16:38:43 +0000 UTC" firstStartedPulling="2025-12-05 16:38:44.289765499 +0000 UTC m=+1384.528439011" lastFinishedPulling="2025-12-05 16:38:53.796947462 +0000 
UTC m=+1394.035620974" observedRunningTime="2025-12-05 16:38:54.632266409 +0000 UTC m=+1394.870939931" watchObservedRunningTime="2025-12-05 16:38:54.643081192 +0000 UTC m=+1394.881754714" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.644616 4756 scope.go:117] "RemoveContainer" containerID="ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.650260 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p6cc\" (UniqueName: \"kubernetes.io/projected/112a1a42-7019-477a-a6de-62bee8637d41-kube-api-access-8p6cc\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.650280 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.650290 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/112a1a42-7019-477a-a6de-62bee8637d41-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.650299 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.650309 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.661816 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "112a1a42-7019-477a-a6de-62bee8637d41" (UID: "112a1a42-7019-477a-a6de-62bee8637d41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.665825 4756 scope.go:117] "RemoveContainer" containerID="fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.667715 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-config-data" (OuterVolumeSpecName: "config-data") pod "112a1a42-7019-477a-a6de-62bee8637d41" (UID: "112a1a42-7019-477a-a6de-62bee8637d41"). InnerVolumeSpecName "config-data". 
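[Editor's note] From here the burst is dominated by the container-removal path re-querying the runtime for the four ceilometer containers it has already deleted: each "RemoveContainer" draws a NotFound from CRI-O, logged once as "ContainerStatus from runtime service failed" and once as "DeleteContainer returned error". The same four IDs (ccf6c62c..., 9d3a4d72..., ee1a1b08..., fb335e6c...) cycle repeatedly and no new IDs appear, so this is retry noise rather than a container leak. A sketch that counts the repeats per ID, same assumptions as the earlier snippets:

```go
// notfoundchurn.go - counts repeated NotFound responses per container ID
// in the kubelet's RemoveContainer retry churn.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	f, err := os.Open("kubelet.log") // hypothetical path
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// Matches the klog backslash-escaped form: could not find container \"<id>\"
	re := regexp.MustCompile(`could not find container \\"([0-9a-f]{64})\\"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024)
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}
	for id, n := range counts {
		fmt.Printf("%s... NotFound x%d\n", id[:12], n)
	}
}
```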
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.686106 4756 scope.go:117] "RemoveContainer" containerID="ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade" Dec 05 16:38:54 crc kubenswrapper[4756]: E1205 16:38:54.686825 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": container with ID starting with ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade not found: ID does not exist" containerID="ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.686957 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade"} err="failed to get container status \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": rpc error: code = NotFound desc = could not find container \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": container with ID starting with ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.687034 4756 scope.go:117] "RemoveContainer" containerID="9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a" Dec 05 16:38:54 crc kubenswrapper[4756]: E1205 16:38:54.687710 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": container with ID starting with 9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a not found: ID does not exist" containerID="9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.687764 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a"} err="failed to get container status \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": rpc error: code = NotFound desc = could not find container \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": container with ID starting with 9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.687800 4756 scope.go:117] "RemoveContainer" containerID="ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc" Dec 05 16:38:54 crc kubenswrapper[4756]: E1205 16:38:54.688255 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": container with ID starting with ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc not found: ID does not exist" containerID="ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.688306 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc"} err="failed to get container status \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": rpc error: code = NotFound desc = could not 
find container \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": container with ID starting with ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.688339 4756 scope.go:117] "RemoveContainer" containerID="fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c" Dec 05 16:38:54 crc kubenswrapper[4756]: E1205 16:38:54.688722 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": container with ID starting with fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c not found: ID does not exist" containerID="fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.688818 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c"} err="failed to get container status \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": rpc error: code = NotFound desc = could not find container \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": container with ID starting with fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.688895 4756 scope.go:117] "RemoveContainer" containerID="ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.689498 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade"} err="failed to get container status \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": rpc error: code = NotFound desc = could not find container \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": container with ID starting with ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.689542 4756 scope.go:117] "RemoveContainer" containerID="9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.689913 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a"} err="failed to get container status \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": rpc error: code = NotFound desc = could not find container \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": container with ID starting with 9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.689942 4756 scope.go:117] "RemoveContainer" containerID="ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.690300 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc"} err="failed to get container status \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": rpc error: code = NotFound desc = could not 
find container \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": container with ID starting with ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.690320 4756 scope.go:117] "RemoveContainer" containerID="fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.690742 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c"} err="failed to get container status \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": rpc error: code = NotFound desc = could not find container \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": container with ID starting with fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.690877 4756 scope.go:117] "RemoveContainer" containerID="ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.691153 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade"} err="failed to get container status \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": rpc error: code = NotFound desc = could not find container \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": container with ID starting with ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.691175 4756 scope.go:117] "RemoveContainer" containerID="9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.691545 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a"} err="failed to get container status \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": rpc error: code = NotFound desc = could not find container \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": container with ID starting with 9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.691562 4756 scope.go:117] "RemoveContainer" containerID="ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.691860 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc"} err="failed to get container status \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": rpc error: code = NotFound desc = could not find container \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": container with ID starting with ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.691994 4756 scope.go:117] "RemoveContainer" containerID="fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.692495 4756 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c"} err="failed to get container status \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": rpc error: code = NotFound desc = could not find container \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": container with ID starting with fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.692526 4756 scope.go:117] "RemoveContainer" containerID="ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.693030 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade"} err="failed to get container status \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": rpc error: code = NotFound desc = could not find container \"ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade\": container with ID starting with ccf6c62c445f8cc7fc571ea3e75e0dfee57a33d8d802ca8f922c3eec3b033ade not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.693062 4756 scope.go:117] "RemoveContainer" containerID="9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.693548 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a"} err="failed to get container status \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": rpc error: code = NotFound desc = could not find container \"9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a\": container with ID starting with 9d3a4d72efb6fdd79d973181542fe5a96962b5bbecd9a3104f39d82540b05e9a not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.693714 4756 scope.go:117] "RemoveContainer" containerID="ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.694294 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc"} err="failed to get container status \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": rpc error: code = NotFound desc = could not find container \"ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc\": container with ID starting with ee1a1b084bdce468442ad6a978c34f5251c12cd62c34d0c797662d566b41f7fc not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.694325 4756 scope.go:117] "RemoveContainer" containerID="fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.694695 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c"} err="failed to get container status \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": rpc error: code = NotFound desc = could not find container \"fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c\": container with ID starting with 
fb335e6c6c7e084ddb3df94d9fc7c54b466a626c3a489b599b2d702db880e87c not found: ID does not exist" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.752118 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.752172 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112a1a42-7019-477a-a6de-62bee8637d41-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.753859 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.754075 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.757661 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.950162 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.961961 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973076 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:54 crc kubenswrapper[4756]: E1205 16:38:54.973552 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="proxy-httpd" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973577 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="proxy-httpd" Dec 05 16:38:54 crc kubenswrapper[4756]: E1205 16:38:54.973594 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="ceilometer-notification-agent" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973605 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="ceilometer-notification-agent" Dec 05 16:38:54 crc kubenswrapper[4756]: E1205 16:38:54.973628 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="sg-core" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973639 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="sg-core" Dec 05 16:38:54 crc kubenswrapper[4756]: E1205 16:38:54.973668 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="ceilometer-central-agent" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973676 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="ceilometer-central-agent" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973909 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="ceilometer-notification-agent" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973936 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="proxy-httpd" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973951 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="ceilometer-central-agent" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.973971 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="112a1a42-7019-477a-a6de-62bee8637d41" containerName="sg-core" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.975977 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.983732 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.984270 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 16:38:54 crc kubenswrapper[4756]: I1205 16:38:54.985103 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.065892 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-log-httpd\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.065970 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.066054 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slhgx\" (UniqueName: \"kubernetes.io/projected/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-kube-api-access-slhgx\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.066597 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-run-httpd\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.066672 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-config-data\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.066715 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-scripts\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.066860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.168457 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-run-httpd\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.168518 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-config-data\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.168547 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-scripts\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.168595 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.168774 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-log-httpd\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.168804 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.168866 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slhgx\" (UniqueName: \"kubernetes.io/projected/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-kube-api-access-slhgx\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.169969 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-log-httpd\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.170320 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-run-httpd\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.178022 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-config-data\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.178704 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.179107 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-scripts\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.185532 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.187577 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slhgx\" (UniqueName: \"kubernetes.io/projected/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-kube-api-access-slhgx\") pod \"ceilometer-0\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.310840 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:38:55 crc kubenswrapper[4756]: I1205 16:38:55.769664 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:38:55 crc kubenswrapper[4756]: W1205 16:38:55.769848 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0c31b46_0af4_4e7f_9a32_06ee16a766aa.slice/crio-8fbfd38d5fc57ddbe763a4e1ee558bfaae1dec034dfa747c10767e6607b37ba0 WatchSource:0}: Error finding container 8fbfd38d5fc57ddbe763a4e1ee558bfaae1dec034dfa747c10767e6607b37ba0: Status 404 returned error can't find the container with id 8fbfd38d5fc57ddbe763a4e1ee558bfaae1dec034dfa747c10767e6607b37ba0 Dec 05 16:38:56 crc kubenswrapper[4756]: I1205 16:38:56.061914 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="112a1a42-7019-477a-a6de-62bee8637d41" path="/var/lib/kubelet/pods/112a1a42-7019-477a-a6de-62bee8637d41/volumes" Dec 05 16:38:56 crc kubenswrapper[4756]: I1205 16:38:56.630376 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerStarted","Data":"8fbfd38d5fc57ddbe763a4e1ee558bfaae1dec034dfa747c10767e6607b37ba0"} Dec 05 16:38:56 crc kubenswrapper[4756]: I1205 16:38:56.643861 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgtmv" event={"ID":"4586641a-c567-4535-a94a-61b78cdb4a57","Type":"ContainerStarted","Data":"ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5"} Dec 05 16:38:57 crc kubenswrapper[4756]: I1205 16:38:57.655233 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerStarted","Data":"5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b"} Dec 05 16:38:57 crc kubenswrapper[4756]: I1205 16:38:57.657044 4756 generic.go:334] "Generic (PLEG): container finished" podID="4586641a-c567-4535-a94a-61b78cdb4a57" containerID="ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5" exitCode=0 Dec 05 16:38:57 crc kubenswrapper[4756]: I1205 16:38:57.657080 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgtmv" event={"ID":"4586641a-c567-4535-a94a-61b78cdb4a57","Type":"ContainerDied","Data":"ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5"} Dec 05 16:38:59 crc kubenswrapper[4756]: I1205 16:38:59.679852 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgtmv" event={"ID":"4586641a-c567-4535-a94a-61b78cdb4a57","Type":"ContainerStarted","Data":"26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1"} Dec 05 16:38:59 crc kubenswrapper[4756]: I1205 16:38:59.683192 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerStarted","Data":"5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4"} Dec 05 16:38:59 crc kubenswrapper[4756]: I1205 16:38:59.704527 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bgtmv" podStartSLOduration=4.569748676 podStartE2EDuration="8.704508851s" podCreationTimestamp="2025-12-05 16:38:51 +0000 UTC" firstStartedPulling="2025-12-05 16:38:54.602909093 +0000 UTC m=+1394.841582605" lastFinishedPulling="2025-12-05 16:38:58.737669268 +0000 UTC m=+1398.976342780" observedRunningTime="2025-12-05 16:38:59.696590206 +0000 UTC m=+1399.935263738" watchObservedRunningTime="2025-12-05 16:38:59.704508851 +0000 UTC m=+1399.943182353" Dec 05 16:39:02 crc kubenswrapper[4756]: I1205 16:39:02.158648 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:39:02 crc kubenswrapper[4756]: I1205 16:39:02.160938 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:39:03 crc kubenswrapper[4756]: I1205 16:39:03.220464 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bgtmv" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="registry-server" probeResult="failure" output=< Dec 05 16:39:03 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 16:39:03 crc kubenswrapper[4756]: > Dec 05 16:39:03 crc kubenswrapper[4756]: I1205 16:39:03.720027 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerStarted","Data":"459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367"} Dec 05 16:39:05 crc kubenswrapper[4756]: I1205 16:39:05.748137 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerStarted","Data":"4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9"} Dec 05 16:39:05 crc kubenswrapper[4756]: I1205 16:39:05.748753 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 16:39:05 crc 
kubenswrapper[4756]: I1205 16:39:05.774724 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.384397713 podStartE2EDuration="11.774703337s" podCreationTimestamp="2025-12-05 16:38:54 +0000 UTC" firstStartedPulling="2025-12-05 16:38:56.189340357 +0000 UTC m=+1396.428013869" lastFinishedPulling="2025-12-05 16:39:04.579645981 +0000 UTC m=+1404.818319493" observedRunningTime="2025-12-05 16:39:05.769763213 +0000 UTC m=+1406.008436765" watchObservedRunningTime="2025-12-05 16:39:05.774703337 +0000 UTC m=+1406.013376849" Dec 05 16:39:10 crc kubenswrapper[4756]: I1205 16:39:10.792352 4756 generic.go:334] "Generic (PLEG): container finished" podID="48ea8041-7117-4be6-9d19-d5d282c10364" containerID="2d0048989462b0409ebe8c9066cf5ebecd7f70bb52e8298125aa44f8f432d3f0" exitCode=0 Dec 05 16:39:10 crc kubenswrapper[4756]: I1205 16:39:10.792609 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" event={"ID":"48ea8041-7117-4be6-9d19-d5d282c10364","Type":"ContainerDied","Data":"2d0048989462b0409ebe8c9066cf5ebecd7f70bb52e8298125aa44f8f432d3f0"} Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.168223 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.206702 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-combined-ca-bundle\") pod \"48ea8041-7117-4be6-9d19-d5d282c10364\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.206981 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-scripts\") pod \"48ea8041-7117-4be6-9d19-d5d282c10364\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.207016 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knbmn\" (UniqueName: \"kubernetes.io/projected/48ea8041-7117-4be6-9d19-d5d282c10364-kube-api-access-knbmn\") pod \"48ea8041-7117-4be6-9d19-d5d282c10364\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.207059 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-config-data\") pod \"48ea8041-7117-4be6-9d19-d5d282c10364\" (UID: \"48ea8041-7117-4be6-9d19-d5d282c10364\") " Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.213894 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ea8041-7117-4be6-9d19-d5d282c10364-kube-api-access-knbmn" (OuterVolumeSpecName: "kube-api-access-knbmn") pod "48ea8041-7117-4be6-9d19-d5d282c10364" (UID: "48ea8041-7117-4be6-9d19-d5d282c10364"). InnerVolumeSpecName "kube-api-access-knbmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.217499 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-scripts" (OuterVolumeSpecName: "scripts") pod "48ea8041-7117-4be6-9d19-d5d282c10364" (UID: "48ea8041-7117-4be6-9d19-d5d282c10364"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.240548 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48ea8041-7117-4be6-9d19-d5d282c10364" (UID: "48ea8041-7117-4be6-9d19-d5d282c10364"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.258455 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-config-data" (OuterVolumeSpecName: "config-data") pod "48ea8041-7117-4be6-9d19-d5d282c10364" (UID: "48ea8041-7117-4be6-9d19-d5d282c10364"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.309673 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.309707 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knbmn\" (UniqueName: \"kubernetes.io/projected/48ea8041-7117-4be6-9d19-d5d282c10364-kube-api-access-knbmn\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.309721 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.309733 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ea8041-7117-4be6-9d19-d5d282c10364-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.817984 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" event={"ID":"48ea8041-7117-4be6-9d19-d5d282c10364","Type":"ContainerDied","Data":"70bd63f385700dd6c77123a43d09c7ad68468917162fb55616365d86cf160913"} Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.818026 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70bd63f385700dd6c77123a43d09c7ad68468917162fb55616365d86cf160913" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.818075 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrzn4" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.983132 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 16:39:12 crc kubenswrapper[4756]: E1205 16:39:12.983642 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ea8041-7117-4be6-9d19-d5d282c10364" containerName="nova-cell0-conductor-db-sync" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.983669 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ea8041-7117-4be6-9d19-d5d282c10364" containerName="nova-cell0-conductor-db-sync" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.983930 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ea8041-7117-4be6-9d19-d5d282c10364" containerName="nova-cell0-conductor-db-sync" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.984691 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.987182 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 16:39:12 crc kubenswrapper[4756]: I1205 16:39:12.987583 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-d4fn5" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.002589 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.024672 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.024740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.024873 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjpn5\" (UniqueName: \"kubernetes.io/projected/187070cf-06f5-4c25-851f-2e1f294c3b7c-kube-api-access-jjpn5\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.128210 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.128371 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 
16:39:13.129691 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjpn5\" (UniqueName: \"kubernetes.io/projected/187070cf-06f5-4c25-851f-2e1f294c3b7c-kube-api-access-jjpn5\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.134464 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.134561 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.152921 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjpn5\" (UniqueName: \"kubernetes.io/projected/187070cf-06f5-4c25-851f-2e1f294c3b7c-kube-api-access-jjpn5\") pod \"nova-cell0-conductor-0\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") " pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.232874 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bgtmv" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="registry-server" probeResult="failure" output=< Dec 05 16:39:13 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 16:39:13 crc kubenswrapper[4756]: > Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.304934 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.780075 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 16:39:13 crc kubenswrapper[4756]: I1205 16:39:13.830821 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"187070cf-06f5-4c25-851f-2e1f294c3b7c","Type":"ContainerStarted","Data":"7dc8d8d57bd5e9e0828f87c61352839520b2055e234f1c49a1cea5e9ccf13e96"} Dec 05 16:39:14 crc kubenswrapper[4756]: I1205 16:39:14.843299 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"187070cf-06f5-4c25-851f-2e1f294c3b7c","Type":"ContainerStarted","Data":"7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444"} Dec 05 16:39:14 crc kubenswrapper[4756]: I1205 16:39:14.843941 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:14 crc kubenswrapper[4756]: I1205 16:39:14.869695 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.869659877 podStartE2EDuration="2.869659877s" podCreationTimestamp="2025-12-05 16:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:14.86163542 +0000 UTC m=+1415.100308952" watchObservedRunningTime="2025-12-05 16:39:14.869659877 +0000 UTC m=+1415.108333389" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.340132 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.776572 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vs2mj"] Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.782350 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.786404 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.786794 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.807430 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vs2mj"] Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.868819 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-config-data\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.868906 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5hh\" (UniqueName: \"kubernetes.io/projected/90a878ef-27a9-49a2-be83-1a6ba34fab7f-kube-api-access-7g5hh\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.869009 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-scripts\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.869036 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.965244 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.966370 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.975279 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g5hh\" (UniqueName: \"kubernetes.io/projected/90a878ef-27a9-49a2-be83-1a6ba34fab7f-kube-api-access-7g5hh\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.975448 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-scripts\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.975498 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.975770 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-config-data\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.985431 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-scripts\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.987622 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-config-data\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.990374 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:18 crc kubenswrapper[4756]: I1205 16:39:18.990866 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.023767 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.067263 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.073389 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.073562 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5hh\" (UniqueName: \"kubernetes.io/projected/90a878ef-27a9-49a2-be83-1a6ba34fab7f-kube-api-access-7g5hh\") pod \"nova-cell0-cell-mapping-vs2mj\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.077145 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.077226 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpgvx\" (UniqueName: \"kubernetes.io/projected/5a04745b-1666-4d6b-ab59-706b04cf4160-kube-api-access-mpgvx\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.077292 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-config-data\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.087132 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.107264 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.112487 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.190363 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.190558 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpgvx\" (UniqueName: \"kubernetes.io/projected/5a04745b-1666-4d6b-ab59-706b04cf4160-kube-api-access-mpgvx\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.190625 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.190684 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-config-data\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.190768 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.190809 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2sqd\" (UniqueName: \"kubernetes.io/projected/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-kube-api-access-r2sqd\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.215101 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-config-data\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.218562 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.227215 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.249146 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.253717 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.269282 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpgvx\" (UniqueName: \"kubernetes.io/projected/5a04745b-1666-4d6b-ab59-706b04cf4160-kube-api-access-mpgvx\") pod \"nova-scheduler-0\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.281763 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.292605 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.292837 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2sqd\" (UniqueName: \"kubernetes.io/projected/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-kube-api-access-r2sqd\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.293082 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-config-data\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.293205 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d65433-aeb2-45cf-a371-70f9db4f2064-logs\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.293316 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.293505 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45g9l\" (UniqueName: \"kubernetes.io/projected/d4d65433-aeb2-45cf-a371-70f9db4f2064-kube-api-access-45g9l\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.293620 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.298814 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.304002 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.306921 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.323569 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.339681 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.372287 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2sqd\" (UniqueName: \"kubernetes.io/projected/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-kube-api-access-r2sqd\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.406601 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d65433-aeb2-45cf-a371-70f9db4f2064-logs\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.406655 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.406679 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgjp\" (UniqueName: \"kubernetes.io/projected/146bf072-8ade-4d0f-91cf-4a58c4c2f136-kube-api-access-qjgjp\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.406754 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45g9l\" (UniqueName: \"kubernetes.io/projected/d4d65433-aeb2-45cf-a371-70f9db4f2064-kube-api-access-45g9l\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.406828 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-config-data\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.406907 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.406935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-config-data\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.406970 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146bf072-8ade-4d0f-91cf-4a58c4c2f136-logs\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.407482 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d65433-aeb2-45cf-a371-70f9db4f2064-logs\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.422212 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-config-data\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.422826 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.450074 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.450100 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45g9l\" (UniqueName: \"kubernetes.io/projected/d4d65433-aeb2-45cf-a371-70f9db4f2064-kube-api-access-45g9l\") pod \"nova-metadata-0\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.450573 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.470046 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d75688ddc-j5zx2"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.486526 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.494512 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d75688ddc-j5zx2"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.509255 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-config-data\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.509335 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.509381 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146bf072-8ade-4d0f-91cf-4a58c4c2f136-logs\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.509464 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgjp\" (UniqueName: \"kubernetes.io/projected/146bf072-8ade-4d0f-91cf-4a58c4c2f136-kube-api-access-qjgjp\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.513827 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-config-data\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.514065 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146bf072-8ade-4d0f-91cf-4a58c4c2f136-logs\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.515962 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.543379 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgjp\" (UniqueName: \"kubernetes.io/projected/146bf072-8ade-4d0f-91cf-4a58c4c2f136-kube-api-access-qjgjp\") pod \"nova-api-0\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.612521 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.612564 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.612658 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqxq2\" (UniqueName: \"kubernetes.io/projected/af31e31e-33e5-48d9-8322-efb1f23ba2b0-kube-api-access-nqxq2\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.612696 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.612759 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-config\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.612796 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-svc\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.624131 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.666576 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.681181 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.714829 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.714916 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-config\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.714956 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-svc\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.715078 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.715112 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.715179 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqxq2\" (UniqueName: \"kubernetes.io/projected/af31e31e-33e5-48d9-8322-efb1f23ba2b0-kube-api-access-nqxq2\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.716051 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.716186 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-config\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.716984 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-svc\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 
16:39:19.717353 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.717648 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.737221 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqxq2\" (UniqueName: \"kubernetes.io/projected/af31e31e-33e5-48d9-8322-efb1f23ba2b0-kube-api-access-nqxq2\") pod \"dnsmasq-dns-7d75688ddc-j5zx2\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.846833 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.877276 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.908167 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5a04745b-1666-4d6b-ab59-706b04cf4160","Type":"ContainerStarted","Data":"fb36993d0a652343770d877a82f25cb5fd76f37e666269d9505b69c2938119e7"} Dec 05 16:39:19 crc kubenswrapper[4756]: I1205 16:39:19.943457 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vs2mj"] Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.133842 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4dl5v"] Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.135337 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.137748 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.148786 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.226446 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvfh4\" (UniqueName: \"kubernetes.io/projected/edb5a656-efa8-47ac-befc-2412854889f7-kube-api-access-fvfh4\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.226503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.226690 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-config-data\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.226706 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-scripts\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.244675 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4dl5v"] Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.276985 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.304732 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.328367 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-config-data\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.328399 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-scripts\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.328453 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvfh4\" (UniqueName: 
\"kubernetes.io/projected/edb5a656-efa8-47ac-befc-2412854889f7-kube-api-access-fvfh4\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.328482 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.344673 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.344700 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-config-data\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.345540 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-scripts\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.354753 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvfh4\" (UniqueName: \"kubernetes.io/projected/edb5a656-efa8-47ac-befc-2412854889f7-kube-api-access-fvfh4\") pod \"nova-cell1-conductor-db-sync-4dl5v\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.474396 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.489535 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.646377 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d75688ddc-j5zx2"] Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.924278 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vs2mj" event={"ID":"90a878ef-27a9-49a2-be83-1a6ba34fab7f","Type":"ContainerStarted","Data":"6ea8bf01cfeb60c15901502f17cd75838f7a07e04bcf6b23ccc5eae4f4053438"} Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.924640 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vs2mj" event={"ID":"90a878ef-27a9-49a2-be83-1a6ba34fab7f","Type":"ContainerStarted","Data":"980165b50e2515f0863e173a355c29857adefafdbae3e4be143e2b61209eca5d"} Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.926140 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" event={"ID":"af31e31e-33e5-48d9-8322-efb1f23ba2b0","Type":"ContainerStarted","Data":"1b991686bff26b7e4624389ace14acd3a44c0673e9a405b31ea719081697ace7"} Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.927804 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e","Type":"ContainerStarted","Data":"7fa6569015dcd2a34ee28de051fc69e81e56e52c81271abf95d98059e0b325f2"} Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.928913 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4d65433-aeb2-45cf-a371-70f9db4f2064","Type":"ContainerStarted","Data":"9b3e0eae5b354542f09b86250438ac56c3b30fabcfec956ee7c8a6446c1cc699"} Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.930231 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"146bf072-8ade-4d0f-91cf-4a58c4c2f136","Type":"ContainerStarted","Data":"f13d14af91f746f21fa55026783436d3916749cd0d54ffaddfe76af99f455d6b"} Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.951660 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4dl5v"] Dec 05 16:39:20 crc kubenswrapper[4756]: I1205 16:39:20.965083 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vs2mj" podStartSLOduration=2.965068117 podStartE2EDuration="2.965068117s" podCreationTimestamp="2025-12-05 16:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:20.943755579 +0000 UTC m=+1421.182429101" watchObservedRunningTime="2025-12-05 16:39:20.965068117 +0000 UTC m=+1421.203741629" Dec 05 16:39:21 crc kubenswrapper[4756]: I1205 16:39:21.944506 4756 generic.go:334] "Generic (PLEG): container finished" podID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" containerID="95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90" exitCode=0 Dec 05 16:39:21 crc kubenswrapper[4756]: I1205 16:39:21.944599 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" event={"ID":"af31e31e-33e5-48d9-8322-efb1f23ba2b0","Type":"ContainerDied","Data":"95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90"} Dec 
05 16:39:21 crc kubenswrapper[4756]: I1205 16:39:21.948811 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" event={"ID":"edb5a656-efa8-47ac-befc-2412854889f7","Type":"ContainerStarted","Data":"e34299a73608f080fb60c718ee146253bc5828dfc856b17195372ea6a1daeb46"} Dec 05 16:39:21 crc kubenswrapper[4756]: I1205 16:39:21.948843 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" event={"ID":"edb5a656-efa8-47ac-befc-2412854889f7","Type":"ContainerStarted","Data":"7570cd48062961401ae83a772b7c7d96226d8b4f8308767cc1a1c99a167b39eb"} Dec 05 16:39:21 crc kubenswrapper[4756]: I1205 16:39:21.994320 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" podStartSLOduration=1.99430232 podStartE2EDuration="1.99430232s" podCreationTimestamp="2025-12-05 16:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:21.986397156 +0000 UTC m=+1422.225070688" watchObservedRunningTime="2025-12-05 16:39:21.99430232 +0000 UTC m=+1422.232975832" Dec 05 16:39:22 crc kubenswrapper[4756]: I1205 16:39:22.223341 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:39:22 crc kubenswrapper[4756]: I1205 16:39:22.295341 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:39:22 crc kubenswrapper[4756]: I1205 16:39:22.295467 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:39:22 crc kubenswrapper[4756]: I1205 16:39:22.305314 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.002067 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bgtmv"] Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.428151 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.459088 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.970999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e","Type":"ContainerStarted","Data":"08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe"} Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.971093 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe" gracePeriod=30 Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 
16:39:23.974929 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4d65433-aeb2-45cf-a371-70f9db4f2064","Type":"ContainerStarted","Data":"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c"} Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.976612 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5a04745b-1666-4d6b-ab59-706b04cf4160","Type":"ContainerStarted","Data":"ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a"} Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.979492 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"146bf072-8ade-4d0f-91cf-4a58c4c2f136","Type":"ContainerStarted","Data":"ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90"} Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.981565 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" event={"ID":"af31e31e-33e5-48d9-8322-efb1f23ba2b0","Type":"ContainerStarted","Data":"566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b"} Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.981636 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bgtmv" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="registry-server" containerID="cri-o://26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1" gracePeriod=2 Dec 05 16:39:23 crc kubenswrapper[4756]: I1205 16:39:23.992511 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.907570168 podStartE2EDuration="5.992493502s" podCreationTimestamp="2025-12-05 16:39:18 +0000 UTC" firstStartedPulling="2025-12-05 16:39:20.280092723 +0000 UTC m=+1420.518766235" lastFinishedPulling="2025-12-05 16:39:23.365016057 +0000 UTC m=+1423.603689569" observedRunningTime="2025-12-05 16:39:23.986727825 +0000 UTC m=+1424.225401337" watchObservedRunningTime="2025-12-05 16:39:23.992493502 +0000 UTC m=+1424.231167014" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.007151 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" podStartSLOduration=5.007131988 podStartE2EDuration="5.007131988s" podCreationTimestamp="2025-12-05 16:39:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:24.004758324 +0000 UTC m=+1424.243431846" watchObservedRunningTime="2025-12-05 16:39:24.007131988 +0000 UTC m=+1424.245805500" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.026000 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.560264525 podStartE2EDuration="6.025980919s" podCreationTimestamp="2025-12-05 16:39:18 +0000 UTC" firstStartedPulling="2025-12-05 16:39:19.89992851 +0000 UTC m=+1420.138602022" lastFinishedPulling="2025-12-05 16:39:23.365644914 +0000 UTC m=+1423.604318416" observedRunningTime="2025-12-05 16:39:24.021242381 +0000 UTC m=+1424.259915903" watchObservedRunningTime="2025-12-05 16:39:24.025980919 +0000 UTC m=+1424.264654431" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.410111 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.451369 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.526633 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-utilities\") pod \"4586641a-c567-4535-a94a-61b78cdb4a57\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.526723 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srfvc\" (UniqueName: \"kubernetes.io/projected/4586641a-c567-4535-a94a-61b78cdb4a57-kube-api-access-srfvc\") pod \"4586641a-c567-4535-a94a-61b78cdb4a57\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.526938 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-catalog-content\") pod \"4586641a-c567-4535-a94a-61b78cdb4a57\" (UID: \"4586641a-c567-4535-a94a-61b78cdb4a57\") " Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.527283 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-utilities" (OuterVolumeSpecName: "utilities") pod "4586641a-c567-4535-a94a-61b78cdb4a57" (UID: "4586641a-c567-4535-a94a-61b78cdb4a57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.527407 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.532533 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4586641a-c567-4535-a94a-61b78cdb4a57-kube-api-access-srfvc" (OuterVolumeSpecName: "kube-api-access-srfvc") pod "4586641a-c567-4535-a94a-61b78cdb4a57" (UID: "4586641a-c567-4535-a94a-61b78cdb4a57"). InnerVolumeSpecName "kube-api-access-srfvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.605525 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4586641a-c567-4535-a94a-61b78cdb4a57" (UID: "4586641a-c567-4535-a94a-61b78cdb4a57"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.625834 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.629014 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4586641a-c567-4535-a94a-61b78cdb4a57-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.629056 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srfvc\" (UniqueName: \"kubernetes.io/projected/4586641a-c567-4535-a94a-61b78cdb4a57-kube-api-access-srfvc\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.847803 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.992697 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4d65433-aeb2-45cf-a371-70f9db4f2064","Type":"ContainerStarted","Data":"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091"} Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.992860 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerName="nova-metadata-metadata" containerID="cri-o://d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091" gracePeriod=30 Dec 05 16:39:24 crc kubenswrapper[4756]: I1205 16:39:24.992858 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerName="nova-metadata-log" containerID="cri-o://e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c" gracePeriod=30 Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.001853 4756 generic.go:334] "Generic (PLEG): container finished" podID="4586641a-c567-4535-a94a-61b78cdb4a57" containerID="26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1" exitCode=0 Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.001934 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgtmv" event={"ID":"4586641a-c567-4535-a94a-61b78cdb4a57","Type":"ContainerDied","Data":"26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1"} Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.001979 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgtmv" event={"ID":"4586641a-c567-4535-a94a-61b78cdb4a57","Type":"ContainerDied","Data":"ea8980bb636404b8f1e8d9d7d8d32374fc37cc56803a2a38d2f20f2822cd5463"} Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.001996 4756 scope.go:117] "RemoveContainer" containerID="26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.002144 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bgtmv" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.012780 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.996085208 podStartE2EDuration="6.012759562s" podCreationTimestamp="2025-12-05 16:39:19 +0000 UTC" firstStartedPulling="2025-12-05 16:39:20.347073128 +0000 UTC m=+1420.585746650" lastFinishedPulling="2025-12-05 16:39:23.363747492 +0000 UTC m=+1423.602421004" observedRunningTime="2025-12-05 16:39:25.011628271 +0000 UTC m=+1425.250301783" watchObservedRunningTime="2025-12-05 16:39:25.012759562 +0000 UTC m=+1425.251433094" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.014234 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"146bf072-8ade-4d0f-91cf-4a58c4c2f136","Type":"ContainerStarted","Data":"6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb"} Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.038343 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.169231791 podStartE2EDuration="6.038325435s" podCreationTimestamp="2025-12-05 16:39:19 +0000 UTC" firstStartedPulling="2025-12-05 16:39:20.506279583 +0000 UTC m=+1420.744953095" lastFinishedPulling="2025-12-05 16:39:23.375373227 +0000 UTC m=+1423.614046739" observedRunningTime="2025-12-05 16:39:25.03333712 +0000 UTC m=+1425.272010642" watchObservedRunningTime="2025-12-05 16:39:25.038325435 +0000 UTC m=+1425.276998957" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.046458 4756 scope.go:117] "RemoveContainer" containerID="ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.065768 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bgtmv"] Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.074506 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bgtmv"] Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.085703 4756 scope.go:117] "RemoveContainer" containerID="b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.199531 4756 scope.go:117] "RemoveContainer" containerID="26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1" Dec 05 16:39:25 crc kubenswrapper[4756]: E1205 16:39:25.200108 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1\": container with ID starting with 26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1 not found: ID does not exist" containerID="26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.200161 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1"} err="failed to get container status \"26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1\": rpc error: code = NotFound desc = could not find container \"26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1\": container with ID starting with 26acef0b8c42080f62f6cbd569bf11e7bc0a1d9a5d44c5dbea90ebb6acfb5cd1 not found: ID does not exist" Dec 05 16:39:25 crc 
kubenswrapper[4756]: I1205 16:39:25.200196 4756 scope.go:117] "RemoveContainer" containerID="ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5" Dec 05 16:39:25 crc kubenswrapper[4756]: E1205 16:39:25.202174 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5\": container with ID starting with ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5 not found: ID does not exist" containerID="ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.202205 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5"} err="failed to get container status \"ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5\": rpc error: code = NotFound desc = could not find container \"ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5\": container with ID starting with ec7bcdc950eb8f5366a61dde4110fc1799c592008c60cf7260168171aa2bb2a5 not found: ID does not exist" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.202226 4756 scope.go:117] "RemoveContainer" containerID="b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad" Dec 05 16:39:25 crc kubenswrapper[4756]: E1205 16:39:25.202592 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad\": container with ID starting with b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad not found: ID does not exist" containerID="b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.202636 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad"} err="failed to get container status \"b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad\": rpc error: code = NotFound desc = could not find container \"b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad\": container with ID starting with b415be4adbf2c57f78640fa626ccec5d13eaac2b8d4170fceb80d697e7152fad not found: ID does not exist" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.317184 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.567288 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.769941 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d65433-aeb2-45cf-a371-70f9db4f2064-logs\") pod \"d4d65433-aeb2-45cf-a371-70f9db4f2064\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.770280 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d65433-aeb2-45cf-a371-70f9db4f2064-logs" (OuterVolumeSpecName: "logs") pod "d4d65433-aeb2-45cf-a371-70f9db4f2064" (UID: "d4d65433-aeb2-45cf-a371-70f9db4f2064"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.770430 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-combined-ca-bundle\") pod \"d4d65433-aeb2-45cf-a371-70f9db4f2064\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.770690 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45g9l\" (UniqueName: \"kubernetes.io/projected/d4d65433-aeb2-45cf-a371-70f9db4f2064-kube-api-access-45g9l\") pod \"d4d65433-aeb2-45cf-a371-70f9db4f2064\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.770812 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-config-data\") pod \"d4d65433-aeb2-45cf-a371-70f9db4f2064\" (UID: \"d4d65433-aeb2-45cf-a371-70f9db4f2064\") " Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.771761 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d65433-aeb2-45cf-a371-70f9db4f2064-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.782313 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d65433-aeb2-45cf-a371-70f9db4f2064-kube-api-access-45g9l" (OuterVolumeSpecName: "kube-api-access-45g9l") pod "d4d65433-aeb2-45cf-a371-70f9db4f2064" (UID: "d4d65433-aeb2-45cf-a371-70f9db4f2064"). InnerVolumeSpecName "kube-api-access-45g9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.795844 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4d65433-aeb2-45cf-a371-70f9db4f2064" (UID: "d4d65433-aeb2-45cf-a371-70f9db4f2064"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.798976 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-config-data" (OuterVolumeSpecName: "config-data") pod "d4d65433-aeb2-45cf-a371-70f9db4f2064" (UID: "d4d65433-aeb2-45cf-a371-70f9db4f2064"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.873286 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.873614 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45g9l\" (UniqueName: \"kubernetes.io/projected/d4d65433-aeb2-45cf-a371-70f9db4f2064-kube-api-access-45g9l\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:25 crc kubenswrapper[4756]: I1205 16:39:25.873627 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d65433-aeb2-45cf-a371-70f9db4f2064-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.023663 4756 generic.go:334] "Generic (PLEG): container finished" podID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerID="d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091" exitCode=0 Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.023696 4756 generic.go:334] "Generic (PLEG): container finished" podID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerID="e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c" exitCode=143 Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.023716 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.023746 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4d65433-aeb2-45cf-a371-70f9db4f2064","Type":"ContainerDied","Data":"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091"} Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.023773 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4d65433-aeb2-45cf-a371-70f9db4f2064","Type":"ContainerDied","Data":"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c"} Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.023785 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4d65433-aeb2-45cf-a371-70f9db4f2064","Type":"ContainerDied","Data":"9b3e0eae5b354542f09b86250438ac56c3b30fabcfec956ee7c8a6446c1cc699"} Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.023804 4756 scope.go:117] "RemoveContainer" containerID="d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.060948 4756 scope.go:117] "RemoveContainer" containerID="e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.062096 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" path="/var/lib/kubelet/pods/4586641a-c567-4535-a94a-61b78cdb4a57/volumes" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.063329 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.091334 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.095537 4756 scope.go:117] "RemoveContainer" containerID="d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091" Dec 05 
16:39:26 crc kubenswrapper[4756]: E1205 16:39:26.097153 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091\": container with ID starting with d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091 not found: ID does not exist" containerID="d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.097193 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091"} err="failed to get container status \"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091\": rpc error: code = NotFound desc = could not find container \"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091\": container with ID starting with d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091 not found: ID does not exist" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.097216 4756 scope.go:117] "RemoveContainer" containerID="e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c" Dec 05 16:39:26 crc kubenswrapper[4756]: E1205 16:39:26.097486 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c\": container with ID starting with e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c not found: ID does not exist" containerID="e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.097506 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c"} err="failed to get container status \"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c\": rpc error: code = NotFound desc = could not find container \"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c\": container with ID starting with e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c not found: ID does not exist" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.097523 4756 scope.go:117] "RemoveContainer" containerID="d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.097780 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091"} err="failed to get container status \"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091\": rpc error: code = NotFound desc = could not find container \"d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091\": container with ID starting with d9ae61af24dcf33c2df9aac7681968d09313463fd0a6efdd434ce4cf43b8a091 not found: ID does not exist" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.097797 4756 scope.go:117] "RemoveContainer" containerID="e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.098047 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c"} err="failed to get container status 
\"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c\": rpc error: code = NotFound desc = could not find container \"e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c\": container with ID starting with e23cf6c086022e6f9b19d781bb928321be9f4d26467f09309e571974f28f4f2c not found: ID does not exist" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.106273 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:26 crc kubenswrapper[4756]: E1205 16:39:26.106811 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="extract-utilities" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.106836 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="extract-utilities" Dec 05 16:39:26 crc kubenswrapper[4756]: E1205 16:39:26.106855 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerName="nova-metadata-log" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.106863 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerName="nova-metadata-log" Dec 05 16:39:26 crc kubenswrapper[4756]: E1205 16:39:26.106878 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="extract-content" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.106887 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="extract-content" Dec 05 16:39:26 crc kubenswrapper[4756]: E1205 16:39:26.106913 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="registry-server" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.106922 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="registry-server" Dec 05 16:39:26 crc kubenswrapper[4756]: E1205 16:39:26.106950 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerName="nova-metadata-metadata" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.106958 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerName="nova-metadata-metadata" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.107183 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerName="nova-metadata-log" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.107202 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4586641a-c567-4535-a94a-61b78cdb4a57" containerName="registry-server" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.107232 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" containerName="nova-metadata-metadata" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.108648 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.111249 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.114622 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.121043 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.178136 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp4nv\" (UniqueName: \"kubernetes.io/projected/921dc635-3afc-47a5-9cc5-35a41f37bf0e-kube-api-access-bp4nv\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.178241 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-config-data\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.178276 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/921dc635-3afc-47a5-9cc5-35a41f37bf0e-logs\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.178346 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.178372 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.279751 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.279796 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.279880 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp4nv\" (UniqueName: \"kubernetes.io/projected/921dc635-3afc-47a5-9cc5-35a41f37bf0e-kube-api-access-bp4nv\") pod \"nova-metadata-0\" (UID: 
\"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.279970 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-config-data\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.280007 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/921dc635-3afc-47a5-9cc5-35a41f37bf0e-logs\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.280395 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/921dc635-3afc-47a5-9cc5-35a41f37bf0e-logs\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.284072 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.284900 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.286783 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-config-data\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.296819 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp4nv\" (UniqueName: \"kubernetes.io/projected/921dc635-3afc-47a5-9cc5-35a41f37bf0e-kube-api-access-bp4nv\") pod \"nova-metadata-0\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.429502 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:26 crc kubenswrapper[4756]: I1205 16:39:26.915519 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:26 crc kubenswrapper[4756]: W1205 16:39:26.918267 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod921dc635_3afc_47a5_9cc5_35a41f37bf0e.slice/crio-2afbf1dd3fd55ea186c55af3de9059e62bc2321226a12b333e1dcbeaeb0545dc WatchSource:0}: Error finding container 2afbf1dd3fd55ea186c55af3de9059e62bc2321226a12b333e1dcbeaeb0545dc: Status 404 returned error can't find the container with id 2afbf1dd3fd55ea186c55af3de9059e62bc2321226a12b333e1dcbeaeb0545dc Dec 05 16:39:27 crc kubenswrapper[4756]: I1205 16:39:27.038550 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"921dc635-3afc-47a5-9cc5-35a41f37bf0e","Type":"ContainerStarted","Data":"2afbf1dd3fd55ea186c55af3de9059e62bc2321226a12b333e1dcbeaeb0545dc"} Dec 05 16:39:28 crc kubenswrapper[4756]: I1205 16:39:28.064409 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d65433-aeb2-45cf-a371-70f9db4f2064" path="/var/lib/kubelet/pods/d4d65433-aeb2-45cf-a371-70f9db4f2064/volumes" Dec 05 16:39:28 crc kubenswrapper[4756]: I1205 16:39:28.065359 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"921dc635-3afc-47a5-9cc5-35a41f37bf0e","Type":"ContainerStarted","Data":"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae"} Dec 05 16:39:28 crc kubenswrapper[4756]: I1205 16:39:28.942195 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:39:28 crc kubenswrapper[4756]: I1205 16:39:28.943438 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d091b847-2b31-492a-a1b7-37e356e55e93" containerName="kube-state-metrics" containerID="cri-o://9b8651f84999e5733fdeed1ce337b453a91be8d32a73815f7e1a831bbb5c01e7" gracePeriod=30 Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.066526 4756 generic.go:334] "Generic (PLEG): container finished" podID="d091b847-2b31-492a-a1b7-37e356e55e93" containerID="9b8651f84999e5733fdeed1ce337b453a91be8d32a73815f7e1a831bbb5c01e7" exitCode=2 Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.066613 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d091b847-2b31-492a-a1b7-37e356e55e93","Type":"ContainerDied","Data":"9b8651f84999e5733fdeed1ce337b453a91be8d32a73815f7e1a831bbb5c01e7"} Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.069383 4756 generic.go:334] "Generic (PLEG): container finished" podID="90a878ef-27a9-49a2-be83-1a6ba34fab7f" containerID="6ea8bf01cfeb60c15901502f17cd75838f7a07e04bcf6b23ccc5eae4f4053438" exitCode=0 Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.069467 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vs2mj" event={"ID":"90a878ef-27a9-49a2-be83-1a6ba34fab7f","Type":"ContainerDied","Data":"6ea8bf01cfeb60c15901502f17cd75838f7a07e04bcf6b23ccc5eae4f4053438"} Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.074389 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"921dc635-3afc-47a5-9cc5-35a41f37bf0e","Type":"ContainerStarted","Data":"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b"} Dec 05 
Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.126918 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.12689491 podStartE2EDuration="3.12689491s" podCreationTimestamp="2025-12-05 16:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:29.113048405 +0000 UTC m=+1429.351721917" watchObservedRunningTime="2025-12-05 16:39:29.12689491 +0000 UTC m=+1429.365568442"
Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.447911 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.450829 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.501870 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.646168 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctt8f\" (UniqueName: \"kubernetes.io/projected/d091b847-2b31-492a-a1b7-37e356e55e93-kube-api-access-ctt8f\") pod \"d091b847-2b31-492a-a1b7-37e356e55e93\" (UID: \"d091b847-2b31-492a-a1b7-37e356e55e93\") "
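The startup-latency entry above is internally consistent: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp, and the zeroed 0001-01-01 pulling timestamps indicate no image pull contributed. A quick check with the values copied from that entry (Python datetimes carry only microseconds, so the 8-digit fraction is rounded):

```python
from datetime import datetime, timezone

# Values from the pod_startup_latency_tracker entry for nova-metadata-0.
created  = datetime(2025, 12, 5, 16, 39, 26, tzinfo=timezone.utc)
# watchObservedRunningTime 16:39:29.12689491, rounded to microseconds:
observed = datetime(2025, 12, 5, 16, 39, 29, 126895, tzinfo=timezone.utc)

delta = (observed - created).total_seconds()
print(f"{delta:.6f}s")  # ~3.126895s, matching podStartSLOduration=3.12689491
```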
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.682203 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.682267 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.748926 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctt8f\" (UniqueName: \"kubernetes.io/projected/d091b847-2b31-492a-a1b7-37e356e55e93-kube-api-access-ctt8f\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.848656 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.915970 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77d8c9c7-tmvh4"] Dec 05 16:39:29 crc kubenswrapper[4756]: I1205 16:39:29.916208 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" podUID="f060a6c1-48e8-49b5-9283-172dbef53241" containerName="dnsmasq-dns" containerID="cri-o://9b565047b4f90b9ae68c5bdaf8d5c38a5e415dce6e03641e5c8dca325759a137" gracePeriod=10 Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.108522 4756 generic.go:334] "Generic (PLEG): container finished" podID="f060a6c1-48e8-49b5-9283-172dbef53241" containerID="9b565047b4f90b9ae68c5bdaf8d5c38a5e415dce6e03641e5c8dca325759a137" exitCode=0 Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.108606 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" event={"ID":"f060a6c1-48e8-49b5-9283-172dbef53241","Type":"ContainerDied","Data":"9b565047b4f90b9ae68c5bdaf8d5c38a5e415dce6e03641e5c8dca325759a137"} Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.115164 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.115916 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d091b847-2b31-492a-a1b7-37e356e55e93","Type":"ContainerDied","Data":"c716ebc98ba0a2396dfd1cab4154bec9bcca9e2eb678a919759fbb13fb2de23f"} Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.115965 4756 scope.go:117] "RemoveContainer" containerID="9b8651f84999e5733fdeed1ce337b453a91be8d32a73815f7e1a831bbb5c01e7" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.154571 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.178884 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.187481 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:39:30 crc kubenswrapper[4756]: E1205 16:39:30.187943 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d091b847-2b31-492a-a1b7-37e356e55e93" containerName="kube-state-metrics" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.187961 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d091b847-2b31-492a-a1b7-37e356e55e93" containerName="kube-state-metrics" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.188136 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d091b847-2b31-492a-a1b7-37e356e55e93" containerName="kube-state-metrics" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.188718 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.188794 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.191251 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.191662 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.198114 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.362024 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.362062 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.362138 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.362189 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxphb\" (UniqueName: \"kubernetes.io/projected/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-api-access-sxphb\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.464548 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.464610 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxphb\" (UniqueName: \"kubernetes.io/projected/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-api-access-sxphb\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.464715 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.464738 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.472818 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.472936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.473939 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.485607 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxphb\" (UniqueName: \"kubernetes.io/projected/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-api-access-sxphb\") pod \"kube-state-metrics-0\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.513145 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.632610 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.642283 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.767603 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.767789 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.769664 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-svc\") pod \"f060a6c1-48e8-49b5-9283-172dbef53241\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.769734 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzf8n\" (UniqueName: \"kubernetes.io/projected/f060a6c1-48e8-49b5-9283-172dbef53241-kube-api-access-wzf8n\") pod \"f060a6c1-48e8-49b5-9283-172dbef53241\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.769821 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g5hh\" (UniqueName: \"kubernetes.io/projected/90a878ef-27a9-49a2-be83-1a6ba34fab7f-kube-api-access-7g5hh\") pod \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.769860 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-combined-ca-bundle\") pod \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.769923 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-config\") pod \"f060a6c1-48e8-49b5-9283-172dbef53241\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.770032 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-sb\") pod \"f060a6c1-48e8-49b5-9283-172dbef53241\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.770065 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-nb\") pod \"f060a6c1-48e8-49b5-9283-172dbef53241\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.770131 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-config-data\") pod 
\"90a878ef-27a9-49a2-be83-1a6ba34fab7f\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.770177 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-swift-storage-0\") pod \"f060a6c1-48e8-49b5-9283-172dbef53241\" (UID: \"f060a6c1-48e8-49b5-9283-172dbef53241\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.770215 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-scripts\") pod \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\" (UID: \"90a878ef-27a9-49a2-be83-1a6ba34fab7f\") " Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.777758 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-scripts" (OuterVolumeSpecName: "scripts") pod "90a878ef-27a9-49a2-be83-1a6ba34fab7f" (UID: "90a878ef-27a9-49a2-be83-1a6ba34fab7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.785128 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a878ef-27a9-49a2-be83-1a6ba34fab7f-kube-api-access-7g5hh" (OuterVolumeSpecName: "kube-api-access-7g5hh") pod "90a878ef-27a9-49a2-be83-1a6ba34fab7f" (UID: "90a878ef-27a9-49a2-be83-1a6ba34fab7f"). InnerVolumeSpecName "kube-api-access-7g5hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.790742 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f060a6c1-48e8-49b5-9283-172dbef53241-kube-api-access-wzf8n" (OuterVolumeSpecName: "kube-api-access-wzf8n") pod "f060a6c1-48e8-49b5-9283-172dbef53241" (UID: "f060a6c1-48e8-49b5-9283-172dbef53241"). InnerVolumeSpecName "kube-api-access-wzf8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.835519 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90a878ef-27a9-49a2-be83-1a6ba34fab7f" (UID: "90a878ef-27a9-49a2-be83-1a6ba34fab7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.841286 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-config" (OuterVolumeSpecName: "config") pod "f060a6c1-48e8-49b5-9283-172dbef53241" (UID: "f060a6c1-48e8-49b5-9283-172dbef53241"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.841309 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f060a6c1-48e8-49b5-9283-172dbef53241" (UID: "f060a6c1-48e8-49b5-9283-172dbef53241"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.843672 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-config-data" (OuterVolumeSpecName: "config-data") pod "90a878ef-27a9-49a2-be83-1a6ba34fab7f" (UID: "90a878ef-27a9-49a2-be83-1a6ba34fab7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.846978 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f060a6c1-48e8-49b5-9283-172dbef53241" (UID: "f060a6c1-48e8-49b5-9283-172dbef53241"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.859024 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f060a6c1-48e8-49b5-9283-172dbef53241" (UID: "f060a6c1-48e8-49b5-9283-172dbef53241"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.872997 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.873033 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzf8n\" (UniqueName: \"kubernetes.io/projected/f060a6c1-48e8-49b5-9283-172dbef53241-kube-api-access-wzf8n\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.873046 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g5hh\" (UniqueName: \"kubernetes.io/projected/90a878ef-27a9-49a2-be83-1a6ba34fab7f-kube-api-access-7g5hh\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.873057 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.873064 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.873074 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.873083 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.873092 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 
Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.873099 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a878ef-27a9-49a2-be83-1a6ba34fab7f-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.877366 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f060a6c1-48e8-49b5-9283-172dbef53241" (UID: "f060a6c1-48e8-49b5-9283-172dbef53241"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:39:30 crc kubenswrapper[4756]: I1205 16:39:30.975000 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f060a6c1-48e8-49b5-9283-172dbef53241-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 05 16:39:31 crc kubenswrapper[4756]: W1205 16:39:31.004015 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod797faab6_a6b3_48ae_90ca_09d251b2975f.slice/crio-4f8cb0a49c9c285b0fb9bb22d0db0aa157e24e040f9a11bccff79fe508d8ba21 WatchSource:0}: Error finding container 4f8cb0a49c9c285b0fb9bb22d0db0aa157e24e040f9a11bccff79fe508d8ba21: Status 404 returned error can't find the container with id 4f8cb0a49c9c285b0fb9bb22d0db0aa157e24e040f9a11bccff79fe508d8ba21
Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.007838 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.074935 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.075265 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="ceilometer-central-agent" containerID="cri-o://5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b" gracePeriod=30
Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.075334 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="sg-core" containerID="cri-o://459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367" gracePeriod=30
Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.075404 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="proxy-httpd" containerID="cri-o://4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9" gracePeriod=30
Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.075440 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="ceilometer-notification-agent" containerID="cri-o://5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4" gracePeriod=30
Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.133019 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"797faab6-a6b3-48ae-90ca-09d251b2975f","Type":"ContainerStarted","Data":"4f8cb0a49c9c285b0fb9bb22d0db0aa157e24e040f9a11bccff79fe508d8ba21"}
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" event={"ID":"f060a6c1-48e8-49b5-9283-172dbef53241","Type":"ContainerDied","Data":"9d750af7cb3eff27dd2d4042b0f17934b060ae4876835db3d8ab1793fcc13d25"} Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.135410 4756 scope.go:117] "RemoveContainer" containerID="9b565047b4f90b9ae68c5bdaf8d5c38a5e415dce6e03641e5c8dca325759a137" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.135575 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d8c9c7-tmvh4" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.144661 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vs2mj" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.145357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vs2mj" event={"ID":"90a878ef-27a9-49a2-be83-1a6ba34fab7f","Type":"ContainerDied","Data":"980165b50e2515f0863e173a355c29857adefafdbae3e4be143e2b61209eca5d"} Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.145455 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="980165b50e2515f0863e173a355c29857adefafdbae3e4be143e2b61209eca5d" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.172164 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77d8c9c7-tmvh4"] Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.185181 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77d8c9c7-tmvh4"] Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.203704 4756 scope.go:117] "RemoveContainer" containerID="402fb96553c11e8b81aee95a1c0cfb627cc7be118fc8785777d07f7aae8377d2" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.220303 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.220579 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-log" containerID="cri-o://ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90" gracePeriod=30 Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.220653 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-api" containerID="cri-o://6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb" gracePeriod=30 Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.233393 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.245952 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.246177 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerName="nova-metadata-log" containerID="cri-o://05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae" gracePeriod=30 Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.246591 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" 
containerName="nova-metadata-metadata" containerID="cri-o://d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b" gracePeriod=30 Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.430460 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.430518 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.837447 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:31 crc kubenswrapper[4756]: E1205 16:39:31.855779 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0c31b46_0af4_4e7f_9a32_06ee16a766aa.slice/crio-5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod921dc635_3afc_47a5_9cc5_35a41f37bf0e.slice/crio-conmon-d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.997114 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-combined-ca-bundle\") pod \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.997276 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-config-data\") pod \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.997325 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/921dc635-3afc-47a5-9cc5-35a41f37bf0e-logs\") pod \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.997460 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-nova-metadata-tls-certs\") pod \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.997509 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp4nv\" (UniqueName: \"kubernetes.io/projected/921dc635-3afc-47a5-9cc5-35a41f37bf0e-kube-api-access-bp4nv\") pod \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\" (UID: \"921dc635-3afc-47a5-9cc5-35a41f37bf0e\") " Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.997936 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/921dc635-3afc-47a5-9cc5-35a41f37bf0e-logs" (OuterVolumeSpecName: "logs") pod "921dc635-3afc-47a5-9cc5-35a41f37bf0e" (UID: "921dc635-3afc-47a5-9cc5-35a41f37bf0e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:31 crc kubenswrapper[4756]: I1205 16:39:31.998209 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/921dc635-3afc-47a5-9cc5-35a41f37bf0e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.001535 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921dc635-3afc-47a5-9cc5-35a41f37bf0e-kube-api-access-bp4nv" (OuterVolumeSpecName: "kube-api-access-bp4nv") pod "921dc635-3afc-47a5-9cc5-35a41f37bf0e" (UID: "921dc635-3afc-47a5-9cc5-35a41f37bf0e"). InnerVolumeSpecName "kube-api-access-bp4nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.032521 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-config-data" (OuterVolumeSpecName: "config-data") pod "921dc635-3afc-47a5-9cc5-35a41f37bf0e" (UID: "921dc635-3afc-47a5-9cc5-35a41f37bf0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.033048 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "921dc635-3afc-47a5-9cc5-35a41f37bf0e" (UID: "921dc635-3afc-47a5-9cc5-35a41f37bf0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.066881 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d091b847-2b31-492a-a1b7-37e356e55e93" path="/var/lib/kubelet/pods/d091b847-2b31-492a-a1b7-37e356e55e93/volumes" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.067440 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f060a6c1-48e8-49b5-9283-172dbef53241" path="/var/lib/kubelet/pods/f060a6c1-48e8-49b5-9283-172dbef53241/volumes" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.070581 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "921dc635-3afc-47a5-9cc5-35a41f37bf0e" (UID: "921dc635-3afc-47a5-9cc5-35a41f37bf0e"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.099983 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.100010 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.100019 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/921dc635-3afc-47a5-9cc5-35a41f37bf0e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.100028 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp4nv\" (UniqueName: \"kubernetes.io/projected/921dc635-3afc-47a5-9cc5-35a41f37bf0e-kube-api-access-bp4nv\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.154306 4756 generic.go:334] "Generic (PLEG): container finished" podID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerID="ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90" exitCode=143 Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.154378 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"146bf072-8ade-4d0f-91cf-4a58c4c2f136","Type":"ContainerDied","Data":"ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90"} Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.156351 4756 generic.go:334] "Generic (PLEG): container finished" podID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerID="d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b" exitCode=0 Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.156375 4756 generic.go:334] "Generic (PLEG): container finished" podID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerID="05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae" exitCode=143 Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.156445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"921dc635-3afc-47a5-9cc5-35a41f37bf0e","Type":"ContainerDied","Data":"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b"} Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.156493 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"921dc635-3afc-47a5-9cc5-35a41f37bf0e","Type":"ContainerDied","Data":"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae"} Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.156510 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"921dc635-3afc-47a5-9cc5-35a41f37bf0e","Type":"ContainerDied","Data":"2afbf1dd3fd55ea186c55af3de9059e62bc2321226a12b333e1dcbeaeb0545dc"} Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.156528 4756 scope.go:117] "RemoveContainer" containerID="d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.156460 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.169169 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"797faab6-a6b3-48ae-90ca-09d251b2975f","Type":"ContainerStarted","Data":"9dcca87cd81ebc30ed3acfa39592365fba84c1006feaa6a068c8c8e226c1806a"} Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.169321 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.176485 4756 generic.go:334] "Generic (PLEG): container finished" podID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerID="4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9" exitCode=0 Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.176520 4756 generic.go:334] "Generic (PLEG): container finished" podID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerID="459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367" exitCode=2 Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.176529 4756 generic.go:334] "Generic (PLEG): container finished" podID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerID="5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b" exitCode=0 Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.176662 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5a04745b-1666-4d6b-ab59-706b04cf4160" containerName="nova-scheduler-scheduler" containerID="cri-o://ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a" gracePeriod=30 Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.176898 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerDied","Data":"4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9"} Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.176937 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerDied","Data":"459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367"} Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.176952 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerDied","Data":"5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b"} Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.181459 4756 scope.go:117] "RemoveContainer" containerID="05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.196843 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.780619808 podStartE2EDuration="2.196827427s" podCreationTimestamp="2025-12-05 16:39:30 +0000 UTC" firstStartedPulling="2025-12-05 16:39:31.009176511 +0000 UTC m=+1431.247850013" lastFinishedPulling="2025-12-05 16:39:31.42538411 +0000 UTC m=+1431.664057632" observedRunningTime="2025-12-05 16:39:32.186291402 +0000 UTC m=+1432.424964924" watchObservedRunningTime="2025-12-05 16:39:32.196827427 +0000 UTC m=+1432.435500939" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.225317 4756 scope.go:117] "RemoveContainer" containerID="d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b" Dec 05 16:39:32 crc 
Dec 05 16:39:32 crc kubenswrapper[4756]: E1205 16:39:32.227049 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b\": container with ID starting with d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b not found: ID does not exist" containerID="d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.227103 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b"} err="failed to get container status \"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b\": rpc error: code = NotFound desc = could not find container \"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b\": container with ID starting with d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b not found: ID does not exist"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.227135 4756 scope.go:117] "RemoveContainer" containerID="05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae"
Dec 05 16:39:32 crc kubenswrapper[4756]: E1205 16:39:32.227560 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae\": container with ID starting with 05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae not found: ID does not exist" containerID="05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.227626 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae"} err="failed to get container status \"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae\": rpc error: code = NotFound desc = could not find container \"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae\": container with ID starting with 05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae not found: ID does not exist"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.227663 4756 scope.go:117] "RemoveContainer" containerID="d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.230521 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b"} err="failed to get container status \"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b\": rpc error: code = NotFound desc = could not find container \"d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b\": container with ID starting with d459755e935ea7a5bf08428f66083c9a38c63d198c6bd8d7244cad2f5626c99b not found: ID does not exist"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.230737 4756 scope.go:117] "RemoveContainer" containerID="05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.232510 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
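The RemoveContainer / NotFound churn above is the kubelet retrying deletion of containers CRI-O has already purged; the NotFound response is effectively the desired outcome. A sketch of that idempotent-delete pattern (RuntimeClient and NotFoundError are hypothetical stand-ins for illustration, not a real CRI client):

```python
# Hypothetical sketch: stand-in runtime client, not a real CRI API.
class NotFoundError(Exception):
    pass

class RuntimeClient:
    def __init__(self, containers):
        self.containers = set(containers)
    def remove_container(self, cid: str) -> None:
        if cid not in self.containers:
            raise NotFoundError(f"could not find container {cid!r}")
        self.containers.remove(cid)

def remove_idempotent(client: RuntimeClient, cid: str) -> None:
    try:
        client.remove_container(cid)
    except NotFoundError:
        pass  # already gone: treat as success, mirroring the log's NotFound noise

client = RuntimeClient({"d459755e"})
remove_idempotent(client, "d459755e")
remove_idempotent(client, "d459755e")  # second call is a no-op instead of an error
```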
containerID={"Type":"cri-o","ID":"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae"} err="failed to get container status \"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae\": rpc error: code = NotFound desc = could not find container \"05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae\": container with ID starting with 05c062a47710f345f01bdc0bb785327d68296f5551e26abb9611b03af7b5a8ae not found: ID does not exist" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.261886 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.270222 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:32 crc kubenswrapper[4756]: E1205 16:39:32.270715 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerName="nova-metadata-metadata" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.270741 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerName="nova-metadata-metadata" Dec 05 16:39:32 crc kubenswrapper[4756]: E1205 16:39:32.270771 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a878ef-27a9-49a2-be83-1a6ba34fab7f" containerName="nova-manage" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.270780 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a878ef-27a9-49a2-be83-1a6ba34fab7f" containerName="nova-manage" Dec 05 16:39:32 crc kubenswrapper[4756]: E1205 16:39:32.270818 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerName="nova-metadata-log" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.270828 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerName="nova-metadata-log" Dec 05 16:39:32 crc kubenswrapper[4756]: E1205 16:39:32.270847 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f060a6c1-48e8-49b5-9283-172dbef53241" containerName="dnsmasq-dns" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.270857 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f060a6c1-48e8-49b5-9283-172dbef53241" containerName="dnsmasq-dns" Dec 05 16:39:32 crc kubenswrapper[4756]: E1205 16:39:32.270892 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f060a6c1-48e8-49b5-9283-172dbef53241" containerName="init" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.270901 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f060a6c1-48e8-49b5-9283-172dbef53241" containerName="init" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.271135 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerName="nova-metadata-log" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.271170 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" containerName="nova-metadata-metadata" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.271185 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f060a6c1-48e8-49b5-9283-172dbef53241" containerName="dnsmasq-dns" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.271207 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a878ef-27a9-49a2-be83-1a6ba34fab7f" containerName="nova-manage" Dec 05 
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.272488 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.275028 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.275223 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.281165 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.406517 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.406973 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mmsw\" (UniqueName: \"kubernetes.io/projected/40355c38-2e19-4399-ad0e-69986aca4afc-kube-api-access-5mmsw\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.407013 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40355c38-2e19-4399-ad0e-69986aca4afc-logs\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.407039 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.407063 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-config-data\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.512717 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0"
Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.512859 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mmsw\" (UniqueName: \"kubernetes.io/projected/40355c38-2e19-4399-ad0e-69986aca4afc-kube-api-access-5mmsw\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0"
\"kubernetes.io/empty-dir/40355c38-2e19-4399-ad0e-69986aca4afc-logs\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.512914 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.512940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-config-data\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.513783 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40355c38-2e19-4399-ad0e-69986aca4afc-logs\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.517351 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-config-data\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.519271 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.524806 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.532351 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mmsw\" (UniqueName: \"kubernetes.io/projected/40355c38-2e19-4399-ad0e-69986aca4afc-kube-api-access-5mmsw\") pod \"nova-metadata-0\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " pod="openstack/nova-metadata-0" Dec 05 16:39:32 crc kubenswrapper[4756]: I1205 16:39:32.589539 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:39:33 crc kubenswrapper[4756]: I1205 16:39:33.038868 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:39:33 crc kubenswrapper[4756]: W1205 16:39:33.040183 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40355c38_2e19_4399_ad0e_69986aca4afc.slice/crio-a37b65e3b140865fce4637d73f03497ef747b7bcd1c6cffc82ad83486581f66c WatchSource:0}: Error finding container a37b65e3b140865fce4637d73f03497ef747b7bcd1c6cffc82ad83486581f66c: Status 404 returned error can't find the container with id a37b65e3b140865fce4637d73f03497ef747b7bcd1c6cffc82ad83486581f66c Dec 05 16:39:33 crc kubenswrapper[4756]: I1205 16:39:33.187672 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40355c38-2e19-4399-ad0e-69986aca4afc","Type":"ContainerStarted","Data":"f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035"} Dec 05 16:39:33 crc kubenswrapper[4756]: I1205 16:39:33.188517 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40355c38-2e19-4399-ad0e-69986aca4afc","Type":"ContainerStarted","Data":"a37b65e3b140865fce4637d73f03497ef747b7bcd1c6cffc82ad83486581f66c"} Dec 05 16:39:34 crc kubenswrapper[4756]: I1205 16:39:34.059981 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="921dc635-3afc-47a5-9cc5-35a41f37bf0e" path="/var/lib/kubelet/pods/921dc635-3afc-47a5-9cc5-35a41f37bf0e/volumes" Dec 05 16:39:34 crc kubenswrapper[4756]: I1205 16:39:34.199450 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40355c38-2e19-4399-ad0e-69986aca4afc","Type":"ContainerStarted","Data":"473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a"} Dec 05 16:39:34 crc kubenswrapper[4756]: I1205 16:39:34.201208 4756 generic.go:334] "Generic (PLEG): container finished" podID="edb5a656-efa8-47ac-befc-2412854889f7" containerID="e34299a73608f080fb60c718ee146253bc5828dfc856b17195372ea6a1daeb46" exitCode=0 Dec 05 16:39:34 crc kubenswrapper[4756]: I1205 16:39:34.201248 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" event={"ID":"edb5a656-efa8-47ac-befc-2412854889f7","Type":"ContainerDied","Data":"e34299a73608f080fb60c718ee146253bc5828dfc856b17195372ea6a1daeb46"} Dec 05 16:39:34 crc kubenswrapper[4756]: I1205 16:39:34.237143 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.23712082 podStartE2EDuration="2.23712082s" podCreationTimestamp="2025-12-05 16:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:34.219268246 +0000 UTC m=+1434.457941778" watchObservedRunningTime="2025-12-05 16:39:34.23712082 +0000 UTC m=+1434.475794332" Dec 05 16:39:34 crc kubenswrapper[4756]: E1205 16:39:34.453560 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 16:39:34 crc kubenswrapper[4756]: E1205 16:39:34.454938 4756 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 16:39:34 crc kubenswrapper[4756]: E1205 16:39:34.457549 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 16:39:34 crc kubenswrapper[4756]: E1205 16:39:34.457587 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5a04745b-1666-4d6b-ab59-706b04cf4160" containerName="nova-scheduler-scheduler" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.711370 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.890670 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-combined-ca-bundle\") pod \"edb5a656-efa8-47ac-befc-2412854889f7\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.890767 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvfh4\" (UniqueName: \"kubernetes.io/projected/edb5a656-efa8-47ac-befc-2412854889f7-kube-api-access-fvfh4\") pod \"edb5a656-efa8-47ac-befc-2412854889f7\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.890823 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-config-data\") pod \"edb5a656-efa8-47ac-befc-2412854889f7\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.890885 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-scripts\") pod \"edb5a656-efa8-47ac-befc-2412854889f7\" (UID: \"edb5a656-efa8-47ac-befc-2412854889f7\") " Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.896163 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-scripts" (OuterVolumeSpecName: "scripts") pod "edb5a656-efa8-47ac-befc-2412854889f7" (UID: "edb5a656-efa8-47ac-befc-2412854889f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.897007 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb5a656-efa8-47ac-befc-2412854889f7-kube-api-access-fvfh4" (OuterVolumeSpecName: "kube-api-access-fvfh4") pod "edb5a656-efa8-47ac-befc-2412854889f7" (UID: "edb5a656-efa8-47ac-befc-2412854889f7"). InnerVolumeSpecName "kube-api-access-fvfh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.919316 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edb5a656-efa8-47ac-befc-2412854889f7" (UID: "edb5a656-efa8-47ac-befc-2412854889f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.930623 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-config-data" (OuterVolumeSpecName: "config-data") pod "edb5a656-efa8-47ac-befc-2412854889f7" (UID: "edb5a656-efa8-47ac-befc-2412854889f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.982018 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.993034 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.993076 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvfh4\" (UniqueName: \"kubernetes.io/projected/edb5a656-efa8-47ac-befc-2412854889f7-kube-api-access-fvfh4\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.993090 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:35 crc kubenswrapper[4756]: I1205 16:39:35.993105 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb5a656-efa8-47ac-befc-2412854889f7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.094785 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-log-httpd\") pod \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.095310 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0c31b46-0af4-4e7f-9a32-06ee16a766aa" (UID: "a0c31b46-0af4-4e7f-9a32-06ee16a766aa"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.094827 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-combined-ca-bundle\") pod \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.095513 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slhgx\" (UniqueName: \"kubernetes.io/projected/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-kube-api-access-slhgx\") pod \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.096472 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-scripts\") pod \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.096536 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-run-httpd\") pod \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.096564 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-sg-core-conf-yaml\") pod \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.096724 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-config-data\") pod \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\" (UID: \"a0c31b46-0af4-4e7f-9a32-06ee16a766aa\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.096885 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0c31b46-0af4-4e7f-9a32-06ee16a766aa" (UID: "a0c31b46-0af4-4e7f-9a32-06ee16a766aa"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.100065 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.100101 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.100467 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-scripts" (OuterVolumeSpecName: "scripts") pod "a0c31b46-0af4-4e7f-9a32-06ee16a766aa" (UID: "a0c31b46-0af4-4e7f-9a32-06ee16a766aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.115542 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.117250 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-kube-api-access-slhgx" (OuterVolumeSpecName: "kube-api-access-slhgx") pod "a0c31b46-0af4-4e7f-9a32-06ee16a766aa" (UID: "a0c31b46-0af4-4e7f-9a32-06ee16a766aa"). InnerVolumeSpecName "kube-api-access-slhgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.123971 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0c31b46-0af4-4e7f-9a32-06ee16a766aa" (UID: "a0c31b46-0af4-4e7f-9a32-06ee16a766aa"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.179055 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0c31b46-0af4-4e7f-9a32-06ee16a766aa" (UID: "a0c31b46-0af4-4e7f-9a32-06ee16a766aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.202640 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpgvx\" (UniqueName: \"kubernetes.io/projected/5a04745b-1666-4d6b-ab59-706b04cf4160-kube-api-access-mpgvx\") pod \"5a04745b-1666-4d6b-ab59-706b04cf4160\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.202770 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-combined-ca-bundle\") pod \"5a04745b-1666-4d6b-ab59-706b04cf4160\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.202833 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-config-data\") pod \"5a04745b-1666-4d6b-ab59-706b04cf4160\" (UID: \"5a04745b-1666-4d6b-ab59-706b04cf4160\") " Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.203572 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.203590 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.203602 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slhgx\" (UniqueName: \"kubernetes.io/projected/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-kube-api-access-slhgx\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.203617 4756 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.206831 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a04745b-1666-4d6b-ab59-706b04cf4160-kube-api-access-mpgvx" (OuterVolumeSpecName: "kube-api-access-mpgvx") pod "5a04745b-1666-4d6b-ab59-706b04cf4160" (UID: "5a04745b-1666-4d6b-ab59-706b04cf4160"). InnerVolumeSpecName "kube-api-access-mpgvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.211352 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-config-data" (OuterVolumeSpecName: "config-data") pod "a0c31b46-0af4-4e7f-9a32-06ee16a766aa" (UID: "a0c31b46-0af4-4e7f-9a32-06ee16a766aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.223397 4756 generic.go:334] "Generic (PLEG): container finished" podID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerID="5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4" exitCode=0 Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.223568 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerDied","Data":"5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4"} Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.223602 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c31b46-0af4-4e7f-9a32-06ee16a766aa","Type":"ContainerDied","Data":"8fbfd38d5fc57ddbe763a4e1ee558bfaae1dec034dfa747c10767e6607b37ba0"} Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.223649 4756 scope.go:117] "RemoveContainer" containerID="4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.223902 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.228155 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" event={"ID":"edb5a656-efa8-47ac-befc-2412854889f7","Type":"ContainerDied","Data":"7570cd48062961401ae83a772b7c7d96226d8b4f8308767cc1a1c99a167b39eb"} Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.228188 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4dl5v" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.228205 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7570cd48062961401ae83a772b7c7d96226d8b4f8308767cc1a1c99a167b39eb" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.234941 4756 generic.go:334] "Generic (PLEG): container finished" podID="5a04745b-1666-4d6b-ab59-706b04cf4160" containerID="ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a" exitCode=0 Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.234991 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5a04745b-1666-4d6b-ab59-706b04cf4160","Type":"ContainerDied","Data":"ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a"} Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.235022 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5a04745b-1666-4d6b-ab59-706b04cf4160","Type":"ContainerDied","Data":"fb36993d0a652343770d877a82f25cb5fd76f37e666269d9505b69c2938119e7"} Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.235066 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.239865 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a04745b-1666-4d6b-ab59-706b04cf4160" (UID: "5a04745b-1666-4d6b-ab59-706b04cf4160"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.240614 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-config-data" (OuterVolumeSpecName: "config-data") pod "5a04745b-1666-4d6b-ab59-706b04cf4160" (UID: "5a04745b-1666-4d6b-ab59-706b04cf4160"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.254052 4756 scope.go:117] "RemoveContainer" containerID="459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.278637 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.301766 4756 scope.go:117] "RemoveContainer" containerID="5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.304775 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.305310 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpgvx\" (UniqueName: \"kubernetes.io/projected/5a04745b-1666-4d6b-ab59-706b04cf4160-kube-api-access-mpgvx\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.305340 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c31b46-0af4-4e7f-9a32-06ee16a766aa-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.305350 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.305361 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a04745b-1666-4d6b-ab59-706b04cf4160-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.341685 4756 scope.go:117] "RemoveContainer" containerID="5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.350933 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.351384 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="ceilometer-central-agent" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351398 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="ceilometer-central-agent" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.351443 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb5a656-efa8-47ac-befc-2412854889f7" containerName="nova-cell1-conductor-db-sync" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351449 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb5a656-efa8-47ac-befc-2412854889f7" containerName="nova-cell1-conductor-db-sync" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.351472 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="sg-core" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351478 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="sg-core" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.351491 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="ceilometer-notification-agent" Dec 05 
16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351497 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="ceilometer-notification-agent" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.351507 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="proxy-httpd" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351512 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="proxy-httpd" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.351525 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a04745b-1666-4d6b-ab59-706b04cf4160" containerName="nova-scheduler-scheduler" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351531 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a04745b-1666-4d6b-ab59-706b04cf4160" containerName="nova-scheduler-scheduler" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351696 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="ceilometer-central-agent" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351719 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="sg-core" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351727 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a04745b-1666-4d6b-ab59-706b04cf4160" containerName="nova-scheduler-scheduler" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351737 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="ceilometer-notification-agent" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351744 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" containerName="proxy-httpd" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.351755 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb5a656-efa8-47ac-befc-2412854889f7" containerName="nova-cell1-conductor-db-sync" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.353386 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.357405 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.358721 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.358817 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.360722 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.380102 4756 scope.go:117] "RemoveContainer" containerID="4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.389823 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9\": container with ID starting with 4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9 not found: ID does not exist" containerID="4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.390129 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9"} err="failed to get container status \"4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9\": rpc error: code = NotFound desc = could not find container \"4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9\": container with ID starting with 4ac9c6438faed2c3da581966d462cca9da347859aefd847e1c08a3c6a331d9f9 not found: ID does not exist" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.390264 4756 scope.go:117] "RemoveContainer" containerID="459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.391276 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.392879 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.396199 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367\": container with ID starting with 459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367 not found: ID does not exist" containerID="459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.396265 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367"} err="failed to get container status \"459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367\": rpc error: code = NotFound desc = could not find container \"459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367\": container with ID starting with 459891e34350f5b654672466cfa986677c601900125058b1863760a6d635e367 not found: ID does not exist" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.396294 4756 scope.go:117] "RemoveContainer" containerID="5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.401742 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4\": container with ID starting with 5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4 not found: ID does not exist" containerID="5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.401814 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4"} err="failed to get container status \"5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4\": rpc error: code = NotFound desc = could not find container \"5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4\": container with ID starting with 5604a219eeaff30b39a5446471359f5127d1a5b1fbbfeb672889ea616a2c3aa4 not found: ID does not exist" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.401857 4756 scope.go:117] "RemoveContainer" containerID="5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.404785 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b\": container with ID starting with 5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b not found: ID does not exist" containerID="5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.404859 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b"} err="failed to get container status \"5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b\": rpc error: code = NotFound desc = could not find container \"5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b\": container with ID starting with 
5db63ea4a559119523d3e8bb8f75218cc114aadc379f3aae7d36834ca3a9a98b not found: ID does not exist" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.404891 4756 scope.go:117] "RemoveContainer" containerID="ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.409770 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.410067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.410155 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmzl\" (UniqueName: \"kubernetes.io/projected/dc0118af-131e-409a-a18b-0be03f3ce81b-kube-api-access-fdmzl\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.413434 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.429949 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.444433 4756 scope.go:117] "RemoveContainer" containerID="ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a" Dec 05 16:39:36 crc kubenswrapper[4756]: E1205 16:39:36.455725 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a\": container with ID starting with ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a not found: ID does not exist" containerID="ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.456007 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a"} err="failed to get container status \"ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a\": rpc error: code = NotFound desc = could not find container \"ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a\": container with ID starting with ddc20c25aa72a78a8e26b7c521fc7a31ae86c3a0a05f9a2043bcd6fb7903020a not found: ID does not exist" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.511733 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-scripts\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.514056 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-glcj5\" (UniqueName: \"kubernetes.io/projected/95fc9575-c077-4bff-8879-53e06fe4fa2b-kube-api-access-glcj5\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.514308 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-log-httpd\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.514469 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-run-httpd\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.514606 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.514776 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.515686 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmzl\" (UniqueName: \"kubernetes.io/projected/dc0118af-131e-409a-a18b-0be03f3ce81b-kube-api-access-fdmzl\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.516194 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.516361 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-config-data\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.516533 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.520902 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.520784 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.520010 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.576108 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmzl\" (UniqueName: \"kubernetes.io/projected/dc0118af-131e-409a-a18b-0be03f3ce81b-kube-api-access-fdmzl\") pod \"nova-cell1-conductor-0\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.622224 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glcj5\" (UniqueName: \"kubernetes.io/projected/95fc9575-c077-4bff-8879-53e06fe4fa2b-kube-api-access-glcj5\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.622283 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-log-httpd\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.622311 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-run-httpd\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.622329 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.622388 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.622425 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-config-data\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.622458 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.622496 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-scripts\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.623369 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-log-httpd\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.626259 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-scripts\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.627896 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-config-data\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.628839 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.629004 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-run-httpd\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.631210 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.636633 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.647147 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glcj5\" (UniqueName: \"kubernetes.io/projected/95fc9575-c077-4bff-8879-53e06fe4fa2b-kube-api-access-glcj5\") pod \"ceilometer-0\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.687608 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.733988 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.873075 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.887643 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.898758 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.900248 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.902608 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.911723 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.926700 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2pmf\" (UniqueName: \"kubernetes.io/projected/2011aac5-dbbc-4244-aab4-2bb2abc25cee-kube-api-access-q2pmf\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.926742 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-config-data\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:36 crc kubenswrapper[4756]: I1205 16:39:36.926802 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.027428 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.028139 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2pmf\" (UniqueName: \"kubernetes.io/projected/2011aac5-dbbc-4244-aab4-2bb2abc25cee-kube-api-access-q2pmf\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.028183 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-config-data\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.036506 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-config-data\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.038877 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.048615 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2pmf\" (UniqueName: \"kubernetes.io/projected/2011aac5-dbbc-4244-aab4-2bb2abc25cee-kube-api-access-q2pmf\") pod \"nova-scheduler-0\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " pod="openstack/nova-scheduler-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.135436 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.218640 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.231346 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-combined-ca-bundle\") pod \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.231475 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-config-data\") pod \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.231541 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146bf072-8ade-4d0f-91cf-4a58c4c2f136-logs\") pod \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.231562 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjgjp\" (UniqueName: \"kubernetes.io/projected/146bf072-8ade-4d0f-91cf-4a58c4c2f136-kube-api-access-qjgjp\") pod \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\" (UID: \"146bf072-8ade-4d0f-91cf-4a58c4c2f136\") " Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.232035 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146bf072-8ade-4d0f-91cf-4a58c4c2f136-logs" (OuterVolumeSpecName: "logs") pod "146bf072-8ade-4d0f-91cf-4a58c4c2f136" (UID: "146bf072-8ade-4d0f-91cf-4a58c4c2f136"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.235602 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146bf072-8ade-4d0f-91cf-4a58c4c2f136-kube-api-access-qjgjp" (OuterVolumeSpecName: "kube-api-access-qjgjp") pod "146bf072-8ade-4d0f-91cf-4a58c4c2f136" (UID: "146bf072-8ade-4d0f-91cf-4a58c4c2f136"). InnerVolumeSpecName "kube-api-access-qjgjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.236087 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:39:37 crc kubenswrapper[4756]: W1205 16:39:37.238641 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95fc9575_c077_4bff_8879_53e06fe4fa2b.slice/crio-d1de21cbaada33dd87b1c898f5717e2407363029f7a06abc14dbc6492b2369cd WatchSource:0}: Error finding container d1de21cbaada33dd87b1c898f5717e2407363029f7a06abc14dbc6492b2369cd: Status 404 returned error can't find the container with id d1de21cbaada33dd87b1c898f5717e2407363029f7a06abc14dbc6492b2369cd Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.247402 4756 generic.go:334] "Generic (PLEG): container finished" podID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerID="6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb" exitCode=0 Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.247462 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"146bf072-8ade-4d0f-91cf-4a58c4c2f136","Type":"ContainerDied","Data":"6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb"} Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.247490 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"146bf072-8ade-4d0f-91cf-4a58c4c2f136","Type":"ContainerDied","Data":"f13d14af91f746f21fa55026783436d3916749cd0d54ffaddfe76af99f455d6b"} Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.247506 4756 scope.go:117] "RemoveContainer" containerID="6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.247900 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.268315 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "146bf072-8ade-4d0f-91cf-4a58c4c2f136" (UID: "146bf072-8ade-4d0f-91cf-4a58c4c2f136"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.273952 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-config-data" (OuterVolumeSpecName: "config-data") pod "146bf072-8ade-4d0f-91cf-4a58c4c2f136" (UID: "146bf072-8ade-4d0f-91cf-4a58c4c2f136"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.280951 4756 scope.go:117] "RemoveContainer" containerID="ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.304845 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.315703 4756 scope.go:117] "RemoveContainer" containerID="6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb" Dec 05 16:39:37 crc kubenswrapper[4756]: E1205 16:39:37.317691 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb\": container with ID starting with 6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb not found: ID does not exist" containerID="6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.317748 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb"} err="failed to get container status \"6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb\": rpc error: code = NotFound desc = could not find container \"6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb\": container with ID starting with 6bce93f5a6dc5ee89c32717f642c74ab41380473213291ede71e14b699b0b4cb not found: ID does not exist" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.317775 4756 scope.go:117] "RemoveContainer" containerID="ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90" Dec 05 16:39:37 crc kubenswrapper[4756]: E1205 16:39:37.318902 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90\": container with ID starting with ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90 not found: ID does not exist" containerID="ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.318945 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90"} err="failed to get container status \"ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90\": rpc error: code = NotFound desc = could not find container \"ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90\": container with ID starting with ac2d8cfcc5a2656696d38fccdcbbdb164ee070624c43285298931cb54080ef90 not found: ID does not exist" Dec 05 16:39:37 crc kubenswrapper[4756]: W1205 16:39:37.325524 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc0118af_131e_409a_a18b_0be03f3ce81b.slice/crio-faac77f71518e440dd1ffc3f6e6c38fa17bdd9b4d2b40ff8b14a1d554bd9d538 WatchSource:0}: Error finding container faac77f71518e440dd1ffc3f6e6c38fa17bdd9b4d2b40ff8b14a1d554bd9d538: Status 404 returned error can't find the container with id faac77f71518e440dd1ffc3f6e6c38fa17bdd9b4d2b40ff8b14a1d554bd9d538 Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.333191 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.333219 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146bf072-8ade-4d0f-91cf-4a58c4c2f136-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.333232 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146bf072-8ade-4d0f-91cf-4a58c4c2f136-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.333242 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjgjp\" (UniqueName: \"kubernetes.io/projected/146bf072-8ade-4d0f-91cf-4a58c4c2f136-kube-api-access-qjgjp\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.589661 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.589755 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.624336 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.642605 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.656629 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:37 crc kubenswrapper[4756]: E1205 16:39:37.657107 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-log" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.657127 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-log" Dec 05 16:39:37 crc kubenswrapper[4756]: E1205 16:39:37.657146 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-api" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.657151 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-api" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.657336 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-log" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.657355 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" containerName="nova-api-api" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.658447 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.660679 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.666773 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.703559 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.747022 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4430d3-543d-4341-a279-6491c347a185-logs\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.747076 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22qvh\" (UniqueName: \"kubernetes.io/projected/0a4430d3-543d-4341-a279-6491c347a185-kube-api-access-22qvh\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.747268 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-config-data\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.747307 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.848275 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-config-data\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.848335 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.848476 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4430d3-543d-4341-a279-6491c347a185-logs\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.848504 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22qvh\" (UniqueName: \"kubernetes.io/projected/0a4430d3-543d-4341-a279-6491c347a185-kube-api-access-22qvh\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.849077 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0a4430d3-543d-4341-a279-6491c347a185-logs\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.853142 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.862087 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-config-data\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.865979 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22qvh\" (UniqueName: \"kubernetes.io/projected/0a4430d3-543d-4341-a279-6491c347a185-kube-api-access-22qvh\") pod \"nova-api-0\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " pod="openstack/nova-api-0" Dec 05 16:39:37 crc kubenswrapper[4756]: I1205 16:39:37.978240 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.061931 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146bf072-8ade-4d0f-91cf-4a58c4c2f136" path="/var/lib/kubelet/pods/146bf072-8ade-4d0f-91cf-4a58c4c2f136/volumes" Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.070850 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a04745b-1666-4d6b-ab59-706b04cf4160" path="/var/lib/kubelet/pods/5a04745b-1666-4d6b-ab59-706b04cf4160/volumes" Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.072561 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c31b46-0af4-4e7f-9a32-06ee16a766aa" path="/var/lib/kubelet/pods/a0c31b46-0af4-4e7f-9a32-06ee16a766aa/volumes" Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.279933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerStarted","Data":"bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f"} Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.279977 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerStarted","Data":"d1de21cbaada33dd87b1c898f5717e2407363029f7a06abc14dbc6492b2369cd"} Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.282945 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2011aac5-dbbc-4244-aab4-2bb2abc25cee","Type":"ContainerStarted","Data":"81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c"} Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.282981 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2011aac5-dbbc-4244-aab4-2bb2abc25cee","Type":"ContainerStarted","Data":"51ca394c9a2b9ab069cd2fd5f332a92e0806e6d648caecb792e21e515c85ebc9"} Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.290643 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"dc0118af-131e-409a-a18b-0be03f3ce81b","Type":"ContainerStarted","Data":"14a9c94a34d46580da78a252edc6fc101e1053fd75e54da245fe2728758527ee"} Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.290694 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dc0118af-131e-409a-a18b-0be03f3ce81b","Type":"ContainerStarted","Data":"faac77f71518e440dd1ffc3f6e6c38fa17bdd9b4d2b40ff8b14a1d554bd9d538"} Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.290711 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.307033 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.307015997 podStartE2EDuration="2.307015997s" podCreationTimestamp="2025-12-05 16:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:38.305433064 +0000 UTC m=+1438.544106576" watchObservedRunningTime="2025-12-05 16:39:38.307015997 +0000 UTC m=+1438.545689509" Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.415859 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.415837047 podStartE2EDuration="2.415837047s" podCreationTimestamp="2025-12-05 16:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:38.324155922 +0000 UTC m=+1438.562829434" watchObservedRunningTime="2025-12-05 16:39:38.415837047 +0000 UTC m=+1438.654510579" Dec 05 16:39:38 crc kubenswrapper[4756]: I1205 16:39:38.425385 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:39:38 crc kubenswrapper[4756]: W1205 16:39:38.428088 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4430d3_543d_4341_a279_6491c347a185.slice/crio-35c1baef4c5cc53732553674c699ad2454da2cb37cb6d2b8784d55918aa1d88f WatchSource:0}: Error finding container 35c1baef4c5cc53732553674c699ad2454da2cb37cb6d2b8784d55918aa1d88f: Status 404 returned error can't find the container with id 35c1baef4c5cc53732553674c699ad2454da2cb37cb6d2b8784d55918aa1d88f Dec 05 16:39:39 crc kubenswrapper[4756]: I1205 16:39:39.305446 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4430d3-543d-4341-a279-6491c347a185","Type":"ContainerStarted","Data":"465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197"} Dec 05 16:39:39 crc kubenswrapper[4756]: I1205 16:39:39.305847 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4430d3-543d-4341-a279-6491c347a185","Type":"ContainerStarted","Data":"7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770"} Dec 05 16:39:39 crc kubenswrapper[4756]: I1205 16:39:39.305861 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4430d3-543d-4341-a279-6491c347a185","Type":"ContainerStarted","Data":"35c1baef4c5cc53732553674c699ad2454da2cb37cb6d2b8784d55918aa1d88f"} Dec 05 16:39:39 crc kubenswrapper[4756]: I1205 16:39:39.309695 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerStarted","Data":"ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910"} Dec 05 16:39:39 crc kubenswrapper[4756]: I1205 16:39:39.332269 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.332249042 podStartE2EDuration="2.332249042s" podCreationTimestamp="2025-12-05 16:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:39.329544379 +0000 UTC m=+1439.568217911" watchObservedRunningTime="2025-12-05 16:39:39.332249042 +0000 UTC m=+1439.570922554" Dec 05 16:39:40 crc kubenswrapper[4756]: I1205 16:39:40.320794 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerStarted","Data":"a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd"} Dec 05 16:39:40 crc kubenswrapper[4756]: I1205 16:39:40.530556 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 16:39:41 crc kubenswrapper[4756]: I1205 16:39:41.338475 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerStarted","Data":"29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72"} Dec 05 16:39:41 crc kubenswrapper[4756]: I1205 16:39:41.339037 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 16:39:41 crc kubenswrapper[4756]: I1205 16:39:41.361454 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.12556998 podStartE2EDuration="5.361390433s" podCreationTimestamp="2025-12-05 16:39:36 +0000 UTC" firstStartedPulling="2025-12-05 16:39:37.256860568 +0000 UTC m=+1437.495534080" lastFinishedPulling="2025-12-05 16:39:40.492681021 +0000 UTC m=+1440.731354533" observedRunningTime="2025-12-05 16:39:41.359124181 +0000 UTC m=+1441.597797683" watchObservedRunningTime="2025-12-05 16:39:41.361390433 +0000 UTC m=+1441.600063945" Dec 05 16:39:42 crc kubenswrapper[4756]: I1205 16:39:42.237509 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 16:39:42 crc kubenswrapper[4756]: I1205 16:39:42.590888 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 16:39:42 crc kubenswrapper[4756]: I1205 16:39:42.591420 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 16:39:43 crc kubenswrapper[4756]: I1205 16:39:43.607649 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 16:39:43 crc kubenswrapper[4756]: I1205 16:39:43.607680 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 16:39:46 crc kubenswrapper[4756]: I1205 
16:39:46.763380 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 16:39:47 crc kubenswrapper[4756]: I1205 16:39:47.237738 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 16:39:47 crc kubenswrapper[4756]: I1205 16:39:47.266434 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 16:39:47 crc kubenswrapper[4756]: I1205 16:39:47.434076 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 16:39:47 crc kubenswrapper[4756]: I1205 16:39:47.978827 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 16:39:47 crc kubenswrapper[4756]: I1205 16:39:47.979433 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 16:39:49 crc kubenswrapper[4756]: I1205 16:39:49.019649 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 16:39:49 crc kubenswrapper[4756]: I1205 16:39:49.060727 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 16:39:52 crc kubenswrapper[4756]: I1205 16:39:52.294962 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:39:52 crc kubenswrapper[4756]: I1205 16:39:52.295034 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:39:52 crc kubenswrapper[4756]: I1205 16:39:52.295083 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:39:52 crc kubenswrapper[4756]: I1205 16:39:52.295652 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d4a53385c8aee53151ea619736690865a1fdc90406e455c625f67e894ce004c"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:39:52 crc kubenswrapper[4756]: I1205 16:39:52.295720 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://4d4a53385c8aee53151ea619736690865a1fdc90406e455c625f67e894ce004c" gracePeriod=600 Dec 05 16:39:52 crc kubenswrapper[4756]: I1205 16:39:52.594937 4756 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 16:39:52 crc kubenswrapper[4756]: I1205 16:39:52.599023 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 16:39:52 crc kubenswrapper[4756]: I1205 16:39:52.599203 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 16:39:53 crc kubenswrapper[4756]: I1205 16:39:53.487310 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="4d4a53385c8aee53151ea619736690865a1fdc90406e455c625f67e894ce004c" exitCode=0 Dec 05 16:39:53 crc kubenswrapper[4756]: I1205 16:39:53.487405 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"4d4a53385c8aee53151ea619736690865a1fdc90406e455c625f67e894ce004c"} Dec 05 16:39:53 crc kubenswrapper[4756]: I1205 16:39:53.488905 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"} Dec 05 16:39:53 crc kubenswrapper[4756]: I1205 16:39:53.488953 4756 scope.go:117] "RemoveContainer" containerID="314160243b8e9a333f056208abaaefb43e7de34a3e41ec79e08a896638376f87" Dec 05 16:39:53 crc kubenswrapper[4756]: I1205 16:39:53.501669 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.377220 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.503094 4756 generic.go:334] "Generic (PLEG): container finished" podID="5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" containerID="08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe" exitCode=137 Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.503152 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.503201 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e","Type":"ContainerDied","Data":"08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe"} Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.503230 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e","Type":"ContainerDied","Data":"7fa6569015dcd2a34ee28de051fc69e81e56e52c81271abf95d98059e0b325f2"} Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.503250 4756 scope.go:117] "RemoveContainer" containerID="08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.531445 4756 scope.go:117] "RemoveContainer" containerID="08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe" Dec 05 16:39:54 crc kubenswrapper[4756]: E1205 16:39:54.531923 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe\": container with ID starting with 08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe not found: ID does not exist" containerID="08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.531990 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe"} err="failed to get container status \"08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe\": rpc error: code = NotFound desc = could not find container \"08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe\": container with ID starting with 08dad36a090d156d9038db80c8b4e8193a7ab1b1ea6cb42dacdca958b9780bbe not found: ID does not exist" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.534678 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-combined-ca-bundle\") pod \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.534823 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-config-data\") pod \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.534965 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2sqd\" (UniqueName: \"kubernetes.io/projected/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-kube-api-access-r2sqd\") pod \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\" (UID: \"5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e\") " Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.541331 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-kube-api-access-r2sqd" (OuterVolumeSpecName: "kube-api-access-r2sqd") pod "5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" (UID: "5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e"). 
InnerVolumeSpecName "kube-api-access-r2sqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.562581 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" (UID: "5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.563617 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-config-data" (OuterVolumeSpecName: "config-data") pod "5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" (UID: "5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.637532 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.637578 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2sqd\" (UniqueName: \"kubernetes.io/projected/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-kube-api-access-r2sqd\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.637592 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.850641 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.860688 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.877787 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:54 crc kubenswrapper[4756]: E1205 16:39:54.878315 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.878338 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.878633 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.879445 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.883169 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.883543 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.883768 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 16:39:54 crc kubenswrapper[4756]: I1205 16:39:54.886943 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.049706 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl5hw\" (UniqueName: \"kubernetes.io/projected/1f517228-5007-4e3c-a41b-1e4b636446c0-kube-api-access-vl5hw\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.049758 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.049791 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.049853 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.049884 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.151436 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.151493 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.151573 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl5hw\" (UniqueName: \"kubernetes.io/projected/1f517228-5007-4e3c-a41b-1e4b636446c0-kube-api-access-vl5hw\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.151601 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.151619 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.156739 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.157395 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.157736 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.167598 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl5hw\" (UniqueName: \"kubernetes.io/projected/1f517228-5007-4e3c-a41b-1e4b636446c0-kube-api-access-vl5hw\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.169791 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.202315 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:39:55 crc kubenswrapper[4756]: I1205 16:39:55.683630 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:39:56 crc kubenswrapper[4756]: I1205 16:39:56.060791 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e" path="/var/lib/kubelet/pods/5ec2cbf1-6c7f-4d0e-adc0-cbbd6a7d995e/volumes" Dec 05 16:39:56 crc kubenswrapper[4756]: I1205 16:39:56.523185 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f517228-5007-4e3c-a41b-1e4b636446c0","Type":"ContainerStarted","Data":"3314201eb3f0a43ba4e986ea1cf96160fc430f0101df1aa8d597f6eafb4f4d90"} Dec 05 16:39:56 crc kubenswrapper[4756]: I1205 16:39:56.523513 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f517228-5007-4e3c-a41b-1e4b636446c0","Type":"ContainerStarted","Data":"3aa7ad4334643d0dfb3acf7cf802e64afb91deb087d7c9ca5087959cefc98cfa"} Dec 05 16:39:56 crc kubenswrapper[4756]: I1205 16:39:56.542398 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.542373419 podStartE2EDuration="2.542373419s" podCreationTimestamp="2025-12-05 16:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:39:56.538534905 +0000 UTC m=+1456.777208437" watchObservedRunningTime="2025-12-05 16:39:56.542373419 +0000 UTC m=+1456.781046951" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.046131 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.047185 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.087611 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.089937 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.546833 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.552291 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.776807 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf4ff87b5-954sn"] Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.778377 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.801085 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf4ff87b5-954sn"] Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.925538 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2xv\" (UniqueName: \"kubernetes.io/projected/c609265d-528c-4c9c-bd1c-b735775d9acc-kube-api-access-fp2xv\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.925907 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-svc\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.926141 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-swift-storage-0\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.926400 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-nb\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.926587 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-config\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:58 crc kubenswrapper[4756]: I1205 16:39:58.926768 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-sb\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.028546 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-swift-storage-0\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.028652 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-nb\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.028685 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-config\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.028744 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-sb\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.028769 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2xv\" (UniqueName: \"kubernetes.io/projected/c609265d-528c-4c9c-bd1c-b735775d9acc-kube-api-access-fp2xv\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.028822 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-svc\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.029764 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-swift-storage-0\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.029772 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-config\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.029769 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-nb\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.029931 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-sb\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.030101 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-svc\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.049121 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2xv\" (UniqueName: 
\"kubernetes.io/projected/c609265d-528c-4c9c-bd1c-b735775d9acc-kube-api-access-fp2xv\") pod \"dnsmasq-dns-cf4ff87b5-954sn\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.101691 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:39:59 crc kubenswrapper[4756]: I1205 16:39:59.614579 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf4ff87b5-954sn"] Dec 05 16:39:59 crc kubenswrapper[4756]: W1205 16:39:59.619015 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc609265d_528c_4c9c_bd1c_b735775d9acc.slice/crio-adda2863650eccc6f9766800246a30289626cae346ab5113ebedcffa9fa120e0 WatchSource:0}: Error finding container adda2863650eccc6f9766800246a30289626cae346ab5113ebedcffa9fa120e0: Status 404 returned error can't find the container with id adda2863650eccc6f9766800246a30289626cae346ab5113ebedcffa9fa120e0 Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.202924 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.565444 4756 generic.go:334] "Generic (PLEG): container finished" podID="c609265d-528c-4c9c-bd1c-b735775d9acc" containerID="be9e958b5cb052a79ce88fba09bec0f4f2bb4d1e023bd71c39ab015d97b9d54b" exitCode=0 Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.565548 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" event={"ID":"c609265d-528c-4c9c-bd1c-b735775d9acc","Type":"ContainerDied","Data":"be9e958b5cb052a79ce88fba09bec0f4f2bb4d1e023bd71c39ab015d97b9d54b"} Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.565619 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" event={"ID":"c609265d-528c-4c9c-bd1c-b735775d9acc","Type":"ContainerStarted","Data":"adda2863650eccc6f9766800246a30289626cae346ab5113ebedcffa9fa120e0"} Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.951710 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.952041 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="ceilometer-central-agent" containerID="cri-o://bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f" gracePeriod=30 Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.952144 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="proxy-httpd" containerID="cri-o://29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72" gracePeriod=30 Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.952214 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="sg-core" containerID="cri-o://a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd" gracePeriod=30 Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.952192 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="ceilometer-notification-agent" containerID="cri-o://ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910" gracePeriod=30 Dec 05 16:40:00 crc kubenswrapper[4756]: I1205 16:40:00.965317 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.192:3000/\": EOF" Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.468775 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.576373 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" event={"ID":"c609265d-528c-4c9c-bd1c-b735775d9acc","Type":"ContainerStarted","Data":"6d5b71ea955f60a17745011df48bb423d4c7c7c82b3f516d2feb518405839e09"} Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.576530 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.581023 4756 generic.go:334] "Generic (PLEG): container finished" podID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerID="29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72" exitCode=0 Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.581063 4756 generic.go:334] "Generic (PLEG): container finished" podID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerID="a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd" exitCode=2 Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.581072 4756 generic.go:334] "Generic (PLEG): container finished" podID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerID="bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f" exitCode=0 Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.581105 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerDied","Data":"29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72"} Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.581154 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerDied","Data":"a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd"} Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.581169 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerDied","Data":"bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f"} Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.581298 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-log" containerID="cri-o://7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770" gracePeriod=30 Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.581363 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-api" containerID="cri-o://465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197" gracePeriod=30 Dec 05 16:40:01 crc kubenswrapper[4756]: I1205 16:40:01.612076 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" podStartSLOduration=3.612054181 podStartE2EDuration="3.612054181s" podCreationTimestamp="2025-12-05 16:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:40:01.600314633 +0000 UTC m=+1461.838988155" watchObservedRunningTime="2025-12-05 16:40:01.612054181 +0000 UTC m=+1461.850727693" Dec 05 16:40:02 crc kubenswrapper[4756]: I1205 16:40:02.643482 4756 generic.go:334] "Generic (PLEG): container finished" podID="0a4430d3-543d-4341-a279-6491c347a185" containerID="7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770" exitCode=143 Dec 05 16:40:02 crc kubenswrapper[4756]: I1205 16:40:02.643546 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4430d3-543d-4341-a279-6491c347a185","Type":"ContainerDied","Data":"7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770"} Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.166354 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.172657 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.205656 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.240857 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.249946 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-combined-ca-bundle\") pod \"95fc9575-c077-4bff-8879-53e06fe4fa2b\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.249998 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-combined-ca-bundle\") pod \"0a4430d3-543d-4341-a279-6491c347a185\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250065 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glcj5\" (UniqueName: \"kubernetes.io/projected/95fc9575-c077-4bff-8879-53e06fe4fa2b-kube-api-access-glcj5\") pod \"95fc9575-c077-4bff-8879-53e06fe4fa2b\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250101 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4430d3-543d-4341-a279-6491c347a185-logs\") pod \"0a4430d3-543d-4341-a279-6491c347a185\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250152 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-run-httpd\") pod \"95fc9575-c077-4bff-8879-53e06fe4fa2b\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " Dec 05 16:40:05 crc 
kubenswrapper[4756]: I1205 16:40:05.250179 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-config-data\") pod \"0a4430d3-543d-4341-a279-6491c347a185\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250209 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-log-httpd\") pod \"95fc9575-c077-4bff-8879-53e06fe4fa2b\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250252 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-sg-core-conf-yaml\") pod \"95fc9575-c077-4bff-8879-53e06fe4fa2b\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250276 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-ceilometer-tls-certs\") pod \"95fc9575-c077-4bff-8879-53e06fe4fa2b\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250475 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-scripts\") pod \"95fc9575-c077-4bff-8879-53e06fe4fa2b\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250508 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-config-data\") pod \"95fc9575-c077-4bff-8879-53e06fe4fa2b\" (UID: \"95fc9575-c077-4bff-8879-53e06fe4fa2b\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250545 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22qvh\" (UniqueName: \"kubernetes.io/projected/0a4430d3-543d-4341-a279-6491c347a185-kube-api-access-22qvh\") pod \"0a4430d3-543d-4341-a279-6491c347a185\" (UID: \"0a4430d3-543d-4341-a279-6491c347a185\") " Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.250817 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a4430d3-543d-4341-a279-6491c347a185-logs" (OuterVolumeSpecName: "logs") pod "0a4430d3-543d-4341-a279-6491c347a185" (UID: "0a4430d3-543d-4341-a279-6491c347a185"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.254116 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95fc9575-c077-4bff-8879-53e06fe4fa2b" (UID: "95fc9575-c077-4bff-8879-53e06fe4fa2b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.255376 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95fc9575-c077-4bff-8879-53e06fe4fa2b" (UID: "95fc9575-c077-4bff-8879-53e06fe4fa2b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.258004 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fc9575-c077-4bff-8879-53e06fe4fa2b-kube-api-access-glcj5" (OuterVolumeSpecName: "kube-api-access-glcj5") pod "95fc9575-c077-4bff-8879-53e06fe4fa2b" (UID: "95fc9575-c077-4bff-8879-53e06fe4fa2b"). InnerVolumeSpecName "kube-api-access-glcj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.261982 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-scripts" (OuterVolumeSpecName: "scripts") pod "95fc9575-c077-4bff-8879-53e06fe4fa2b" (UID: "95fc9575-c077-4bff-8879-53e06fe4fa2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.265084 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4430d3-543d-4341-a279-6491c347a185-kube-api-access-22qvh" (OuterVolumeSpecName: "kube-api-access-22qvh") pod "0a4430d3-543d-4341-a279-6491c347a185" (UID: "0a4430d3-543d-4341-a279-6491c347a185"). InnerVolumeSpecName "kube-api-access-22qvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.286155 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-config-data" (OuterVolumeSpecName: "config-data") pod "0a4430d3-543d-4341-a279-6491c347a185" (UID: "0a4430d3-543d-4341-a279-6491c347a185"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.290642 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95fc9575-c077-4bff-8879-53e06fe4fa2b" (UID: "95fc9575-c077-4bff-8879-53e06fe4fa2b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.293677 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a4430d3-543d-4341-a279-6491c347a185" (UID: "0a4430d3-543d-4341-a279-6491c347a185"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.321617 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "95fc9575-c077-4bff-8879-53e06fe4fa2b" (UID: "95fc9575-c077-4bff-8879-53e06fe4fa2b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352401 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352453 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22qvh\" (UniqueName: \"kubernetes.io/projected/0a4430d3-543d-4341-a279-6491c347a185-kube-api-access-22qvh\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352501 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352512 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glcj5\" (UniqueName: \"kubernetes.io/projected/95fc9575-c077-4bff-8879-53e06fe4fa2b-kube-api-access-glcj5\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352523 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4430d3-543d-4341-a279-6491c347a185-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352533 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352546 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4430d3-543d-4341-a279-6491c347a185-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352556 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95fc9575-c077-4bff-8879-53e06fe4fa2b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352566 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.352577 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.356371 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95fc9575-c077-4bff-8879-53e06fe4fa2b" (UID: "95fc9575-c077-4bff-8879-53e06fe4fa2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.385956 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-config-data" (OuterVolumeSpecName: "config-data") pod "95fc9575-c077-4bff-8879-53e06fe4fa2b" (UID: "95fc9575-c077-4bff-8879-53e06fe4fa2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.453811 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.453931 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fc9575-c077-4bff-8879-53e06fe4fa2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.671916 4756 generic.go:334] "Generic (PLEG): container finished" podID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerID="ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910" exitCode=0 Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.671981 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.672014 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerDied","Data":"ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910"} Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.672063 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95fc9575-c077-4bff-8879-53e06fe4fa2b","Type":"ContainerDied","Data":"d1de21cbaada33dd87b1c898f5717e2407363029f7a06abc14dbc6492b2369cd"} Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.672087 4756 scope.go:117] "RemoveContainer" containerID="29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.678665 4756 generic.go:334] "Generic (PLEG): container finished" podID="0a4430d3-543d-4341-a279-6491c347a185" containerID="465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197" exitCode=0 Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.678721 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4430d3-543d-4341-a279-6491c347a185","Type":"ContainerDied","Data":"465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197"} Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.678767 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.678778 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a4430d3-543d-4341-a279-6491c347a185","Type":"ContainerDied","Data":"35c1baef4c5cc53732553674c699ad2454da2cb37cb6d2b8784d55918aa1d88f"} Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.705472 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.789050 4756 scope.go:117] "RemoveContainer" containerID="a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.807747 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.845882 4756 scope.go:117] "RemoveContainer" containerID="ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.849776 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.860703 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.872500 4756 scope.go:117] "RemoveContainer" containerID="bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.872664 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.880877 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.881633 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="ceilometer-central-agent" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881656 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="ceilometer-central-agent" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.881672 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-api" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881682 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-api" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.881693 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="proxy-httpd" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881701 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="proxy-httpd" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.881717 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-log" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881722 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-log" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.881737 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" 
containerName="ceilometer-notification-agent" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881743 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="ceilometer-notification-agent" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.881765 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="sg-core" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881772 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="sg-core" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881965 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="ceilometer-notification-agent" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881979 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-log" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.881991 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4430d3-543d-4341-a279-6491c347a185" containerName="nova-api-api" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.882001 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="ceilometer-central-agent" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.882015 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="proxy-httpd" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.882024 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" containerName="sg-core" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.883853 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.885541 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.887035 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.887206 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.890295 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.890896 4756 scope.go:117] "RemoveContainer" containerID="29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.891602 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72\": container with ID starting with 29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72 not found: ID does not exist" containerID="29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.891641 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72"} err="failed to get container status \"29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72\": rpc error: code = NotFound desc = could not find container \"29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72\": container with ID starting with 29bf87e8fca01fb5785de4a0b4db98694e737902a51fb33f126920a032444d72 not found: ID does not exist" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.891664 4756 scope.go:117] "RemoveContainer" containerID="a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.892069 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd\": container with ID starting with a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd not found: ID does not exist" containerID="a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.892100 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd"} err="failed to get container status \"a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd\": rpc error: code = NotFound desc = could not find container \"a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd\": container with ID starting with a9447c7b06c0b75c3a5a234ad6c382968d7e6d01035f0a6586c3160bbf29f0cd not found: ID does not exist" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.892118 4756 scope.go:117] "RemoveContainer" containerID="ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.892377 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910\": container with ID starting with ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910 not found: ID does not exist" containerID="ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.892395 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910"} err="failed to get container status \"ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910\": rpc error: code = NotFound desc = could not find container \"ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910\": container with ID starting with ef90541027096a0e4d29f34b53ffc464d84d686790dd300792a5cbf9ec303910 not found: ID does not exist" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.892408 4756 scope.go:117] "RemoveContainer" containerID="bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.892764 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f\": container with ID starting with bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f not found: ID does not exist" containerID="bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.892791 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f"} err="failed to get container status \"bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f\": rpc error: code = NotFound desc = could not find container \"bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f\": container with ID starting with bf0eaf9471959ae396651c9b221994acecb0d157435b5c0a1c0004b2ac7a918f not found: ID does not exist" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.892810 4756 scope.go:117] "RemoveContainer" containerID="465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.904518 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.906238 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.908692 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.908734 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.908966 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.916186 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.941933 4756 scope.go:117] "RemoveContainer" containerID="7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964063 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964449 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-config-data\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjbdg\" (UniqueName: \"kubernetes.io/projected/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-kube-api-access-fjbdg\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964579 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964603 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964634 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-scripts\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 
16:40:05.964654 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-run-httpd\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964689 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-logs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964709 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-config-data\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964739 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964758 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-log-httpd\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964785 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.964805 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzkkt\" (UniqueName: \"kubernetes.io/projected/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-kube-api-access-gzkkt\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.979269 4756 scope.go:117] "RemoveContainer" containerID="465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.979704 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197\": container with ID starting with 465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197 not found: ID does not exist" containerID="465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.979796 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197"} err="failed to get container status \"465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197\": rpc error: code = 
NotFound desc = could not find container \"465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197\": container with ID starting with 465b915d062d6a4d317dadf8821ed947a8649552494adf53bbfe799458194197 not found: ID does not exist" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.979885 4756 scope.go:117] "RemoveContainer" containerID="7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770" Dec 05 16:40:05 crc kubenswrapper[4756]: E1205 16:40:05.980314 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770\": container with ID starting with 7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770 not found: ID does not exist" containerID="7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770" Dec 05 16:40:05 crc kubenswrapper[4756]: I1205 16:40:05.980404 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770"} err="failed to get container status \"7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770\": rpc error: code = NotFound desc = could not find container \"7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770\": container with ID starting with 7e6818144d87d7a778fa8ac3ca773848791d2011a19b29598b18114852b5e770 not found: ID does not exist" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.001375 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gqdnn"] Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.002999 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.007059 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.007397 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.009657 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gqdnn"] Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.062560 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a4430d3-543d-4341-a279-6491c347a185" path="/var/lib/kubelet/pods/0a4430d3-543d-4341-a279-6491c347a185/volumes" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.063323 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fc9575-c077-4bff-8879-53e06fe4fa2b" path="/var/lib/kubelet/pods/95fc9575-c077-4bff-8879-53e06fe4fa2b/volumes" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.066653 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.066700 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjbdg\" (UniqueName: \"kubernetes.io/projected/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-kube-api-access-fjbdg\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " 
pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.066754 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjvkv\" (UniqueName: \"kubernetes.io/projected/b4a1ee9f-9806-472e-8630-a8312f0c3b99-kube-api-access-bjvkv\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.066781 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-scripts\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.066816 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.066883 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.066965 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-scripts\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067010 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-run-httpd\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067064 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-logs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067091 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-config-data\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067127 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-config-data\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067168 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067192 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-log-httpd\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067226 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067259 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzkkt\" (UniqueName: \"kubernetes.io/projected/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-kube-api-access-gzkkt\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067290 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067330 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067395 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-config-data\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.067872 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-logs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.068441 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-run-httpd\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.068878 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-log-httpd\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.071487 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.071761 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.072229 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-scripts\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.073078 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.073130 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.073215 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-config-data\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.074200 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.076639 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-config-data\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.089360 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.090264 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjbdg\" (UniqueName: \"kubernetes.io/projected/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-kube-api-access-fjbdg\") pod \"ceilometer-0\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.092087 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzkkt\" (UniqueName: 
\"kubernetes.io/projected/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-kube-api-access-gzkkt\") pod \"nova-api-0\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.169436 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvkv\" (UniqueName: \"kubernetes.io/projected/b4a1ee9f-9806-472e-8630-a8312f0c3b99-kube-api-access-bjvkv\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.169482 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-scripts\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.169560 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-config-data\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.169602 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.173401 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.173682 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-scripts\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.174512 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-config-data\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.188992 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjvkv\" (UniqueName: \"kubernetes.io/projected/b4a1ee9f-9806-472e-8630-a8312f0c3b99-kube-api-access-bjvkv\") pod \"nova-cell1-cell-mapping-gqdnn\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.211224 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.247443 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.329925 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.721130 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.787280 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:06 crc kubenswrapper[4756]: W1205 16:40:06.788858 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda50a0b2e_8847_451d_bd97_220d4a2dfb8f.slice/crio-245b552170a4b7d74281f67c7eadc8b6923d7a8d5fce91034dc261d923e6ff4e WatchSource:0}: Error finding container 245b552170a4b7d74281f67c7eadc8b6923d7a8d5fce91034dc261d923e6ff4e: Status 404 returned error can't find the container with id 245b552170a4b7d74281f67c7eadc8b6923d7a8d5fce91034dc261d923e6ff4e Dec 05 16:40:06 crc kubenswrapper[4756]: I1205 16:40:06.861188 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gqdnn"] Dec 05 16:40:06 crc kubenswrapper[4756]: W1205 16:40:06.865161 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4a1ee9f_9806_472e_8630_a8312f0c3b99.slice/crio-c7204a7e9db0dff83ae8aec8f05448b0c1a1f9347879930a271e4063347c689c WatchSource:0}: Error finding container c7204a7e9db0dff83ae8aec8f05448b0c1a1f9347879930a271e4063347c689c: Status 404 returned error can't find the container with id c7204a7e9db0dff83ae8aec8f05448b0c1a1f9347879930a271e4063347c689c Dec 05 16:40:07 crc kubenswrapper[4756]: I1205 16:40:07.705326 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a50a0b2e-8847-451d-bd97-220d4a2dfb8f","Type":"ContainerStarted","Data":"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41"} Dec 05 16:40:07 crc kubenswrapper[4756]: I1205 16:40:07.706461 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a50a0b2e-8847-451d-bd97-220d4a2dfb8f","Type":"ContainerStarted","Data":"245b552170a4b7d74281f67c7eadc8b6923d7a8d5fce91034dc261d923e6ff4e"} Dec 05 16:40:07 crc kubenswrapper[4756]: I1205 16:40:07.707306 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gqdnn" event={"ID":"b4a1ee9f-9806-472e-8630-a8312f0c3b99","Type":"ContainerStarted","Data":"81f68066e72188559d31bb8d4837587fd7973789d7a3b9234955e55c86d39ce8"} Dec 05 16:40:07 crc kubenswrapper[4756]: I1205 16:40:07.707341 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gqdnn" event={"ID":"b4a1ee9f-9806-472e-8630-a8312f0c3b99","Type":"ContainerStarted","Data":"c7204a7e9db0dff83ae8aec8f05448b0c1a1f9347879930a271e4063347c689c"} Dec 05 16:40:07 crc kubenswrapper[4756]: I1205 16:40:07.709227 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerStarted","Data":"319f4b60b71d3161e7bb47af4784820828daf92e7af680ffa21449636e7ee3b8"} Dec 05 16:40:07 crc kubenswrapper[4756]: I1205 16:40:07.725635 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gqdnn" podStartSLOduration=2.7256158839999998 
podStartE2EDuration="2.725615884s" podCreationTimestamp="2025-12-05 16:40:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:40:07.721998646 +0000 UTC m=+1467.960672168" watchObservedRunningTime="2025-12-05 16:40:07.725615884 +0000 UTC m=+1467.964289396" Dec 05 16:40:08 crc kubenswrapper[4756]: I1205 16:40:08.725472 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerStarted","Data":"bfcc2c269977aa6668ab1f24c749b53e9efa8a90bc0debb824bbc6f137672462"} Dec 05 16:40:08 crc kubenswrapper[4756]: I1205 16:40:08.726111 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerStarted","Data":"e3d6463e66e6a8de5357ab6e221c04b2cb882550348104db03b422d53b2565e5"} Dec 05 16:40:08 crc kubenswrapper[4756]: I1205 16:40:08.728403 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a50a0b2e-8847-451d-bd97-220d4a2dfb8f","Type":"ContainerStarted","Data":"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65"} Dec 05 16:40:08 crc kubenswrapper[4756]: I1205 16:40:08.786061 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.786038923 podStartE2EDuration="3.786038923s" podCreationTimestamp="2025-12-05 16:40:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:40:08.774621693 +0000 UTC m=+1469.013295225" watchObservedRunningTime="2025-12-05 16:40:08.786038923 +0000 UTC m=+1469.024712445" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.104075 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.182138 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d75688ddc-j5zx2"] Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.182815 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" podUID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" containerName="dnsmasq-dns" containerID="cri-o://566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b" gracePeriod=10 Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.685886 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.744009 4756 generic.go:334] "Generic (PLEG): container finished" podID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" containerID="566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b" exitCode=0 Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.744125 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.744958 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" event={"ID":"af31e31e-33e5-48d9-8322-efb1f23ba2b0","Type":"ContainerDied","Data":"566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b"} Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.744989 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d75688ddc-j5zx2" event={"ID":"af31e31e-33e5-48d9-8322-efb1f23ba2b0","Type":"ContainerDied","Data":"1b991686bff26b7e4624389ace14acd3a44c0673e9a405b31ea719081697ace7"} Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.745021 4756 scope.go:117] "RemoveContainer" containerID="566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.757346 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-swift-storage-0\") pod \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.757477 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-svc\") pod \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.757593 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-config\") pod \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.757633 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-sb\") pod \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.757684 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-nb\") pod \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.758058 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqxq2\" (UniqueName: \"kubernetes.io/projected/af31e31e-33e5-48d9-8322-efb1f23ba2b0-kube-api-access-nqxq2\") pod \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\" (UID: \"af31e31e-33e5-48d9-8322-efb1f23ba2b0\") " Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.759526 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerStarted","Data":"e74f52a59e270b006a1fc5714de7ee321f0ef68eebd6a6ed247cd71177dbdaa8"} Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.778147 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af31e31e-33e5-48d9-8322-efb1f23ba2b0-kube-api-access-nqxq2" (OuterVolumeSpecName: 
"kube-api-access-nqxq2") pod "af31e31e-33e5-48d9-8322-efb1f23ba2b0" (UID: "af31e31e-33e5-48d9-8322-efb1f23ba2b0"). InnerVolumeSpecName "kube-api-access-nqxq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.781643 4756 scope.go:117] "RemoveContainer" containerID="95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.810238 4756 scope.go:117] "RemoveContainer" containerID="566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b" Dec 05 16:40:09 crc kubenswrapper[4756]: E1205 16:40:09.810756 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b\": container with ID starting with 566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b not found: ID does not exist" containerID="566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.810823 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b"} err="failed to get container status \"566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b\": rpc error: code = NotFound desc = could not find container \"566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b\": container with ID starting with 566cf962bfe87ae965ae376ef40f44a93f253e45e7ee49fd1404add6970fbb2b not found: ID does not exist" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.810868 4756 scope.go:117] "RemoveContainer" containerID="95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90" Dec 05 16:40:09 crc kubenswrapper[4756]: E1205 16:40:09.811249 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90\": container with ID starting with 95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90 not found: ID does not exist" containerID="95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.811279 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90"} err="failed to get container status \"95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90\": rpc error: code = NotFound desc = could not find container \"95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90\": container with ID starting with 95b6dbca61e093f7642dbcccea379059fedefbeb2a4bbe66c31f4b1007dafc90 not found: ID does not exist" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.816963 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-config" (OuterVolumeSpecName: "config") pod "af31e31e-33e5-48d9-8322-efb1f23ba2b0" (UID: "af31e31e-33e5-48d9-8322-efb1f23ba2b0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.821594 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af31e31e-33e5-48d9-8322-efb1f23ba2b0" (UID: "af31e31e-33e5-48d9-8322-efb1f23ba2b0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.829121 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af31e31e-33e5-48d9-8322-efb1f23ba2b0" (UID: "af31e31e-33e5-48d9-8322-efb1f23ba2b0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.832857 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af31e31e-33e5-48d9-8322-efb1f23ba2b0" (UID: "af31e31e-33e5-48d9-8322-efb1f23ba2b0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.844793 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af31e31e-33e5-48d9-8322-efb1f23ba2b0" (UID: "af31e31e-33e5-48d9-8322-efb1f23ba2b0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.861402 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.861455 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.861468 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.861485 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.861498 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqxq2\" (UniqueName: \"kubernetes.io/projected/af31e31e-33e5-48d9-8322-efb1f23ba2b0-kube-api-access-nqxq2\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:09 crc kubenswrapper[4756]: I1205 16:40:09.861510 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af31e31e-33e5-48d9-8322-efb1f23ba2b0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:10 crc kubenswrapper[4756]: I1205 16:40:10.112477 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7d75688ddc-j5zx2"] Dec 05 16:40:10 crc kubenswrapper[4756]: I1205 16:40:10.124477 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d75688ddc-j5zx2"] Dec 05 16:40:11 crc kubenswrapper[4756]: I1205 16:40:11.804117 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerStarted","Data":"6299d63206c3fbc25b59d632cb9c2f9bf041eee3b3e097b905b6d1db62b66c82"} Dec 05 16:40:11 crc kubenswrapper[4756]: I1205 16:40:11.804745 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 16:40:11 crc kubenswrapper[4756]: I1205 16:40:11.856700 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.997953634 podStartE2EDuration="6.85668156s" podCreationTimestamp="2025-12-05 16:40:05 +0000 UTC" firstStartedPulling="2025-12-05 16:40:06.726883957 +0000 UTC m=+1466.965557479" lastFinishedPulling="2025-12-05 16:40:10.585611902 +0000 UTC m=+1470.824285405" observedRunningTime="2025-12-05 16:40:11.827232522 +0000 UTC m=+1472.065906054" watchObservedRunningTime="2025-12-05 16:40:11.85668156 +0000 UTC m=+1472.095355062" Dec 05 16:40:12 crc kubenswrapper[4756]: I1205 16:40:12.080516 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" path="/var/lib/kubelet/pods/af31e31e-33e5-48d9-8322-efb1f23ba2b0/volumes" Dec 05 16:40:12 crc kubenswrapper[4756]: I1205 16:40:12.814787 4756 generic.go:334] "Generic (PLEG): container finished" podID="b4a1ee9f-9806-472e-8630-a8312f0c3b99" containerID="81f68066e72188559d31bb8d4837587fd7973789d7a3b9234955e55c86d39ce8" exitCode=0 Dec 05 16:40:12 crc kubenswrapper[4756]: I1205 16:40:12.814859 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gqdnn" event={"ID":"b4a1ee9f-9806-472e-8630-a8312f0c3b99","Type":"ContainerDied","Data":"81f68066e72188559d31bb8d4837587fd7973789d7a3b9234955e55c86d39ce8"} Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.227906 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.355572 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-combined-ca-bundle\") pod \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.355887 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-config-data\") pod \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.356028 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjvkv\" (UniqueName: \"kubernetes.io/projected/b4a1ee9f-9806-472e-8630-a8312f0c3b99-kube-api-access-bjvkv\") pod \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.356157 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-scripts\") pod \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\" (UID: \"b4a1ee9f-9806-472e-8630-a8312f0c3b99\") " Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.360852 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a1ee9f-9806-472e-8630-a8312f0c3b99-kube-api-access-bjvkv" (OuterVolumeSpecName: "kube-api-access-bjvkv") pod "b4a1ee9f-9806-472e-8630-a8312f0c3b99" (UID: "b4a1ee9f-9806-472e-8630-a8312f0c3b99"). InnerVolumeSpecName "kube-api-access-bjvkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.369567 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-scripts" (OuterVolumeSpecName: "scripts") pod "b4a1ee9f-9806-472e-8630-a8312f0c3b99" (UID: "b4a1ee9f-9806-472e-8630-a8312f0c3b99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.387446 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4a1ee9f-9806-472e-8630-a8312f0c3b99" (UID: "b4a1ee9f-9806-472e-8630-a8312f0c3b99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.402764 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-config-data" (OuterVolumeSpecName: "config-data") pod "b4a1ee9f-9806-472e-8630-a8312f0c3b99" (UID: "b4a1ee9f-9806-472e-8630-a8312f0c3b99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.459049 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.459095 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.459116 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjvkv\" (UniqueName: \"kubernetes.io/projected/b4a1ee9f-9806-472e-8630-a8312f0c3b99-kube-api-access-bjvkv\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.459134 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a1ee9f-9806-472e-8630-a8312f0c3b99-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.831741 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gqdnn" event={"ID":"b4a1ee9f-9806-472e-8630-a8312f0c3b99","Type":"ContainerDied","Data":"c7204a7e9db0dff83ae8aec8f05448b0c1a1f9347879930a271e4063347c689c"} Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.832080 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7204a7e9db0dff83ae8aec8f05448b0c1a1f9347879930a271e4063347c689c" Dec 05 16:40:14 crc kubenswrapper[4756]: I1205 16:40:14.831787 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gqdnn" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.062076 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.063489 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerName="nova-api-log" containerID="cri-o://e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41" gracePeriod=30 Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.064011 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerName="nova-api-api" containerID="cri-o://b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65" gracePeriod=30 Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.079716 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.079970 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2011aac5-dbbc-4244-aab4-2bb2abc25cee" containerName="nova-scheduler-scheduler" containerID="cri-o://81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c" gracePeriod=30 Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.095994 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.096307 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" 
containerName="nova-metadata-log" containerID="cri-o://f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035" gracePeriod=30 Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.096432 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-metadata" containerID="cri-o://473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a" gracePeriod=30 Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.680724 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.791920 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzkkt\" (UniqueName: \"kubernetes.io/projected/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-kube-api-access-gzkkt\") pod \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.793026 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-internal-tls-certs\") pod \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.793072 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-logs\") pod \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.793123 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-config-data\") pod \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.793145 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-public-tls-certs\") pod \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.793271 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-combined-ca-bundle\") pod \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\" (UID: \"a50a0b2e-8847-451d-bd97-220d4a2dfb8f\") " Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.793703 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-logs" (OuterVolumeSpecName: "logs") pod "a50a0b2e-8847-451d-bd97-220d4a2dfb8f" (UID: "a50a0b2e-8847-451d-bd97-220d4a2dfb8f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.794389 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.799683 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-kube-api-access-gzkkt" (OuterVolumeSpecName: "kube-api-access-gzkkt") pod "a50a0b2e-8847-451d-bd97-220d4a2dfb8f" (UID: "a50a0b2e-8847-451d-bd97-220d4a2dfb8f"). InnerVolumeSpecName "kube-api-access-gzkkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.823612 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a50a0b2e-8847-451d-bd97-220d4a2dfb8f" (UID: "a50a0b2e-8847-451d-bd97-220d4a2dfb8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.827590 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-config-data" (OuterVolumeSpecName: "config-data") pod "a50a0b2e-8847-451d-bd97-220d4a2dfb8f" (UID: "a50a0b2e-8847-451d-bd97-220d4a2dfb8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.843983 4756 generic.go:334] "Generic (PLEG): container finished" podID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerID="b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65" exitCode=0 Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.844012 4756 generic.go:334] "Generic (PLEG): container finished" podID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerID="e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41" exitCode=143 Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.844056 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a50a0b2e-8847-451d-bd97-220d4a2dfb8f","Type":"ContainerDied","Data":"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65"} Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.844088 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a50a0b2e-8847-451d-bd97-220d4a2dfb8f","Type":"ContainerDied","Data":"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41"} Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.844108 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a50a0b2e-8847-451d-bd97-220d4a2dfb8f","Type":"ContainerDied","Data":"245b552170a4b7d74281f67c7eadc8b6923d7a8d5fce91034dc261d923e6ff4e"} Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.844129 4756 scope.go:117] "RemoveContainer" containerID="b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.844270 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.848038 4756 generic.go:334] "Generic (PLEG): container finished" podID="40355c38-2e19-4399-ad0e-69986aca4afc" containerID="f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035" exitCode=143 Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.848062 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40355c38-2e19-4399-ad0e-69986aca4afc","Type":"ContainerDied","Data":"f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035"} Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.869669 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a50a0b2e-8847-451d-bd97-220d4a2dfb8f" (UID: "a50a0b2e-8847-451d-bd97-220d4a2dfb8f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.876756 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a50a0b2e-8847-451d-bd97-220d4a2dfb8f" (UID: "a50a0b2e-8847-451d-bd97-220d4a2dfb8f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.879058 4756 scope.go:117] "RemoveContainer" containerID="e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.905511 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzkkt\" (UniqueName: \"kubernetes.io/projected/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-kube-api-access-gzkkt\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.905556 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.905565 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.905573 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.905583 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50a0b2e-8847-451d-bd97-220d4a2dfb8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.907719 4756 scope.go:117] "RemoveContainer" containerID="b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65" Dec 05 16:40:15 crc kubenswrapper[4756]: E1205 16:40:15.908291 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65\": container with ID starting with 
b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65 not found: ID does not exist" containerID="b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.908353 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65"} err="failed to get container status \"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65\": rpc error: code = NotFound desc = could not find container \"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65\": container with ID starting with b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65 not found: ID does not exist" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.908388 4756 scope.go:117] "RemoveContainer" containerID="e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41" Dec 05 16:40:15 crc kubenswrapper[4756]: E1205 16:40:15.908778 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41\": container with ID starting with e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41 not found: ID does not exist" containerID="e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.908817 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41"} err="failed to get container status \"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41\": rpc error: code = NotFound desc = could not find container \"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41\": container with ID starting with e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41 not found: ID does not exist" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.908852 4756 scope.go:117] "RemoveContainer" containerID="b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.909160 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65"} err="failed to get container status \"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65\": rpc error: code = NotFound desc = could not find container \"b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65\": container with ID starting with b7aa356230d2c5f12fc1c20407d3c9f15dda173d9ae1bbdd1ad73da9692fac65 not found: ID does not exist" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.909178 4756 scope.go:117] "RemoveContainer" containerID="e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41" Dec 05 16:40:15 crc kubenswrapper[4756]: I1205 16:40:15.909645 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41"} err="failed to get container status \"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41\": rpc error: code = NotFound desc = could not find container \"e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41\": container with ID starting with e041d677e7a87aac87b265e6246e3bbf6fcea3e3cb75aa2313ddf55bf0e1df41 not found: ID does not exist" Dec 05 
16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.168788 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.178103 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.191684 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:16 crc kubenswrapper[4756]: E1205 16:40:16.192270 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerName="nova-api-api" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.194466 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerName="nova-api-api" Dec 05 16:40:16 crc kubenswrapper[4756]: E1205 16:40:16.194547 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerName="nova-api-log" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.194600 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerName="nova-api-log" Dec 05 16:40:16 crc kubenswrapper[4756]: E1205 16:40:16.194666 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" containerName="dnsmasq-dns" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.194744 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" containerName="dnsmasq-dns" Dec 05 16:40:16 crc kubenswrapper[4756]: E1205 16:40:16.194807 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" containerName="init" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.194858 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" containerName="init" Dec 05 16:40:16 crc kubenswrapper[4756]: E1205 16:40:16.194950 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a1ee9f-9806-472e-8630-a8312f0c3b99" containerName="nova-manage" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.195009 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a1ee9f-9806-472e-8630-a8312f0c3b99" containerName="nova-manage" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.195339 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a1ee9f-9806-472e-8630-a8312f0c3b99" containerName="nova-manage" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.195418 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerName="nova-api-api" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.195505 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" containerName="nova-api-log" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.195566 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="af31e31e-33e5-48d9-8322-efb1f23ba2b0" containerName="dnsmasq-dns" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.196591 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.199521 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.199852 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.200003 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.202212 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.313304 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.313720 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.313762 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-config-data\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.313804 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgbg6\" (UniqueName: \"kubernetes.io/projected/897f63f9-bf8d-438b-b35a-4195963786e3-kube-api-access-fgbg6\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.313864 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-public-tls-certs\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.313918 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/897f63f9-bf8d-438b-b35a-4195963786e3-logs\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.415735 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-public-tls-certs\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.415828 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/897f63f9-bf8d-438b-b35a-4195963786e3-logs\") pod \"nova-api-0\" (UID: 
\"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.415892 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.415918 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.415970 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-config-data\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.416032 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgbg6\" (UniqueName: \"kubernetes.io/projected/897f63f9-bf8d-438b-b35a-4195963786e3-kube-api-access-fgbg6\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.417408 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/897f63f9-bf8d-438b-b35a-4195963786e3-logs\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.421849 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.422368 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.422559 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-config-data\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.430805 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-public-tls-certs\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.438225 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgbg6\" (UniqueName: \"kubernetes.io/projected/897f63f9-bf8d-438b-b35a-4195963786e3-kube-api-access-fgbg6\") pod \"nova-api-0\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") " 
pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.660893 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:40:16 crc kubenswrapper[4756]: I1205 16:40:16.686886 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.822404 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2pmf\" (UniqueName: \"kubernetes.io/projected/2011aac5-dbbc-4244-aab4-2bb2abc25cee-kube-api-access-q2pmf\") pod \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.823030 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-config-data\") pod \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.823088 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-combined-ca-bundle\") pod \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\" (UID: \"2011aac5-dbbc-4244-aab4-2bb2abc25cee\") " Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.826602 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2011aac5-dbbc-4244-aab4-2bb2abc25cee-kube-api-access-q2pmf" (OuterVolumeSpecName: "kube-api-access-q2pmf") pod "2011aac5-dbbc-4244-aab4-2bb2abc25cee" (UID: "2011aac5-dbbc-4244-aab4-2bb2abc25cee"). InnerVolumeSpecName "kube-api-access-q2pmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.860222 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2011aac5-dbbc-4244-aab4-2bb2abc25cee" (UID: "2011aac5-dbbc-4244-aab4-2bb2abc25cee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.860663 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-config-data" (OuterVolumeSpecName: "config-data") pod "2011aac5-dbbc-4244-aab4-2bb2abc25cee" (UID: "2011aac5-dbbc-4244-aab4-2bb2abc25cee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.874323 4756 generic.go:334] "Generic (PLEG): container finished" podID="2011aac5-dbbc-4244-aab4-2bb2abc25cee" containerID="81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c" exitCode=0 Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.874407 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2011aac5-dbbc-4244-aab4-2bb2abc25cee","Type":"ContainerDied","Data":"81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c"} Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.874450 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2011aac5-dbbc-4244-aab4-2bb2abc25cee","Type":"ContainerDied","Data":"51ca394c9a2b9ab069cd2fd5f332a92e0806e6d648caecb792e21e515c85ebc9"} Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.874467 4756 scope.go:117] "RemoveContainer" containerID="81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.874572 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.917600 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.917804 4756 scope.go:117] "RemoveContainer" containerID="81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c" Dec 05 16:40:17 crc kubenswrapper[4756]: E1205 16:40:16.924777 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c\": container with ID starting with 81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c not found: ID does not exist" containerID="81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.924832 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c"} err="failed to get container status \"81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c\": rpc error: code = NotFound desc = could not find container \"81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c\": container with ID starting with 81cf6379a435061d61a85eaca4c548ecd8a03141a920e8b4cbeae7f9adc9169c not found: ID does not exist" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.930460 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.930509 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2011aac5-dbbc-4244-aab4-2bb2abc25cee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.930528 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2pmf\" (UniqueName: \"kubernetes.io/projected/2011aac5-dbbc-4244-aab4-2bb2abc25cee-kube-api-access-q2pmf\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.944760 4756 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.956589 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:40:17 crc kubenswrapper[4756]: E1205 16:40:16.957113 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2011aac5-dbbc-4244-aab4-2bb2abc25cee" containerName="nova-scheduler-scheduler" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.957131 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2011aac5-dbbc-4244-aab4-2bb2abc25cee" containerName="nova-scheduler-scheduler" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.957352 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2011aac5-dbbc-4244-aab4-2bb2abc25cee" containerName="nova-scheduler-scheduler" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.958190 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.960137 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:16.966003 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.032500 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-config-data\") pod \"nova-scheduler-0\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.032933 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.033014 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crs8q\" (UniqueName: \"kubernetes.io/projected/c9451d43-4e5f-429b-b364-9220afe19523-kube-api-access-crs8q\") pod \"nova-scheduler-0\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.134700 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.134754 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crs8q\" (UniqueName: \"kubernetes.io/projected/c9451d43-4e5f-429b-b364-9220afe19523-kube-api-access-crs8q\") pod \"nova-scheduler-0\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.134819 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-config-data\") pod \"nova-scheduler-0\" (UID: 
\"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.141197 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-config-data\") pod \"nova-scheduler-0\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.141303 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.167617 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crs8q\" (UniqueName: \"kubernetes.io/projected/c9451d43-4e5f-429b-b364-9220afe19523-kube-api-access-crs8q\") pod \"nova-scheduler-0\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") " pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.279575 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:40:17 crc kubenswrapper[4756]: I1205 16:40:17.977575 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.074061 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2011aac5-dbbc-4244-aab4-2bb2abc25cee" path="/var/lib/kubelet/pods/2011aac5-dbbc-4244-aab4-2bb2abc25cee/volumes" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.074922 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a50a0b2e-8847-451d-bd97-220d4a2dfb8f" path="/var/lib/kubelet/pods/a50a0b2e-8847-451d-bd97-220d4a2dfb8f/volumes" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.086277 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.233250 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:51484->10.217.0.191:8775: read: connection reset by peer" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.233665 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:51478->10.217.0.191:8775: read: connection reset by peer" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.690132 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.776908 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-combined-ca-bundle\") pod \"40355c38-2e19-4399-ad0e-69986aca4afc\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.777261 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40355c38-2e19-4399-ad0e-69986aca4afc-logs\") pod \"40355c38-2e19-4399-ad0e-69986aca4afc\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.777308 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-config-data\") pod \"40355c38-2e19-4399-ad0e-69986aca4afc\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.777370 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-nova-metadata-tls-certs\") pod \"40355c38-2e19-4399-ad0e-69986aca4afc\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.778737 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mmsw\" (UniqueName: \"kubernetes.io/projected/40355c38-2e19-4399-ad0e-69986aca4afc-kube-api-access-5mmsw\") pod \"40355c38-2e19-4399-ad0e-69986aca4afc\" (UID: \"40355c38-2e19-4399-ad0e-69986aca4afc\") " Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.779011 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40355c38-2e19-4399-ad0e-69986aca4afc-logs" (OuterVolumeSpecName: "logs") pod "40355c38-2e19-4399-ad0e-69986aca4afc" (UID: "40355c38-2e19-4399-ad0e-69986aca4afc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.779461 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40355c38-2e19-4399-ad0e-69986aca4afc-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.783073 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40355c38-2e19-4399-ad0e-69986aca4afc-kube-api-access-5mmsw" (OuterVolumeSpecName: "kube-api-access-5mmsw") pod "40355c38-2e19-4399-ad0e-69986aca4afc" (UID: "40355c38-2e19-4399-ad0e-69986aca4afc"). InnerVolumeSpecName "kube-api-access-5mmsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.831716 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-config-data" (OuterVolumeSpecName: "config-data") pod "40355c38-2e19-4399-ad0e-69986aca4afc" (UID: "40355c38-2e19-4399-ad0e-69986aca4afc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.840571 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40355c38-2e19-4399-ad0e-69986aca4afc" (UID: "40355c38-2e19-4399-ad0e-69986aca4afc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.854549 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "40355c38-2e19-4399-ad0e-69986aca4afc" (UID: "40355c38-2e19-4399-ad0e-69986aca4afc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.881570 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.881604 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mmsw\" (UniqueName: \"kubernetes.io/projected/40355c38-2e19-4399-ad0e-69986aca4afc-kube-api-access-5mmsw\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.881615 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.881624 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40355c38-2e19-4399-ad0e-69986aca4afc-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.912127 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"897f63f9-bf8d-438b-b35a-4195963786e3","Type":"ContainerStarted","Data":"04391e895ac7c022c16d69dfd03bf0f8cab399a37640b4aebf07a28dfbf58d41"} Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.912171 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"897f63f9-bf8d-438b-b35a-4195963786e3","Type":"ContainerStarted","Data":"8f9e222e4bcb63d553ca65dc2988c4b494ff464ecf438d7b80744af7ca09bdb8"} Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.912181 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"897f63f9-bf8d-438b-b35a-4195963786e3","Type":"ContainerStarted","Data":"e004ca7a7bc225259c07efb09913e11e606e4a57045d3e66620ff0267e37e848"} Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.915415 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c9451d43-4e5f-429b-b364-9220afe19523","Type":"ContainerStarted","Data":"52f8fd30a92226261f0f4bd124f526c8082068adbdd24e6aad126c08d81c0f83"} Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.915505 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"c9451d43-4e5f-429b-b364-9220afe19523","Type":"ContainerStarted","Data":"7906d615576514a0ae9f355934c433cf342a6577016c205c7bf710d3a0b10651"} Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.917145 4756 generic.go:334] "Generic (PLEG): container finished" podID="40355c38-2e19-4399-ad0e-69986aca4afc" containerID="473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a" exitCode=0 Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.917170 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40355c38-2e19-4399-ad0e-69986aca4afc","Type":"ContainerDied","Data":"473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a"} Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.917185 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40355c38-2e19-4399-ad0e-69986aca4afc","Type":"ContainerDied","Data":"a37b65e3b140865fce4637d73f03497ef747b7bcd1c6cffc82ad83486581f66c"} Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.917200 4756 scope.go:117] "RemoveContainer" containerID="473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.917283 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.944140 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.944125414 podStartE2EDuration="2.944125414s" podCreationTimestamp="2025-12-05 16:40:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:40:18.937571507 +0000 UTC m=+1479.176245019" watchObservedRunningTime="2025-12-05 16:40:18.944125414 +0000 UTC m=+1479.182798926" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.956352 4756 scope.go:117] "RemoveContainer" containerID="f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.976098 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.97608274 podStartE2EDuration="2.97608274s" podCreationTimestamp="2025-12-05 16:40:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:40:18.957564508 +0000 UTC m=+1479.196238020" watchObservedRunningTime="2025-12-05 16:40:18.97608274 +0000 UTC m=+1479.214756252" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.997700 4756 scope.go:117] "RemoveContainer" containerID="473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a" Dec 05 16:40:18 crc kubenswrapper[4756]: E1205 16:40:18.998192 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a\": container with ID starting with 473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a not found: ID does not exist" containerID="473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.998223 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a"} err="failed to get container 
status \"473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a\": rpc error: code = NotFound desc = could not find container \"473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a\": container with ID starting with 473febb21047ae03f0d68e956550c08036534d94ad15979bcf8182b244f07a6a not found: ID does not exist" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.998246 4756 scope.go:117] "RemoveContainer" containerID="f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035" Dec 05 16:40:18 crc kubenswrapper[4756]: E1205 16:40:18.998633 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035\": container with ID starting with f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035 not found: ID does not exist" containerID="f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035" Dec 05 16:40:18 crc kubenswrapper[4756]: I1205 16:40:18.998657 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035"} err="failed to get container status \"f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035\": rpc error: code = NotFound desc = could not find container \"f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035\": container with ID starting with f9c0aa434a8aaa877ac9be36559583185484829d4628945a9968fb907b589035 not found: ID does not exist" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.004576 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.019693 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.029787 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:40:19 crc kubenswrapper[4756]: E1205 16:40:19.030280 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-metadata" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.030306 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-metadata" Dec 05 16:40:19 crc kubenswrapper[4756]: E1205 16:40:19.030340 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-log" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.030350 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-log" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.030611 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-log" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.030649 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" containerName="nova-metadata-metadata" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.031882 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.037165 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.037327 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.037743 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.188038 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-config-data\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.188136 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.188164 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5ww\" (UniqueName: \"kubernetes.io/projected/76057872-026a-4ecb-b628-75f7f6fd46e9-kube-api-access-rv5ww\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.190353 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76057872-026a-4ecb-b628-75f7f6fd46e9-logs\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.190508 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.292748 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.292803 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5ww\" (UniqueName: \"kubernetes.io/projected/76057872-026a-4ecb-b628-75f7f6fd46e9-kube-api-access-rv5ww\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.292900 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76057872-026a-4ecb-b628-75f7f6fd46e9-logs\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " 
pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.292934 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.293004 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-config-data\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.293366 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76057872-026a-4ecb-b628-75f7f6fd46e9-logs\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.297449 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.297777 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.299232 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-config-data\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.310533 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5ww\" (UniqueName: \"kubernetes.io/projected/76057872-026a-4ecb-b628-75f7f6fd46e9-kube-api-access-rv5ww\") pod \"nova-metadata-0\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") " pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.361584 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:40:19 crc kubenswrapper[4756]: W1205 16:40:19.819296 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76057872_026a_4ecb_b628_75f7f6fd46e9.slice/crio-0f2f060c5d7b9d5be07fb6e2bfc8eb42ffc2bbf85eb55036f2e81e10cdb559b5 WatchSource:0}: Error finding container 0f2f060c5d7b9d5be07fb6e2bfc8eb42ffc2bbf85eb55036f2e81e10cdb559b5: Status 404 returned error can't find the container with id 0f2f060c5d7b9d5be07fb6e2bfc8eb42ffc2bbf85eb55036f2e81e10cdb559b5 Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.820736 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:40:19 crc kubenswrapper[4756]: I1205 16:40:19.933032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76057872-026a-4ecb-b628-75f7f6fd46e9","Type":"ContainerStarted","Data":"0f2f060c5d7b9d5be07fb6e2bfc8eb42ffc2bbf85eb55036f2e81e10cdb559b5"} Dec 05 16:40:20 crc kubenswrapper[4756]: I1205 16:40:20.076157 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40355c38-2e19-4399-ad0e-69986aca4afc" path="/var/lib/kubelet/pods/40355c38-2e19-4399-ad0e-69986aca4afc/volumes" Dec 05 16:40:20 crc kubenswrapper[4756]: I1205 16:40:20.974042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76057872-026a-4ecb-b628-75f7f6fd46e9","Type":"ContainerStarted","Data":"fe8069b814e870c4e578733a24b86485d43263b291fd3dcc6e53ead33749cc50"} Dec 05 16:40:20 crc kubenswrapper[4756]: I1205 16:40:20.974644 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76057872-026a-4ecb-b628-75f7f6fd46e9","Type":"ContainerStarted","Data":"8c85479e2fe59de4f67537a2a6d08e3945b963ac89af5c01b49b2eb958a942e5"} Dec 05 16:40:20 crc kubenswrapper[4756]: I1205 16:40:20.999617 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.999589138 podStartE2EDuration="2.999589138s" podCreationTimestamp="2025-12-05 16:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:40:20.988899989 +0000 UTC m=+1481.227573511" watchObservedRunningTime="2025-12-05 16:40:20.999589138 +0000 UTC m=+1481.238262670" Dec 05 16:40:22 crc kubenswrapper[4756]: I1205 16:40:22.280359 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 16:40:24 crc kubenswrapper[4756]: I1205 16:40:24.361758 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 16:40:24 crc kubenswrapper[4756]: I1205 16:40:24.362026 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 16:40:26 crc kubenswrapper[4756]: I1205 16:40:26.661687 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 16:40:26 crc kubenswrapper[4756]: I1205 16:40:26.663329 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 16:40:27 crc kubenswrapper[4756]: I1205 16:40:27.280109 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 16:40:27 crc kubenswrapper[4756]: I1205 16:40:27.305394 4756 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 16:40:27 crc kubenswrapper[4756]: I1205 16:40:27.676717 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 16:40:27 crc kubenswrapper[4756]: I1205 16:40:27.676745 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 16:40:28 crc kubenswrapper[4756]: I1205 16:40:28.078963 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 16:40:29 crc kubenswrapper[4756]: I1205 16:40:29.362410 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 16:40:29 crc kubenswrapper[4756]: I1205 16:40:29.362470 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 16:40:30 crc kubenswrapper[4756]: I1205 16:40:30.374564 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 16:40:30 crc kubenswrapper[4756]: I1205 16:40:30.374586 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.580961 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s2xnd"] Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.588131 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.589815 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2xnd"] Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.720553 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2zhj\" (UniqueName: \"kubernetes.io/projected/6768dbe8-7db3-463a-bf23-69c35db737ed-kube-api-access-r2zhj\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.720942 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-catalog-content\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.721331 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-utilities\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.822670 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-utilities\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.822751 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2zhj\" (UniqueName: \"kubernetes.io/projected/6768dbe8-7db3-463a-bf23-69c35db737ed-kube-api-access-r2zhj\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.822813 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-catalog-content\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.823260 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-utilities\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.823335 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-catalog-content\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.847803 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r2zhj\" (UniqueName: \"kubernetes.io/projected/6768dbe8-7db3-463a-bf23-69c35db737ed-kube-api-access-r2zhj\") pod \"community-operators-s2xnd\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:33 crc kubenswrapper[4756]: I1205 16:40:33.920754 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:34 crc kubenswrapper[4756]: I1205 16:40:34.439386 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2xnd"] Dec 05 16:40:35 crc kubenswrapper[4756]: I1205 16:40:35.127171 4756 generic.go:334] "Generic (PLEG): container finished" podID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerID="9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8" exitCode=0 Dec 05 16:40:35 crc kubenswrapper[4756]: I1205 16:40:35.127218 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2xnd" event={"ID":"6768dbe8-7db3-463a-bf23-69c35db737ed","Type":"ContainerDied","Data":"9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8"} Dec 05 16:40:35 crc kubenswrapper[4756]: I1205 16:40:35.127245 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2xnd" event={"ID":"6768dbe8-7db3-463a-bf23-69c35db737ed","Type":"ContainerStarted","Data":"12e9a77c3cd3ff8d873257fde814bd176fbc9b9272eae5da7bcedf46e9b75d9f"} Dec 05 16:40:36 crc kubenswrapper[4756]: I1205 16:40:36.137997 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2xnd" event={"ID":"6768dbe8-7db3-463a-bf23-69c35db737ed","Type":"ContainerStarted","Data":"057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f"} Dec 05 16:40:36 crc kubenswrapper[4756]: I1205 16:40:36.219361 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 16:40:36 crc kubenswrapper[4756]: I1205 16:40:36.667567 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 16:40:36 crc kubenswrapper[4756]: I1205 16:40:36.668034 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 16:40:36 crc kubenswrapper[4756]: I1205 16:40:36.670562 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 16:40:36 crc kubenswrapper[4756]: I1205 16:40:36.672943 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 16:40:37 crc kubenswrapper[4756]: I1205 16:40:37.164510 4756 generic.go:334] "Generic (PLEG): container finished" podID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerID="057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f" exitCode=0 Dec 05 16:40:37 crc kubenswrapper[4756]: I1205 16:40:37.165380 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2xnd" event={"ID":"6768dbe8-7db3-463a-bf23-69c35db737ed","Type":"ContainerDied","Data":"057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f"} Dec 05 16:40:37 crc kubenswrapper[4756]: I1205 16:40:37.165586 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 16:40:37 crc kubenswrapper[4756]: I1205 16:40:37.176680 4756 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 16:40:38 crc kubenswrapper[4756]: I1205 16:40:38.175169 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2xnd" event={"ID":"6768dbe8-7db3-463a-bf23-69c35db737ed","Type":"ContainerStarted","Data":"6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b"} Dec 05 16:40:38 crc kubenswrapper[4756]: I1205 16:40:38.208181 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s2xnd" podStartSLOduration=2.754684015 podStartE2EDuration="5.208162705s" podCreationTimestamp="2025-12-05 16:40:33 +0000 UTC" firstStartedPulling="2025-12-05 16:40:35.129117599 +0000 UTC m=+1495.367791111" lastFinishedPulling="2025-12-05 16:40:37.582596289 +0000 UTC m=+1497.821269801" observedRunningTime="2025-12-05 16:40:38.196121727 +0000 UTC m=+1498.434795249" watchObservedRunningTime="2025-12-05 16:40:38.208162705 +0000 UTC m=+1498.446836217" Dec 05 16:40:39 crc kubenswrapper[4756]: I1205 16:40:39.367004 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 16:40:39 crc kubenswrapper[4756]: I1205 16:40:39.377989 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 16:40:39 crc kubenswrapper[4756]: I1205 16:40:39.406077 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 16:40:40 crc kubenswrapper[4756]: I1205 16:40:40.201041 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 16:40:43 crc kubenswrapper[4756]: I1205 16:40:43.921818 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:43 crc kubenswrapper[4756]: I1205 16:40:43.922370 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:43 crc kubenswrapper[4756]: I1205 16:40:43.985460 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:44 crc kubenswrapper[4756]: I1205 16:40:44.275713 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:44 crc kubenswrapper[4756]: I1205 16:40:44.322282 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2xnd"] Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.262534 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s2xnd" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerName="registry-server" containerID="cri-o://6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b" gracePeriod=2 Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.793175 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.880699 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-catalog-content\") pod \"6768dbe8-7db3-463a-bf23-69c35db737ed\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.880812 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-utilities\") pod \"6768dbe8-7db3-463a-bf23-69c35db737ed\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.880881 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2zhj\" (UniqueName: \"kubernetes.io/projected/6768dbe8-7db3-463a-bf23-69c35db737ed-kube-api-access-r2zhj\") pod \"6768dbe8-7db3-463a-bf23-69c35db737ed\" (UID: \"6768dbe8-7db3-463a-bf23-69c35db737ed\") " Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.881833 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-utilities" (OuterVolumeSpecName: "utilities") pod "6768dbe8-7db3-463a-bf23-69c35db737ed" (UID: "6768dbe8-7db3-463a-bf23-69c35db737ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.888752 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6768dbe8-7db3-463a-bf23-69c35db737ed-kube-api-access-r2zhj" (OuterVolumeSpecName: "kube-api-access-r2zhj") pod "6768dbe8-7db3-463a-bf23-69c35db737ed" (UID: "6768dbe8-7db3-463a-bf23-69c35db737ed"). InnerVolumeSpecName "kube-api-access-r2zhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.934511 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6768dbe8-7db3-463a-bf23-69c35db737ed" (UID: "6768dbe8-7db3-463a-bf23-69c35db737ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.983105 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.983150 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2zhj\" (UniqueName: \"kubernetes.io/projected/6768dbe8-7db3-463a-bf23-69c35db737ed-kube-api-access-r2zhj\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:46 crc kubenswrapper[4756]: I1205 16:40:46.983164 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6768dbe8-7db3-463a-bf23-69c35db737ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.275071 4756 generic.go:334] "Generic (PLEG): container finished" podID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerID="6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b" exitCode=0 Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.275150 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2xnd" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.275181 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2xnd" event={"ID":"6768dbe8-7db3-463a-bf23-69c35db737ed","Type":"ContainerDied","Data":"6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b"} Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.275468 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2xnd" event={"ID":"6768dbe8-7db3-463a-bf23-69c35db737ed","Type":"ContainerDied","Data":"12e9a77c3cd3ff8d873257fde814bd176fbc9b9272eae5da7bcedf46e9b75d9f"} Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.275495 4756 scope.go:117] "RemoveContainer" containerID="6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.299316 4756 scope.go:117] "RemoveContainer" containerID="057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.321845 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2xnd"] Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.334000 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s2xnd"] Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.345363 4756 scope.go:117] "RemoveContainer" containerID="9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.372226 4756 scope.go:117] "RemoveContainer" containerID="6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b" Dec 05 16:40:47 crc kubenswrapper[4756]: E1205 16:40:47.372625 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b\": container with ID starting with 6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b not found: ID does not exist" containerID="6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.372655 
4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b"} err="failed to get container status \"6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b\": rpc error: code = NotFound desc = could not find container \"6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b\": container with ID starting with 6d4a8f337de670ce3edb6a10cba8951085d07cfaa161d55af5f6659a2d2edd7b not found: ID does not exist" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.372675 4756 scope.go:117] "RemoveContainer" containerID="057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f" Dec 05 16:40:47 crc kubenswrapper[4756]: E1205 16:40:47.373130 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f\": container with ID starting with 057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f not found: ID does not exist" containerID="057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.373162 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f"} err="failed to get container status \"057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f\": rpc error: code = NotFound desc = could not find container \"057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f\": container with ID starting with 057d26cde316f06871e5631b201c4b3a52a8b79dc027ae819dc8882ebffbc57f not found: ID does not exist" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.373182 4756 scope.go:117] "RemoveContainer" containerID="9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8" Dec 05 16:40:47 crc kubenswrapper[4756]: E1205 16:40:47.374795 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8\": container with ID starting with 9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8 not found: ID does not exist" containerID="9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8" Dec 05 16:40:47 crc kubenswrapper[4756]: I1205 16:40:47.374826 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8"} err="failed to get container status \"9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8\": rpc error: code = NotFound desc = could not find container \"9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8\": container with ID starting with 9bf5a6f70f6dc3e314a6051fbffa550642156507b8dd089262eab23ece6d63b8 not found: ID does not exist" Dec 05 16:40:48 crc kubenswrapper[4756]: I1205 16:40:48.061509 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" path="/var/lib/kubelet/pods/6768dbe8-7db3-463a-bf23-69c35db737ed/volumes" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.171295 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.176835 4756 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/openstackclient" podUID="6aa745f2-e987-4c61-84bd-f40224e4a4a0" containerName="openstackclient" containerID="cri-o://7be737e28ed352a4c08f3c99de7ce6f9d006ea35ab29b304739115ae17238871" gracePeriod=2 Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.178933 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.529612 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.530746 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="openstack-network-exporter" containerID="cri-o://85456f1848243765e2d473f760124a27191626f1650d37509a1eb62e5f3138fc" gracePeriod=300 Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.572014 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.693487 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance1a2d-account-delete-vgqdz"] Dec 05 16:41:00 crc kubenswrapper[4756]: E1205 16:41:00.693957 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerName="extract-content" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.693969 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerName="extract-content" Dec 05 16:41:00 crc kubenswrapper[4756]: E1205 16:41:00.693991 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerName="registry-server" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.693997 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerName="registry-server" Dec 05 16:41:00 crc kubenswrapper[4756]: E1205 16:41:00.694020 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa745f2-e987-4c61-84bd-f40224e4a4a0" containerName="openstackclient" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.694027 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa745f2-e987-4c61-84bd-f40224e4a4a0" containerName="openstackclient" Dec 05 16:41:00 crc kubenswrapper[4756]: E1205 16:41:00.694040 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerName="extract-utilities" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.694047 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerName="extract-utilities" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.694260 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa745f2-e987-4c61-84bd-f40224e4a4a0" containerName="openstackclient" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.694278 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6768dbe8-7db3-463a-bf23-69c35db737ed" containerName="registry-server" Dec 05 16:41:00 crc kubenswrapper[4756]: E1205 16:41:00.694586 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 16:41:00 crc kubenswrapper[4756]: E1205 16:41:00.694651 4756 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data podName:331486fe-c403-4516-9568-862200ec0672 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:01.194633407 +0000 UTC m=+1521.433306919 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data") pod "rabbitmq-cell1-server-0" (UID: "331486fe-c403-4516-9568-862200ec0672") : configmap "rabbitmq-cell1-config-data" not found Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.695157 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.706694 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance1a2d-account-delete-vgqdz"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.722410 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="ovsdbserver-nb" containerID="cri-o://e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea" gracePeriod=300 Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.795161 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8865\" (UniqueName: \"kubernetes.io/projected/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-kube-api-access-l8865\") pod \"glance1a2d-account-delete-vgqdz\" (UID: \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\") " pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.795217 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts\") pod \"glance1a2d-account-delete-vgqdz\" (UID: \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\") " pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.832838 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-mvn92"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.891942 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ffbjc"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.902700 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8865\" (UniqueName: \"kubernetes.io/projected/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-kube-api-access-l8865\") pod \"glance1a2d-account-delete-vgqdz\" (UID: \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\") " pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.902762 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts\") pod \"glance1a2d-account-delete-vgqdz\" (UID: \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\") " pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.903476 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts\") pod \"glance1a2d-account-delete-vgqdz\" (UID: \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\") " 
pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.913640 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mvn92"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.942658 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ffbjc"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.990488 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder2230-account-delete-jfjlh"] Dec 05 16:41:00 crc kubenswrapper[4756]: I1205 16:41:00.991754 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.002104 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8865\" (UniqueName: \"kubernetes.io/projected/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-kube-api-access-l8865\") pod \"glance1a2d-account-delete-vgqdz\" (UID: \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\") " pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.029485 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder2230-account-delete-jfjlh"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.058108 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.118860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2e416b-60db-4b09-b585-59911e192daf-operator-scripts\") pod \"cinder2230-account-delete-jfjlh\" (UID: \"6e2e416b-60db-4b09-b585-59911e192daf\") " pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.118932 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnsv6\" (UniqueName: \"kubernetes.io/projected/6e2e416b-60db-4b09-b585-59911e192daf-kube-api-access-wnsv6\") pod \"cinder2230-account-delete-jfjlh\" (UID: \"6e2e416b-60db-4b09-b585-59911e192daf\") " pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.162274 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.179635 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-w47gl"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.217933 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-w47gl"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.223489 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2e416b-60db-4b09-b585-59911e192daf-operator-scripts\") pod \"cinder2230-account-delete-jfjlh\" (UID: \"6e2e416b-60db-4b09-b585-59911e192daf\") " pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.223542 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnsv6\" (UniqueName: \"kubernetes.io/projected/6e2e416b-60db-4b09-b585-59911e192daf-kube-api-access-wnsv6\") pod \"cinder2230-account-delete-jfjlh\" (UID: \"6e2e416b-60db-4b09-b585-59911e192daf\") " 
pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.223734 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.223784 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data podName:331486fe-c403-4516-9568-862200ec0672 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:02.223766815 +0000 UTC m=+1522.462440327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data") pod "rabbitmq-cell1-server-0" (UID: "331486fe-c403-4516-9568-862200ec0672") : configmap "rabbitmq-cell1-config-data" not found Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.224822 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2e416b-60db-4b09-b585-59911e192daf-operator-scripts\") pod \"cinder2230-account-delete-jfjlh\" (UID: \"6e2e416b-60db-4b09-b585-59911e192daf\") " pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.228394 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.228645 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="ovn-northd" containerID="cri-o://1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" gracePeriod=30 Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.229584 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="openstack-network-exporter" containerID="cri-o://3d84600ac906281768c60200071669b5bea31ab8b67b8f4f35c9c9885c5ed8d8" gracePeriod=30 Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.273249 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnsv6\" (UniqueName: \"kubernetes.io/projected/6e2e416b-60db-4b09-b585-59911e192daf-kube-api-access-wnsv6\") pod \"cinder2230-account-delete-jfjlh\" (UID: \"6e2e416b-60db-4b09-b585-59911e192daf\") " pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.296929 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicanc3ab-account-delete-ndr77"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.298181 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.339087 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.339165 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data podName:11d0ad48-c987-4415-8536-b150536534a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:01.839148171 +0000 UTC m=+1522.077821683 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data") pod "rabbitmq-server-0" (UID: "11d0ad48-c987-4415-8536-b150536534a5") : configmap "rabbitmq-config-data" not found Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.366933 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanc3ab-account-delete-ndr77"] Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.371646 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.388670 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.422594 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.422849 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="ovn-northd" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.441655 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts\") pod \"barbicanc3ab-account-delete-ndr77\" (UID: \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\") " pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.441717 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvhcn\" (UniqueName: \"kubernetes.io/projected/9639f2b9-2b61-4ab9-ab94-6c54d373662f-kube-api-access-tvhcn\") pod \"barbicanc3ab-account-delete-ndr77\" (UID: \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\") " pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.459493 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutronbc5a-account-delete-wtsjj"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.460929 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.492660 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4rtgn"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.516806 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-64jk8"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.523980 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3f72429c-8e1e-4c2d-8417-c499c22013a9/ovsdbserver-nb/0.log" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.524028 4756 generic.go:334] "Generic (PLEG): container finished" podID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerID="85456f1848243765e2d473f760124a27191626f1650d37509a1eb62e5f3138fc" exitCode=2 Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.524048 4756 generic.go:334] "Generic (PLEG): container finished" podID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerID="e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea" exitCode=143 Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.524070 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f72429c-8e1e-4c2d-8417-c499c22013a9","Type":"ContainerDied","Data":"85456f1848243765e2d473f760124a27191626f1650d37509a1eb62e5f3138fc"} Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.524096 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f72429c-8e1e-4c2d-8417-c499c22013a9","Type":"ContainerDied","Data":"e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea"} Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.538491 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.543230 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts\") pod \"neutronbc5a-account-delete-wtsjj\" (UID: \"37e233df-e9ab-45c9-a335-f1182a3b8c79\") " pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.543271 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkpmp\" (UniqueName: \"kubernetes.io/projected/37e233df-e9ab-45c9-a335-f1182a3b8c79-kube-api-access-fkpmp\") pod \"neutronbc5a-account-delete-wtsjj\" (UID: \"37e233df-e9ab-45c9-a335-f1182a3b8c79\") " pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.543324 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts\") pod \"barbicanc3ab-account-delete-ndr77\" (UID: \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\") " pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.543355 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvhcn\" (UniqueName: \"kubernetes.io/projected/9639f2b9-2b61-4ab9-ab94-6c54d373662f-kube-api-access-tvhcn\") pod \"barbicanc3ab-account-delete-ndr77\" (UID: \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\") " pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.547602 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts\") pod \"barbicanc3ab-account-delete-ndr77\" (UID: \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\") " pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.559483 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-64jk8"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.576496 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf4ff87b5-954sn"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.576815 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" podUID="c609265d-528c-4c9c-bd1c-b735775d9acc" containerName="dnsmasq-dns" containerID="cri-o://6d5b71ea955f60a17745011df48bb423d4c7c7c82b3f516d2feb518405839e09" gracePeriod=10 Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.649304 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvhcn\" (UniqueName: \"kubernetes.io/projected/9639f2b9-2b61-4ab9-ab94-6c54d373662f-kube-api-access-tvhcn\") pod \"barbicanc3ab-account-delete-ndr77\" (UID: \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\") " pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.673394 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts\") pod \"neutronbc5a-account-delete-wtsjj\" (UID: 
\"37e233df-e9ab-45c9-a335-f1182a3b8c79\") " pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.673487 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkpmp\" (UniqueName: \"kubernetes.io/projected/37e233df-e9ab-45c9-a335-f1182a3b8c79-kube-api-access-fkpmp\") pod \"neutronbc5a-account-delete-wtsjj\" (UID: \"37e233df-e9ab-45c9-a335-f1182a3b8c79\") " pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.676965 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4rtgn"] Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.678181 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.754456 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkpmp\" (UniqueName: \"kubernetes.io/projected/37e233df-e9ab-45c9-a335-f1182a3b8c79-kube-api-access-fkpmp\") pod \"neutronbc5a-account-delete-wtsjj\" (UID: \"37e233df-e9ab-45c9-a335-f1182a3b8c79\") " pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.755858 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts\") pod \"neutronbc5a-account-delete-wtsjj\" (UID: \"37e233df-e9ab-45c9-a335-f1182a3b8c79\") " pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.907170 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 16:41:01 crc kubenswrapper[4756]: E1205 16:41:01.907652 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data podName:11d0ad48-c987-4415-8536-b150536534a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:02.907636142 +0000 UTC m=+1523.146309654 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data") pod "rabbitmq-server-0" (UID: "11d0ad48-c987-4415-8536-b150536534a5") : configmap "rabbitmq-config-data" not found Dec 05 16:41:01 crc kubenswrapper[4756]: I1205 16:41:01.924692 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.024134 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronbc5a-account-delete-wtsjj"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.041359 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vsmjz"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.125958 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01800831-fa81-4f58-9da9-66dcbfe86580" path="/var/lib/kubelet/pods/01800831-fa81-4f58-9da9-66dcbfe86580/volumes" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.127067 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343931b4-805d-4457-a27f-7c680e29a475" path="/var/lib/kubelet/pods/343931b4-805d-4457-a27f-7c680e29a475/volumes" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.156540 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0a57ed-b792-42fb-95df-8a88b426b3b7" path="/var/lib/kubelet/pods/9f0a57ed-b792-42fb-95df-8a88b426b3b7/volumes" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.162012 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b35c2088-a11f-4514-8390-1a1f5ff0ccb5" path="/var/lib/kubelet/pods/b35c2088-a11f-4514-8390-1a1f5ff0ccb5/volumes" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.170729 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf5eba8-7c13-43e6-9777-978152dff744" path="/var/lib/kubelet/pods/dbf5eba8-7c13-43e6-9777-978152dff744/volumes" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.171841 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-kdtg5"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.171878 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement5c6b-account-delete-nhd8d"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.174216 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-bg8df"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.174261 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement5c6b-account-delete-nhd8d"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.174477 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-bg8df" podUID="b9642035-7395-4ead-9da4-4d01f497a80b" containerName="openstack-network-exporter" containerID="cri-o://aaa563c0b40764c0748232c6188d4ab9d2d109f170578ae13acab5b936dcc5c5" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.174653 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.185184 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.185649 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="openstack-network-exporter" containerID="cri-o://552b70f0a07dfc0a35589ca2e4b524aad06f3b481cf9fd381f8830f3e4bb7734" gracePeriod=300 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.232561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v59d\" (UniqueName: \"kubernetes.io/projected/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-kube-api-access-6v59d\") pod \"placement5c6b-account-delete-nhd8d\" (UID: \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\") " pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.232703 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts\") pod \"placement5c6b-account-delete-nhd8d\" (UID: \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\") " pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.233127 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.233178 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data podName:331486fe-c403-4516-9568-862200ec0672 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:04.233160398 +0000 UTC m=+1524.471833910 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data") pod "rabbitmq-cell1-server-0" (UID: "331486fe-c403-4516-9568-862200ec0672") : configmap "rabbitmq-cell1-config-data" not found Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.246828 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gqdnn"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.281001 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.281278 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerName="cinder-scheduler" containerID="cri-o://1c001a9c870ca4bb31c57a0eaad2bee3efc8b47aef90bc4502a5337603659d8b" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.281615 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerName="probe" containerID="cri-o://a4403245fba51bc317b8d7897b375b1d195288a8e13b1478f5ed78542abf0293" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.334694 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v59d\" (UniqueName: \"kubernetes.io/projected/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-kube-api-access-6v59d\") pod \"placement5c6b-account-delete-nhd8d\" (UID: \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\") " pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.334826 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts\") pod \"placement5c6b-account-delete-nhd8d\" (UID: \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\") " pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.335947 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts\") pod \"placement5c6b-account-delete-nhd8d\" (UID: \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\") " pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.364109 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gqdnn"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.389788 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vs2mj"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.429213 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vs2mj"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.430998 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v59d\" (UniqueName: \"kubernetes.io/projected/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-kube-api-access-6v59d\") pod \"placement5c6b-account-delete-nhd8d\" (UID: \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\") " pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.440430 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.440680 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerName="glance-log" containerID="cri-o://ff50efe465d0847be81b8c502e77c5ea6e01d546de46512db17a70f058fa77b7" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.440932 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerName="glance-httpd" containerID="cri-o://ee842d9afcdec61635c6c49c87079fe579d522f6aece80d8d0ea7d52afa84d53" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.485172 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell16b06-account-delete-q6cdw"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.486685 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.530842 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0bc19-account-delete-xlr4r"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.532111 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.541289 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts\") pod \"novacell16b06-account-delete-q6cdw\" (UID: \"1dabf9eb-5097-4d26-b64c-00938b0638f6\") " pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.547955 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7dk\" (UniqueName: \"kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk\") pod \"novacell16b06-account-delete-q6cdw\" (UID: \"1dabf9eb-5097-4d26-b64c-00938b0638f6\") " pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.544001 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell16b06-account-delete-q6cdw"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.564621 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.574998 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerName="glance-log" containerID="cri-o://25c6d91b70e3146a820e5c52b8f074d3f3bfefc49cb9b80d00c7a70aae08ff55" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.575523 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerName="glance-httpd" containerID="cri-o://8e8b38530a288e65246e8e282b7de364102d9ead1db22ed0d2d5b5a3b7cad231" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.632465 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/novaapi58bc-account-delete-cdg92"] Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.633207 4756 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-vsmjz" message="Exiting ovn-controller (1) " Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.633245 4756 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-vsmjz" podUID="359e87f3-8fb4-4476-a538-13c948b65885" containerName="ovn-controller" containerID="cri-o://ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.633755 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-vsmjz" podUID="359e87f3-8fb4-4476-a538-13c948b65885" containerName="ovn-controller" containerID="cri-o://ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.634202 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.649336 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kflx7\" (UniqueName: \"kubernetes.io/projected/cc4b1e45-87c0-471f-beb9-70bce6933d47-kube-api-access-kflx7\") pod \"novacell0bc19-account-delete-xlr4r\" (UID: \"cc4b1e45-87c0-471f-beb9-70bce6933d47\") " pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.649442 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7dk\" (UniqueName: \"kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk\") pod \"novacell16b06-account-delete-q6cdw\" (UID: \"1dabf9eb-5097-4d26-b64c-00938b0638f6\") " pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.649512 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts\") pod \"novacell16b06-account-delete-q6cdw\" (UID: \"1dabf9eb-5097-4d26-b64c-00938b0638f6\") " pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.649613 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc4b1e45-87c0-471f-beb9-70bce6933d47-operator-scripts\") pod \"novacell0bc19-account-delete-xlr4r\" (UID: \"cc4b1e45-87c0-471f-beb9-70bce6933d47\") " pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.650083 4756 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.650124 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts podName:1dabf9eb-5097-4d26-b64c-00938b0638f6 nodeName:}" failed. 
No retries permitted until 2025-12-05 16:41:03.150109768 +0000 UTC m=+1523.388783280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts") pod "novacell16b06-account-delete-q6cdw" (UID: "1dabf9eb-5097-4d26-b64c-00938b0638f6") : configmap "openstack-cell1-scripts" not found Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.660469 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0bc19-account-delete-xlr4r"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.661892 4756 generic.go:334] "Generic (PLEG): container finished" podID="c609265d-528c-4c9c-bd1c-b735775d9acc" containerID="6d5b71ea955f60a17745011df48bb423d4c7c7c82b3f516d2feb518405839e09" exitCode=0 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.661977 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" event={"ID":"c609265d-528c-4c9c-bd1c-b735775d9acc","Type":"ContainerDied","Data":"6d5b71ea955f60a17745011df48bb423d4c7c7c82b3f516d2feb518405839e09"} Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.667717 4756 projected.go:194] Error preparing data for projected volume kube-api-access-jb7dk for pod openstack/novacell16b06-account-delete-q6cdw: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.667776 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk podName:1dabf9eb-5097-4d26-b64c-00938b0638f6 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:03.167756719 +0000 UTC m=+1523.406430231 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jb7dk" (UniqueName: "kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk") pod "novacell16b06-account-delete-q6cdw" (UID: "1dabf9eb-5097-4d26-b64c-00938b0638f6") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.676043 4756 generic.go:334] "Generic (PLEG): container finished" podID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerID="552b70f0a07dfc0a35589ca2e4b524aad06f3b481cf9fd381f8830f3e4bb7734" exitCode=2 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.676103 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"930a573c-21eb-46e3-827b-2ff167eecd4a","Type":"ContainerDied","Data":"552b70f0a07dfc0a35589ca2e4b524aad06f3b481cf9fd381f8830f3e4bb7734"} Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.687983 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bg8df_b9642035-7395-4ead-9da4-4d01f497a80b/openstack-network-exporter/0.log" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.688021 4756 generic.go:334] "Generic (PLEG): container finished" podID="b9642035-7395-4ead-9da4-4d01f497a80b" containerID="aaa563c0b40764c0748232c6188d4ab9d2d109f170578ae13acab5b936dcc5c5" exitCode=2 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.688079 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bg8df" event={"ID":"b9642035-7395-4ead-9da4-4d01f497a80b","Type":"ContainerDied","Data":"aaa563c0b40764c0748232c6188d4ab9d2d109f170578ae13acab5b936dcc5c5"} Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.690611 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.690920 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api-log" containerID="cri-o://b2292cd9b3a9b3b5abd290889833f8a1801c60128dcd06bb289b998128e70716" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.691041 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api" containerID="cri-o://6d0787e75ae992dd2b3dc29096ca4f2b71204bd07d887523f0de467c4e4688d4" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.699153 4756 generic.go:334] "Generic (PLEG): container finished" podID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerID="3d84600ac906281768c60200071669b5bea31ab8b67b8f4f35c9c9885c5ed8d8" exitCode=2 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.699219 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683","Type":"ContainerDied","Data":"3d84600ac906281768c60200071669b5bea31ab8b67b8f4f35c9c9885c5ed8d8"} Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.718395 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi58bc-account-delete-cdg92"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.730429 4756 generic.go:334] "Generic (PLEG): container finished" podID="6aa745f2-e987-4c61-84bd-f40224e4a4a0" containerID="7be737e28ed352a4c08f3c99de7ce6f9d006ea35ab29b304739115ae17238871" exitCode=137 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 
16:41:02.747063 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747543 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-server" containerID="cri-o://dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747674 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-server" containerID="cri-o://821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747766 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-updater" containerID="cri-o://1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747820 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-auditor" containerID="cri-o://e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747863 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-replicator" containerID="cri-o://76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747903 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-server" containerID="cri-o://ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747944 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="swift-recon-cron" containerID="cri-o://247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747945 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-reaper" containerID="cri-o://0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747983 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="rsync" containerID="cri-o://9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.747991 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-auditor" 
containerID="cri-o://fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.748016 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-expirer" containerID="cri-o://9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.748018 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-replicator" containerID="cri-o://58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.748058 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-auditor" containerID="cri-o://4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.748072 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-replicator" containerID="cri-o://05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.748089 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-updater" containerID="cri-o://5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.750792 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2586474d-e368-428a-a86b-1ec95b8006a7-operator-scripts\") pod \"novaapi58bc-account-delete-cdg92\" (UID: \"2586474d-e368-428a-a86b-1ec95b8006a7\") " pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.750875 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc4b1e45-87c0-471f-beb9-70bce6933d47-operator-scripts\") pod \"novacell0bc19-account-delete-xlr4r\" (UID: \"cc4b1e45-87c0-471f-beb9-70bce6933d47\") " pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.750906 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kflx7\" (UniqueName: \"kubernetes.io/projected/cc4b1e45-87c0-471f-beb9-70bce6933d47-kube-api-access-kflx7\") pod \"novacell0bc19-account-delete-xlr4r\" (UID: \"cc4b1e45-87c0-471f-beb9-70bce6933d47\") " pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.750931 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmbq6\" (UniqueName: \"kubernetes.io/projected/2586474d-e368-428a-a86b-1ec95b8006a7-kube-api-access-dmbq6\") pod \"novaapi58bc-account-delete-cdg92\" (UID: \"2586474d-e368-428a-a86b-1ec95b8006a7\") " pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:02 crc 
kubenswrapper[4756]: I1205 16:41:02.752093 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc4b1e45-87c0-471f-beb9-70bce6933d47-operator-scripts\") pod \"novacell0bc19-account-delete-xlr4r\" (UID: \"cc4b1e45-87c0-471f-beb9-70bce6933d47\") " pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.797352 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="ovsdbserver-sb" containerID="cri-o://6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285" gracePeriod=300 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.812585 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kflx7\" (UniqueName: \"kubernetes.io/projected/cc4b1e45-87c0-471f-beb9-70bce6933d47-kube-api-access-kflx7\") pod \"novacell0bc19-account-delete-xlr4r\" (UID: \"cc4b1e45-87c0-471f-beb9-70bce6933d47\") " pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.828974 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b545bc8c5-xcv4s"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.829197 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b545bc8c5-xcv4s" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" containerName="neutron-api" containerID="cri-o://f64d7df36b795e3694f688c94ab0720253fe8c0c08bc8d7f803bb4019f348b81" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.829738 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b545bc8c5-xcv4s" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" containerName="neutron-httpd" containerID="cri-o://c7af4d58a9c44e723d1f4b250fb4689fe9457696570e5e229b0f57ec7ad64a29" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.866864 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbq6\" (UniqueName: \"kubernetes.io/projected/2586474d-e368-428a-a86b-1ec95b8006a7-kube-api-access-dmbq6\") pod \"novaapi58bc-account-delete-cdg92\" (UID: \"2586474d-e368-428a-a86b-1ec95b8006a7\") " pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.866987 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2586474d-e368-428a-a86b-1ec95b8006a7-operator-scripts\") pod \"novaapi58bc-account-delete-cdg92\" (UID: \"2586474d-e368-428a-a86b-1ec95b8006a7\") " pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.867740 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2586474d-e368-428a-a86b-1ec95b8006a7-operator-scripts\") pod \"novaapi58bc-account-delete-cdg92\" (UID: \"2586474d-e368-428a-a86b-1ec95b8006a7\") " pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.889651 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-z8mgx"] Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.906163 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of 6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285 is running failed: container process not found" containerID="6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.907398 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285 is running failed: container process not found" containerID="6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.907880 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285 is running failed: container process not found" containerID="6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.907908 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="ovsdbserver-sb" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.928757 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmbq6\" (UniqueName: \"kubernetes.io/projected/2586474d-e368-428a-a86b-1ec95b8006a7-kube-api-access-dmbq6\") pod \"novaapi58bc-account-delete-cdg92\" (UID: \"2586474d-e368-428a-a86b-1ec95b8006a7\") " pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.928836 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-668cbcb89b-dvw88"] Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.929054 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-668cbcb89b-dvw88" podUID="84d62427-bd91-4330-be20-152bc3508447" containerName="placement-log" containerID="cri-o://a99c90cd22f1473b85f31a90f8f7912ef7f574ee4089fbd8fd4b2995f484d61c" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.929476 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-668cbcb89b-dvw88" podUID="84d62427-bd91-4330-be20-152bc3508447" containerName="placement-api" containerID="cri-o://a005af597ef26ea0b48f5743eb681caeba6662ee90634b36c5befe70f580e2a0" gracePeriod=30 Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.967051 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-z8mgx"] Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.972152 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 16:41:02 crc kubenswrapper[4756]: E1205 16:41:02.972222 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data podName:11d0ad48-c987-4415-8536-b150536534a5 nodeName:}" failed. 
No retries permitted until 2025-12-05 16:41:04.97220405 +0000 UTC m=+1525.210877572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data") pod "rabbitmq-server-0" (UID: "11d0ad48-c987-4415-8536-b150536534a5") : configmap "rabbitmq-config-data" not found Dec 05 16:41:02 crc kubenswrapper[4756]: I1205 16:41:02.992737 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.034933 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.035191 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-log" containerID="cri-o://8f9e222e4bcb63d553ca65dc2988c4b494ff464ecf438d7b80744af7ca09bdb8" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.036553 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-api" containerID="cri-o://04391e895ac7c022c16d69dfd03bf0f8cab399a37640b4aebf07a28dfbf58d41" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.045347 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6fd4b854cb-rm9k8"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.045790 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerName="barbican-keystone-listener-log" containerID="cri-o://04daa8a7964206ec50253f25a3104d1d3037a3b4b32afa5015cf06cab5b9fd7d" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.045942 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerName="barbican-keystone-listener" containerID="cri-o://f08e0d462ffd45dfa7950e9164ec47234418bfc47d5218d2a85f0732c1d92391" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.097265 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-9d6468757-b9ttg"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.098780 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-9d6468757-b9ttg" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" containerName="barbican-worker-log" containerID="cri-o://481bb590f2466b1ec99a35fcf296aa1fc3174e0cc5b933c289f817ad96130df5" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.105128 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-9d6468757-b9ttg" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" containerName="barbican-worker" containerID="cri-o://b4e4e8a043bf4f88e1edd5b6dd41710a7149c49c31bb40ac6714307c37009357" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.108441 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54c6d9b488-h9hmj"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.108838 4756 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-54c6d9b488-h9hmj" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api-log" containerID="cri-o://25d37ab5592969027f78161ad8c13da0276bef9030f7a390f77c1e2d2d113f8c" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.109160 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54c6d9b488-h9hmj" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api" containerID="cri-o://bfa23ea411e7c12b52777bbf724439990757136fd7d85b535a806320b41f695b" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.122606 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.126629 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-log" containerID="cri-o://8c85479e2fe59de4f67537a2a6d08e3945b963ac89af5c01b49b2eb958a942e5" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.127216 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-metadata" containerID="cri-o://fe8069b814e870c4e578733a24b86485d43263b291fd3dcc6e53ead33749cc50" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.148660 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-647b5b9577-j6wcs"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.149056 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-647b5b9577-j6wcs" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-httpd" containerID="cri-o://1b4671c01e0fecf9173531aafb787e57c3c0f19f93680f966bff65b5e243cf8c" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.149255 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-647b5b9577-j6wcs" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-server" containerID="cri-o://40fa47072fd0322fd7094036691019fba18613d3d6531f5dd1a0acf458462514" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.170317 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.170601 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c9451d43-4e5f-429b-b364-9220afe19523" containerName="nova-scheduler-scheduler" containerID="cri-o://52f8fd30a92226261f0f4bd124f526c8082068adbdd24e6aad126c08d81c0f83" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.190110 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7dk\" (UniqueName: \"kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk\") pod \"novacell16b06-account-delete-q6cdw\" (UID: \"1dabf9eb-5097-4d26-b64c-00938b0638f6\") " pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.190265 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts\") pod 
\"novacell16b06-account-delete-q6cdw\" (UID: \"1dabf9eb-5097-4d26-b64c-00938b0638f6\") " pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.194531 4756 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.194585 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts podName:1dabf9eb-5097-4d26-b64c-00938b0638f6 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:04.194571334 +0000 UTC m=+1524.433244846 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts") pod "novacell16b06-account-delete-q6cdw" (UID: "1dabf9eb-5097-4d26-b64c-00938b0638f6") : configmap "openstack-cell1-scripts" not found Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.202130 4756 projected.go:194] Error preparing data for projected volume kube-api-access-jb7dk for pod openstack/novacell16b06-account-delete-q6cdw: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.202192 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk podName:1dabf9eb-5097-4d26-b64c-00938b0638f6 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:04.202176601 +0000 UTC m=+1524.440850113 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jb7dk" (UniqueName: "kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk") pod "novacell16b06-account-delete-q6cdw" (UID: "1dabf9eb-5097-4d26-b64c-00938b0638f6") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.208240 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272 is running failed: container process not found" containerID="ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.210065 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="331486fe-c403-4516-9568-862200ec0672" containerName="rabbitmq" containerID="cri-o://fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15" gracePeriod=604800 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.229405 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.263805 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272 is running failed: container process not found" containerID="ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.273553 4756 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272 is running failed: container process not found" containerID="ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.273622 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-vsmjz" podUID="359e87f3-8fb4-4476-a538-13c948b65885" containerName="ovn-controller" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.299182 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.328478 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrzn4"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.374048 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrzn4"] Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.383560 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea is running failed: container process not found" containerID="e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.383713 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.383910 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="187070cf-06f5-4c25-851f-2e1f294c3b7c" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.392006 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4dl5v"] Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.413761 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea is running failed: container process not found" containerID="e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.428492 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.428774 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="dc0118af-131e-409a-a18b-0be03f3ce81b" containerName="nova-cell1-conductor-conductor" containerID="cri-o://14a9c94a34d46580da78a252edc6fc101e1053fd75e54da245fe2728758527ee" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.442748 4756 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea is running failed: container process not found" containerID="e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.442824 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="ovsdbserver-nb" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.448550 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4dl5v"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.450483 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.473287 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-bsbtj"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.500684 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.504596 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.504898 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1f517228-5007-4e3c-a41b-1e4b636446c0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3314201eb3f0a43ba4e986ea1cf96160fc430f0101df1aa8d597f6eafb4f4d90" gracePeriod=30 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.518219 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.529495 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6b06-account-create-update-tjk9b"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.539396 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3f72429c-8e1e-4c2d-8417-c499c22013a9/ovsdbserver-nb/0.log" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.539495 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.548668 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell16b06-account-delete-q6cdw"] Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.549650 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-jb7dk operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/novacell16b06-account-delete-q6cdw" podUID="1dabf9eb-5097-4d26-b64c-00938b0638f6" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.560444 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bsbtj"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.601490 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6b06-account-create-update-tjk9b"] Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.602851 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="11d0ad48-c987-4415-8536-b150536534a5" containerName="rabbitmq" containerID="cri-o://b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff" gracePeriod=604800 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.647105 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-647b5b9577-j6wcs" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.165:8080/healthcheck\": dial tcp 10.217.0.165:8080: connect: connection refused" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.647752 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-647b5b9577-j6wcs" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.165:8080/healthcheck\": dial tcp 10.217.0.165:8080: connect: connection refused" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.651130 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-metrics-certs-tls-certs\") pod \"3f72429c-8e1e-4c2d-8417-c499c22013a9\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.651191 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"3f72429c-8e1e-4c2d-8417-c499c22013a9\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.651372 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-config\") pod \"3f72429c-8e1e-4c2d-8417-c499c22013a9\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.652708 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-config" (OuterVolumeSpecName: "config") pod "3f72429c-8e1e-4c2d-8417-c499c22013a9" (UID: "3f72429c-8e1e-4c2d-8417-c499c22013a9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.661617 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdb-rundir\") pod \"3f72429c-8e1e-4c2d-8417-c499c22013a9\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.661664 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdbserver-nb-tls-certs\") pod \"3f72429c-8e1e-4c2d-8417-c499c22013a9\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.661748 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4q5b\" (UniqueName: \"kubernetes.io/projected/3f72429c-8e1e-4c2d-8417-c499c22013a9-kube-api-access-k4q5b\") pod \"3f72429c-8e1e-4c2d-8417-c499c22013a9\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.661791 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-scripts\") pod \"3f72429c-8e1e-4c2d-8417-c499c22013a9\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.661820 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-combined-ca-bundle\") pod \"3f72429c-8e1e-4c2d-8417-c499c22013a9\" (UID: \"3f72429c-8e1e-4c2d-8417-c499c22013a9\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.662764 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.674033 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3f72429c-8e1e-4c2d-8417-c499c22013a9" (UID: "3f72429c-8e1e-4c2d-8417-c499c22013a9"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.676302 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-scripts" (OuterVolumeSpecName: "scripts") pod "3f72429c-8e1e-4c2d-8417-c499c22013a9" (UID: "3f72429c-8e1e-4c2d-8417-c499c22013a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.684216 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "3f72429c-8e1e-4c2d-8417-c499c22013a9" (UID: "3f72429c-8e1e-4c2d-8417-c499c22013a9"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.691799 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f72429c-8e1e-4c2d-8417-c499c22013a9-kube-api-access-k4q5b" (OuterVolumeSpecName: "kube-api-access-k4q5b") pod "3f72429c-8e1e-4c2d-8417-c499c22013a9" (UID: "3f72429c-8e1e-4c2d-8417-c499c22013a9"). InnerVolumeSpecName "kube-api-access-k4q5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.743112 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.748386 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.748547 4756 generic.go:334] "Generic (PLEG): container finished" podID="f0fc7cfb-9160-471d-8aec-457769d46602" containerID="481bb590f2466b1ec99a35fcf296aa1fc3174e0cc5b933c289f817ad96130df5" exitCode=143 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.748605 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9d6468757-b9ttg" event={"ID":"f0fc7cfb-9160-471d-8aec-457769d46602","Type":"ContainerDied","Data":"481bb590f2466b1ec99a35fcf296aa1fc3174e0cc5b933c289f817ad96130df5"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.770038 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f72429c-8e1e-4c2d-8417-c499c22013a9-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.770106 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.770118 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.770128 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4q5b\" (UniqueName: \"kubernetes.io/projected/3f72429c-8e1e-4c2d-8417-c499c22013a9-kube-api-access-k4q5b\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.786069 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" event={"ID":"c609265d-528c-4c9c-bd1c-b735775d9acc","Type":"ContainerDied","Data":"adda2863650eccc6f9766800246a30289626cae346ab5113ebedcffa9fa120e0"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.786338 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf4ff87b5-954sn" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.786931 4756 scope.go:117] "RemoveContainer" containerID="6d5b71ea955f60a17745011df48bb423d4c7c7c82b3f516d2feb518405839e09" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.809952 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3f72429c-8e1e-4c2d-8417-c499c22013a9/ovsdbserver-nb/0.log" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.810172 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.811238 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f72429c-8e1e-4c2d-8417-c499c22013a9","Type":"ContainerDied","Data":"9592012620c7f3b4bbf1f01ba218cb5ce66bbfbb0ee270d98879ada47358f432"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.820224 4756 generic.go:334] "Generic (PLEG): container finished" podID="359e87f3-8fb4-4476-a538-13c948b65885" containerID="ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272" exitCode=0 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.820308 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsmjz" event={"ID":"359e87f3-8fb4-4476-a538-13c948b65885","Type":"ContainerDied","Data":"ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.823001 4756 generic.go:334] "Generic (PLEG): container finished" podID="541f7de0-3773-4e28-9881-b552367f1c89" containerID="1b4671c01e0fecf9173531aafb787e57c3c0f19f93680f966bff65b5e243cf8c" exitCode=0 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.823043 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-647b5b9577-j6wcs" event={"ID":"541f7de0-3773-4e28-9881-b552367f1c89","Type":"ContainerDied","Data":"1b4671c01e0fecf9173531aafb787e57c3c0f19f93680f966bff65b5e243cf8c"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.836158 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.872637 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-svc\") pod \"c609265d-528c-4c9c-bd1c-b735775d9acc\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.872750 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp2xv\" (UniqueName: \"kubernetes.io/projected/c609265d-528c-4c9c-bd1c-b735775d9acc-kube-api-access-fp2xv\") pod \"c609265d-528c-4c9c-bd1c-b735775d9acc\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.872768 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-swift-storage-0\") pod \"c609265d-528c-4c9c-bd1c-b735775d9acc\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.872832 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-sb\") pod \"c609265d-528c-4c9c-bd1c-b735775d9acc\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.872856 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config-secret\") pod \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " Dec 05 16:41:03 crc kubenswrapper[4756]: 
I1205 16:41:03.872913 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-nb\") pod \"c609265d-528c-4c9c-bd1c-b735775d9acc\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.872933 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-combined-ca-bundle\") pod \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.872953 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-config\") pod \"c609265d-528c-4c9c-bd1c-b735775d9acc\" (UID: \"c609265d-528c-4c9c-bd1c-b735775d9acc\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.873001 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config\") pod \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.873021 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gg42\" (UniqueName: \"kubernetes.io/projected/6aa745f2-e987-4c61-84bd-f40224e4a4a0-kube-api-access-7gg42\") pod \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\" (UID: \"6aa745f2-e987-4c61-84bd-f40224e4a4a0\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.873947 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.874694 4756 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 05 16:41:03 crc kubenswrapper[4756]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 05 16:41:03 crc kubenswrapper[4756]: + source /usr/local/bin/container-scripts/functions Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNBridge=br-int Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNRemote=tcp:localhost:6642 Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNEncapType=geneve Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNAvailabilityZones= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ EnableChassisAsGateway=true Dec 05 16:41:03 crc kubenswrapper[4756]: ++ PhysicalNetworks= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNHostName= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 05 16:41:03 crc kubenswrapper[4756]: ++ ovs_dir=/var/lib/openvswitch Dec 05 16:41:03 crc kubenswrapper[4756]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 05 16:41:03 crc kubenswrapper[4756]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 05 16:41:03 crc kubenswrapper[4756]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + sleep 0.5 Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + sleep 0.5 Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + cleanup_ovsdb_server_semaphore Dec 05 16:41:03 crc kubenswrapper[4756]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 05 16:41:03 crc kubenswrapper[4756]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 05 16:41:03 crc kubenswrapper[4756]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-kdtg5" message=< Dec 05 16:41:03 crc kubenswrapper[4756]: Exiting ovsdb-server (5) [ OK ] Dec 05 16:41:03 crc kubenswrapper[4756]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 05 16:41:03 crc kubenswrapper[4756]: + source /usr/local/bin/container-scripts/functions Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNBridge=br-int Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNRemote=tcp:localhost:6642 Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNEncapType=geneve Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNAvailabilityZones= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ EnableChassisAsGateway=true Dec 05 16:41:03 crc kubenswrapper[4756]: ++ PhysicalNetworks= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNHostName= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 05 16:41:03 crc kubenswrapper[4756]: ++ ovs_dir=/var/lib/openvswitch Dec 05 16:41:03 crc kubenswrapper[4756]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 05 16:41:03 crc kubenswrapper[4756]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 05 16:41:03 crc kubenswrapper[4756]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + sleep 0.5 Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + sleep 0.5 Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + cleanup_ovsdb_server_semaphore Dec 05 16:41:03 crc kubenswrapper[4756]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 05 16:41:03 crc kubenswrapper[4756]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 05 16:41:03 crc kubenswrapper[4756]: > Dec 05 16:41:03 crc kubenswrapper[4756]: E1205 16:41:03.874741 4756 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 05 16:41:03 crc kubenswrapper[4756]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 05 16:41:03 crc kubenswrapper[4756]: + source /usr/local/bin/container-scripts/functions Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNBridge=br-int Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNRemote=tcp:localhost:6642 Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNEncapType=geneve Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNAvailabilityZones= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ EnableChassisAsGateway=true Dec 05 16:41:03 crc kubenswrapper[4756]: ++ PhysicalNetworks= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ OVNHostName= Dec 05 16:41:03 crc kubenswrapper[4756]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 05 16:41:03 crc kubenswrapper[4756]: ++ ovs_dir=/var/lib/openvswitch Dec 05 16:41:03 crc kubenswrapper[4756]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 05 16:41:03 crc kubenswrapper[4756]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 05 16:41:03 crc kubenswrapper[4756]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + sleep 0.5 Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + sleep 0.5 Dec 05 16:41:03 crc kubenswrapper[4756]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 05 16:41:03 crc kubenswrapper[4756]: + cleanup_ovsdb_server_semaphore Dec 05 16:41:03 crc kubenswrapper[4756]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 05 16:41:03 crc kubenswrapper[4756]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 05 16:41:03 crc kubenswrapper[4756]: > pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" containerID="cri-o://664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.874813 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" containerID="cri-o://664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" gracePeriod=29 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.875028 4756 generic.go:334] "Generic (PLEG): container finished" podID="84d62427-bd91-4330-be20-152bc3508447" containerID="a99c90cd22f1473b85f31a90f8f7912ef7f574ee4089fbd8fd4b2995f484d61c" exitCode=143 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.875074 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668cbcb89b-dvw88" event={"ID":"84d62427-bd91-4330-be20-152bc3508447","Type":"ContainerDied","Data":"a99c90cd22f1473b85f31a90f8f7912ef7f574ee4089fbd8fd4b2995f484d61c"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.876035 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bg8df_b9642035-7395-4ead-9da4-4d01f497a80b/openstack-network-exporter/0.log" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.876102 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.901827 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3f72429c-8e1e-4c2d-8417-c499c22013a9" (UID: "3f72429c-8e1e-4c2d-8417-c499c22013a9"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.902501 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" containerID="cri-o://15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" gracePeriod=29 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.902612 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f72429c-8e1e-4c2d-8417-c499c22013a9" (UID: "3f72429c-8e1e-4c2d-8417-c499c22013a9"). InnerVolumeSpecName "combined-ca-bundle". 
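
The two err=< blocks above are one event logged twice (once by the exec handler, once when the PreStop hook is reported failed), and the '+'/'++' lines are bash xtrace output, so the hook can be read back out of the log. A reconstruction of /usr/local/bin/container-scripts/stop-ovsdb-server.sh follows; this is a sketch inferred from the trace, with the sourced functions file assumed to define the variables and cleanup_ovsdb_server_semaphore exactly as traced:

    #!/bin/bash
    set -x
    # Variables (DB_FILE, ovs_dir, SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE, ...)
    # come from the sourced functions file, per the '++' assignments in the trace.
    source "$(dirname "$0")/functions"
    # Poll until it is safe to stop ovsdb-server; the semaphore file is
    # assumed to be created elsewhere (e.g. by the ovs-vswitchd shutdown path).
    while [ ! -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE" ]; do
        sleep 0.5
    done
    cleanup_ovsdb_server_semaphore   # traced as: rm -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE"
    /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd

Exit status 137 is 128+9 (SIGKILL). A plausible reading: ovs-ctl did stop ovsdb-server ("Exiting ovsdb-server (5) [ OK ]" in the message block), which takes down the container's main process, so the exec session hosting the hook is killed before it can return cleanly; the kubelet then kills the container with the remaining gracePeriod=29 of the original 30 seconds.
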
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.924307 4756 scope.go:117] "RemoveContainer" containerID="be9e958b5cb052a79ce88fba09bec0f4f2bb4d1e023bd71c39ab015d97b9d54b" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.924528 4756 generic.go:334] "Generic (PLEG): container finished" podID="897f63f9-bf8d-438b-b35a-4195963786e3" containerID="8f9e222e4bcb63d553ca65dc2988c4b494ff464ecf438d7b80744af7ca09bdb8" exitCode=143 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.924615 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"897f63f9-bf8d-438b-b35a-4195963786e3","Type":"ContainerDied","Data":"8f9e222e4bcb63d553ca65dc2988c4b494ff464ecf438d7b80744af7ca09bdb8"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.933679 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c609265d-528c-4c9c-bd1c-b735775d9acc-kube-api-access-fp2xv" (OuterVolumeSpecName: "kube-api-access-fp2xv") pod "c609265d-528c-4c9c-bd1c-b735775d9acc" (UID: "c609265d-528c-4c9c-bd1c-b735775d9acc"). InnerVolumeSpecName "kube-api-access-fp2xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.935064 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa745f2-e987-4c61-84bd-f40224e4a4a0-kube-api-access-7gg42" (OuterVolumeSpecName: "kube-api-access-7gg42") pod "6aa745f2-e987-4c61-84bd-f40224e4a4a0" (UID: "6aa745f2-e987-4c61-84bd-f40224e4a4a0"). InnerVolumeSpecName "kube-api-access-7gg42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.938189 4756 generic.go:334] "Generic (PLEG): container finished" podID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerID="b2292cd9b3a9b3b5abd290889833f8a1801c60128dcd06bb289b998128e70716" exitCode=143 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.938373 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9df5252d-16cf-4612-910c-8f37d10b48a7","Type":"ContainerDied","Data":"b2292cd9b3a9b3b5abd290889833f8a1801c60128dcd06bb289b998128e70716"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.960441 4756 generic.go:334] "Generic (PLEG): container finished" podID="2606acc0-f425-4595-8f5a-df1f009175ee" containerID="25d37ab5592969027f78161ad8c13da0276bef9030f7a390f77c1e2d2d113f8c" exitCode=143 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.960577 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54c6d9b488-h9hmj" event={"ID":"2606acc0-f425-4595-8f5a-df1f009175ee","Type":"ContainerDied","Data":"25d37ab5592969027f78161ad8c13da0276bef9030f7a390f77c1e2d2d113f8c"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.970849 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6aa745f2-e987-4c61-84bd-f40224e4a4a0" (UID: "6aa745f2-e987-4c61-84bd-f40224e4a4a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.975921 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovn-rundir\") pod \"b9642035-7395-4ead-9da4-4d01f497a80b\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.976460 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovs-rundir\") pod \"b9642035-7395-4ead-9da4-4d01f497a80b\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.976484 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-combined-ca-bundle\") pod \"b9642035-7395-4ead-9da4-4d01f497a80b\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.976507 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-metrics-certs-tls-certs\") pod \"b9642035-7395-4ead-9da4-4d01f497a80b\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.976536 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gckzq\" (UniqueName: \"kubernetes.io/projected/b9642035-7395-4ead-9da4-4d01f497a80b-kube-api-access-gckzq\") pod \"b9642035-7395-4ead-9da4-4d01f497a80b\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.976605 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9642035-7395-4ead-9da4-4d01f497a80b-config\") pod \"b9642035-7395-4ead-9da4-4d01f497a80b\" (UID: \"b9642035-7395-4ead-9da4-4d01f497a80b\") " Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.977060 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.977074 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.977086 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.977095 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gg42\" (UniqueName: \"kubernetes.io/projected/6aa745f2-e987-4c61-84bd-f40224e4a4a0-kube-api-access-7gg42\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.977105 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp2xv\" (UniqueName: \"kubernetes.io/projected/c609265d-528c-4c9c-bd1c-b735775d9acc-kube-api-access-fp2xv\") on node \"crc\" 
DevicePath \"\"" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.977976 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9642035-7395-4ead-9da4-4d01f497a80b-config" (OuterVolumeSpecName: "config") pod "b9642035-7395-4ead-9da4-4d01f497a80b" (UID: "b9642035-7395-4ead-9da4-4d01f497a80b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.980214 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b9642035-7395-4ead-9da4-4d01f497a80b" (UID: "b9642035-7395-4ead-9da4-4d01f497a80b"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.980274 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "b9642035-7395-4ead-9da4-4d01f497a80b" (UID: "b9642035-7395-4ead-9da4-4d01f497a80b"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.990024 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "3f72429c-8e1e-4c2d-8417-c499c22013a9" (UID: "3f72429c-8e1e-4c2d-8417-c499c22013a9"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.990195 4756 generic.go:334] "Generic (PLEG): container finished" podID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerID="04daa8a7964206ec50253f25a3104d1d3037a3b4b32afa5015cf06cab5b9fd7d" exitCode=143 Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.990276 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" event={"ID":"13fa1e9e-f893-4641-a55f-d0c209d43917","Type":"ContainerDied","Data":"04daa8a7964206ec50253f25a3104d1d3037a3b4b32afa5015cf06cab5b9fd7d"} Dec 05 16:41:03 crc kubenswrapper[4756]: I1205 16:41:03.999585 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9642035-7395-4ead-9da4-4d01f497a80b-kube-api-access-gckzq" (OuterVolumeSpecName: "kube-api-access-gckzq") pod "b9642035-7395-4ead-9da4-4d01f497a80b" (UID: "b9642035-7395-4ead-9da4-4d01f497a80b"). InnerVolumeSpecName "kube-api-access-gckzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.005831 4756 generic.go:334] "Generic (PLEG): container finished" podID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerID="25c6d91b70e3146a820e5c52b8f074d3f3bfefc49cb9b80d00c7a70aae08ff55" exitCode=143 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.005906 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f664c2b4-e517-427a-bbdc-8fba6d548a21","Type":"ContainerDied","Data":"25c6d91b70e3146a820e5c52b8f074d3f3bfefc49cb9b80d00c7a70aae08ff55"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.081338 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" containerName="galera" containerID="cri-o://41db2bc934dd1d8f5c3deaf39234aa1d852019a880bf3e665b6803cc5b7d2923" gracePeriod=30 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.081640 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f72429c-8e1e-4c2d-8417-c499c22013a9-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.082694 4756 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.082710 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gckzq\" (UniqueName: \"kubernetes.io/projected/b9642035-7395-4ead-9da4-4d01f497a80b-kube-api-access-gckzq\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.082723 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9642035-7395-4ead-9da4-4d01f497a80b-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.082832 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b9642035-7395-4ead-9da4-4d01f497a80b-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.084282 4756 generic.go:334] "Generic (PLEG): container finished" podID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerID="6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.094478 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ea8041-7117-4be6-9d19-d5d282c10364" path="/var/lib/kubelet/pods/48ea8041-7117-4be6-9d19-d5d282c10364/volumes" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.095329 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e3345b9-82e1-469d-8db0-0d5a8163c369" path="/var/lib/kubelet/pods/4e3345b9-82e1-469d-8db0-0d5a8163c369/volumes" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.096172 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f76f909-d1f6-4f2d-a398-5791bb8a14cf" path="/var/lib/kubelet/pods/6f76f909-d1f6-4f2d-a398-5791bb8a14cf/volumes" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.097801 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a878ef-27a9-49a2-be83-1a6ba34fab7f" 
path="/var/lib/kubelet/pods/90a878ef-27a9-49a2-be83-1a6ba34fab7f/volumes" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.098985 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a1ee9f-9806-472e-8630-a8312f0c3b99" path="/var/lib/kubelet/pods/b4a1ee9f-9806-472e-8630-a8312f0c3b99/volumes" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.099863 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bd0441-24bc-48f4-aeba-006c3b77419e" path="/var/lib/kubelet/pods/d0bd0441-24bc-48f4-aeba-006c3b77419e/volumes" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.100664 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb5a656-efa8-47ac-befc-2412854889f7" path="/var/lib/kubelet/pods/edb5a656-efa8-47ac-befc-2412854889f7/volumes" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.125669 4756 generic.go:334] "Generic (PLEG): container finished" podID="926e1895-6638-4861-be46-f06e94e4b4ae" containerID="c7af4d58a9c44e723d1f4b250fb4689fe9457696570e5e229b0f57ec7ad64a29" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.135865 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6aa745f2-e987-4c61-84bd-f40224e4a4a0" (UID: "6aa745f2-e987-4c61-84bd-f40224e4a4a0"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161334 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161363 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161373 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161381 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161388 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161395 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161403 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161410 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" 
containerID="e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161433 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161440 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161446 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.161453 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f" exitCode=0 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.164722 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-config" (OuterVolumeSpecName: "config") pod "c609265d-528c-4c9c-bd1c-b735775d9acc" (UID: "c609265d-528c-4c9c-bd1c-b735775d9acc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.172984 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.174725 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9642035-7395-4ead-9da4-4d01f497a80b" (UID: "b9642035-7395-4ead-9da4-4d01f497a80b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.188951 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.188980 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.188990 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.198858 4756 generic.go:334] "Generic (PLEG): container finished" podID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerID="8c85479e2fe59de4f67537a2a6d08e3945b963ac89af5c01b49b2eb958a942e5" exitCode=143 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.216010 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6aa745f2-e987-4c61-84bd-f40224e4a4a0" (UID: "6aa745f2-e987-4c61-84bd-f40224e4a4a0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.236830 4756 generic.go:334] "Generic (PLEG): container finished" podID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerID="ff50efe465d0847be81b8c502e77c5ea6e01d546de46512db17a70f058fa77b7" exitCode=143 Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.236931 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.252974 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c609265d-528c-4c9c-bd1c-b735775d9acc" (UID: "c609265d-528c-4c9c-bd1c-b735775d9acc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.286635 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c609265d-528c-4c9c-bd1c-b735775d9acc" (UID: "c609265d-528c-4c9c-bd1c-b735775d9acc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.291521 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c609265d-528c-4c9c-bd1c-b735775d9acc" (UID: "c609265d-528c-4c9c-bd1c-b735775d9acc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.291534 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7dk\" (UniqueName: \"kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk\") pod \"novacell16b06-account-delete-q6cdw\" (UID: \"1dabf9eb-5097-4d26-b64c-00938b0638f6\") " pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.292632 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts\") pod \"novacell16b06-account-delete-q6cdw\" (UID: \"1dabf9eb-5097-4d26-b64c-00938b0638f6\") " pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.292833 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.292852 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.292863 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aa745f2-e987-4c61-84bd-f40224e4a4a0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.292874 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: E1205 16:41:04.292919 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 05 16:41:04 crc kubenswrapper[4756]: E1205 16:41:04.292959 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data podName:331486fe-c403-4516-9568-862200ec0672 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:08.292943803 +0000 UTC m=+1528.531617315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data") pod "rabbitmq-cell1-server-0" (UID: "331486fe-c403-4516-9568-862200ec0672") : configmap "rabbitmq-cell1-config-data" not found Dec 05 16:41:04 crc kubenswrapper[4756]: E1205 16:41:04.293009 4756 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 05 16:41:04 crc kubenswrapper[4756]: E1205 16:41:04.293091 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts podName:1dabf9eb-5097-4d26-b64c-00938b0638f6 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:06.293067366 +0000 UTC m=+1526.531740998 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts") pod "novacell16b06-account-delete-q6cdw" (UID: "1dabf9eb-5097-4d26-b64c-00938b0638f6") : configmap "openstack-cell1-scripts" not found Dec 05 16:41:04 crc kubenswrapper[4756]: E1205 16:41:04.295978 4756 projected.go:194] Error preparing data for projected volume kube-api-access-jb7dk for pod openstack/novacell16b06-account-delete-q6cdw: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 05 16:41:04 crc kubenswrapper[4756]: E1205 16:41:04.296058 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk podName:1dabf9eb-5097-4d26-b64c-00938b0638f6 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:06.296029437 +0000 UTC m=+1526.534702949 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jb7dk" (UniqueName: "kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk") pod "novacell16b06-account-delete-q6cdw" (UID: "1dabf9eb-5097-4d26-b64c-00938b0638f6") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.311932 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c609265d-528c-4c9c-bd1c-b735775d9acc" (UID: "c609265d-528c-4c9c-bd1c-b735775d9acc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.325909 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b9642035-7395-4ead-9da4-4d01f497a80b" (UID: "b9642035-7395-4ead-9da4-4d01f497a80b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
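
The three MountVolume.SetUp failures above share one cause: the account-delete pod and rabbitmq-cell1-server-0 reference API objects that the ongoing teardown has already deleted, and the kubelet schedules retries with per-volume doubling backoff (durationBeforeRetry 2s for the pod's volumes here; already 4s for config-data, suggesting an earlier failed attempt). Hypothetical spot-checks for the three named objects (the oc invocations are illustrative, not from the log):

    # Each of these should return "NotFound" for as long as the mount errors persist.
    oc -n openstack get configmap rabbitmq-cell1-config-data
    oc -n openstack get configmap openstack-cell1-scripts
    oc -n openstack get serviceaccount galera-openstack-cell1
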
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.394217 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9642035-7395-4ead-9da4-4d01f497a80b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.394400 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c609265d-528c-4c9c-bd1c-b735775d9acc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: E1205 16:41:04.399146 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f517228_5007_4e3c_a41b_1e4b636446c0.slice/crio-3314201eb3f0a43ba4e986ea1cf96160fc430f0101df1aa8d597f6eafb4f4d90.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.455955 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"930a573c-21eb-46e3-827b-2ff167eecd4a","Type":"ContainerDied","Data":"6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456006 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b545bc8c5-xcv4s" event={"ID":"926e1895-6638-4861-be46-f06e94e4b4ae","Type":"ContainerDied","Data":"c7af4d58a9c44e723d1f4b250fb4689fe9457696570e5e229b0f57ec7ad64a29"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456066 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456080 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456126 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456137 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456149 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456158 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456166 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456175 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456190 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456198 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456207 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456218 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456233 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76057872-026a-4ecb-b628-75f7f6fd46e9","Type":"ContainerDied","Data":"8c85479e2fe59de4f67537a2a6d08e3945b963ac89af5c01b49b2eb958a942e5"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.456248 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4beba48e-5ac4-4d41-864b-ecfd8b039988","Type":"ContainerDied","Data":"ff50efe465d0847be81b8c502e77c5ea6e01d546de46512db17a70f058fa77b7"} Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.473941 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.474362 4756 scope.go:117] "RemoveContainer" containerID="85456f1848243765e2d473f760124a27191626f1650d37509a1eb62e5f3138fc" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.511374 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.515258 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.517588 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vsmjz" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.540835 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.543870 4756 scope.go:117] "RemoveContainer" containerID="e20df79125cd2c365e707fdb1cf60b40b6b4b579a5f0bd54d7e52f0b70e3d8ea" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.587847 4756 scope.go:117] "RemoveContainer" containerID="7be737e28ed352a4c08f3c99de7ce6f9d006ea35ab29b304739115ae17238871" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.603910 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdb-rundir\") pod \"930a573c-21eb-46e3-827b-2ff167eecd4a\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.603963 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run\") pod \"359e87f3-8fb4-4476-a538-13c948b65885\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604025 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pldz2\" (UniqueName: \"kubernetes.io/projected/930a573c-21eb-46e3-827b-2ff167eecd4a-kube-api-access-pldz2\") pod \"930a573c-21eb-46e3-827b-2ff167eecd4a\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604047 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-log-ovn\") pod \"359e87f3-8fb4-4476-a538-13c948b65885\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604069 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-combined-ca-bundle\") pod \"359e87f3-8fb4-4476-a538-13c948b65885\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604106 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-metrics-certs-tls-certs\") pod \"930a573c-21eb-46e3-827b-2ff167eecd4a\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604133 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run-ovn\") pod \"359e87f3-8fb4-4476-a538-13c948b65885\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604153 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-config\") pod \"930a573c-21eb-46e3-827b-2ff167eecd4a\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604181 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/359e87f3-8fb4-4476-a538-13c948b65885-scripts\") pod \"359e87f3-8fb4-4476-a538-13c948b65885\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604228 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"930a573c-21eb-46e3-827b-2ff167eecd4a\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604250 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-scripts\") pod \"930a573c-21eb-46e3-827b-2ff167eecd4a\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604296 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-ovn-controller-tls-certs\") pod \"359e87f3-8fb4-4476-a538-13c948b65885\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604323 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdbserver-sb-tls-certs\") pod \"930a573c-21eb-46e3-827b-2ff167eecd4a\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604342 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-combined-ca-bundle\") pod \"930a573c-21eb-46e3-827b-2ff167eecd4a\" (UID: \"930a573c-21eb-46e3-827b-2ff167eecd4a\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.604393 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc482\" (UniqueName: \"kubernetes.io/projected/359e87f3-8fb4-4476-a538-13c948b65885-kube-api-access-mc482\") pod \"359e87f3-8fb4-4476-a538-13c948b65885\" (UID: \"359e87f3-8fb4-4476-a538-13c948b65885\") " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.605392 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-config" (OuterVolumeSpecName: "config") pod "930a573c-21eb-46e3-827b-2ff167eecd4a" (UID: "930a573c-21eb-46e3-827b-2ff167eecd4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.605493 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run" (OuterVolumeSpecName: "var-run") pod "359e87f3-8fb4-4476-a538-13c948b65885" (UID: "359e87f3-8fb4-4476-a538-13c948b65885"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.605656 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "930a573c-21eb-46e3-827b-2ff167eecd4a" (UID: "930a573c-21eb-46e3-827b-2ff167eecd4a"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.608586 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "359e87f3-8fb4-4476-a538-13c948b65885" (UID: "359e87f3-8fb4-4476-a538-13c948b65885"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.608698 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359e87f3-8fb4-4476-a538-13c948b65885-scripts" (OuterVolumeSpecName: "scripts") pod "359e87f3-8fb4-4476-a538-13c948b65885" (UID: "359e87f3-8fb4-4476-a538-13c948b65885"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.608781 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "359e87f3-8fb4-4476-a538-13c948b65885" (UID: "359e87f3-8fb4-4476-a538-13c948b65885"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.609243 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-scripts" (OuterVolumeSpecName: "scripts") pod "930a573c-21eb-46e3-827b-2ff167eecd4a" (UID: "930a573c-21eb-46e3-827b-2ff167eecd4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.633210 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf4ff87b5-954sn"] Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.634698 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359e87f3-8fb4-4476-a538-13c948b65885-kube-api-access-mc482" (OuterVolumeSpecName: "kube-api-access-mc482") pod "359e87f3-8fb4-4476-a538-13c948b65885" (UID: "359e87f3-8fb4-4476-a538-13c948b65885"). InnerVolumeSpecName "kube-api-access-mc482". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.686451 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930a573c-21eb-46e3-827b-2ff167eecd4a-kube-api-access-pldz2" (OuterVolumeSpecName: "kube-api-access-pldz2") pod "930a573c-21eb-46e3-827b-2ff167eecd4a" (UID: "930a573c-21eb-46e3-827b-2ff167eecd4a"). InnerVolumeSpecName "kube-api-access-pldz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.686651 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "930a573c-21eb-46e3-827b-2ff167eecd4a" (UID: "930a573c-21eb-46e3-827b-2ff167eecd4a"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.722971 4756 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.723013 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pldz2\" (UniqueName: \"kubernetes.io/projected/930a573c-21eb-46e3-827b-2ff167eecd4a-kube-api-access-pldz2\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.723029 4756 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.723041 4756 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/359e87f3-8fb4-4476-a538-13c948b65885-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.723054 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.723066 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/359e87f3-8fb4-4476-a538-13c948b65885-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.724515 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.724554 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/930a573c-21eb-46e3-827b-2ff167eecd4a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.724566 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc482\" (UniqueName: \"kubernetes.io/projected/359e87f3-8fb4-4476-a538-13c948b65885-kube-api-access-mc482\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.724576 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.751400 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf4ff87b5-954sn"] Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.789753 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "930a573c-21eb-46e3-827b-2ff167eecd4a" (UID: "930a573c-21eb-46e3-827b-2ff167eecd4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.793572 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder2230-account-delete-jfjlh"] Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.831354 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.851799 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance1a2d-account-delete-vgqdz"] Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.858683 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.867706 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "359e87f3-8fb4-4476-a538-13c948b65885" (UID: "359e87f3-8fb4-4476-a538-13c948b65885"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.965913 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:04 crc kubenswrapper[4756]: I1205 16:41:04.965959 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: E1205 16:41:05.075064 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 16:41:05 crc kubenswrapper[4756]: E1205 16:41:05.075508 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data podName:11d0ad48-c987-4415-8536-b150536534a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:09.075488271 +0000 UTC m=+1529.314161783 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data") pod "rabbitmq-server-0" (UID: "11d0ad48-c987-4415-8536-b150536534a5") : configmap "rabbitmq-config-data" not found Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.140916 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "930a573c-21eb-46e3-827b-2ff167eecd4a" (UID: "930a573c-21eb-46e3-827b-2ff167eecd4a"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.175522 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.208712 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "930a573c-21eb-46e3-827b-2ff167eecd4a" (UID: "930a573c-21eb-46e3-827b-2ff167eecd4a"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.212175 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanc3ab-account-delete-ndr77"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.252546 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronbc5a-account-delete-wtsjj"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.256342 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance1a2d-account-delete-vgqdz" event={"ID":"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163","Type":"ContainerStarted","Data":"fd8fbeb5aa4e68791452eb3cdf0f6975a07c7438ce715874e542353447b5aba3"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.257907 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc3ab-account-delete-ndr77" event={"ID":"9639f2b9-2b61-4ab9-ab94-6c54d373662f","Type":"ContainerStarted","Data":"c941a18a891c805ed027d232a36e4d9d229670f42069ad4c59c6b942b30bd1af"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.263095 4756 generic.go:334] "Generic (PLEG): container finished" podID="1f517228-5007-4e3c-a41b-1e4b636446c0" containerID="3314201eb3f0a43ba4e986ea1cf96160fc430f0101df1aa8d597f6eafb4f4d90" exitCode=0 Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.263177 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f517228-5007-4e3c-a41b-1e4b636446c0","Type":"ContainerDied","Data":"3314201eb3f0a43ba4e986ea1cf96160fc430f0101df1aa8d597f6eafb4f4d90"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.263257 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f517228-5007-4e3c-a41b-1e4b636446c0","Type":"ContainerDied","Data":"3aa7ad4334643d0dfb3acf7cf802e64afb91deb087d7c9ca5087959cefc98cfa"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.263278 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aa7ad4334643d0dfb3acf7cf802e64afb91deb087d7c9ca5087959cefc98cfa" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.268040 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsmjz" event={"ID":"359e87f3-8fb4-4476-a538-13c948b65885","Type":"ContainerDied","Data":"3e70c93238337b86942115e08684235ead427eaf4b65639d07de66647a8b9cd5"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.268106 4756 scope.go:117] "RemoveContainer" containerID="ade01f42bbe77b89747984e4dc69936f10f823fef3986224b12446fc96eaf272" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.268244 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vsmjz" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.279748 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/930a573c-21eb-46e3-827b-2ff167eecd4a-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.282719 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "359e87f3-8fb4-4476-a538-13c948b65885" (UID: "359e87f3-8fb4-4476-a538-13c948b65885"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.288672 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2" exitCode=0 Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.288727 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4" exitCode=0 Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.288797 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.288830 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.294551 4756 generic.go:334] "Generic (PLEG): container finished" podID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerID="a4403245fba51bc317b8d7897b375b1d195288a8e13b1478f5ed78542abf0293" exitCode=0 Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.294602 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"509573b5-33f4-432c-bb2e-1fc9186ed072","Type":"ContainerDied","Data":"a4403245fba51bc317b8d7897b375b1d195288a8e13b1478f5ed78542abf0293"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.324028 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"930a573c-21eb-46e3-827b-2ff167eecd4a","Type":"ContainerDied","Data":"707ab58705624e64f1ef086e35b4a44addb664700c08627d7f41a84fc6487652"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.324140 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.335812 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder2230-account-delete-jfjlh" event={"ID":"6e2e416b-60db-4b09-b585-59911e192daf","Type":"ContainerStarted","Data":"eaf8147f089331777c3150465522528cfcb38ec25fe36ed9336bb4c5205ca2bc"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.351210 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder2230-account-delete-jfjlh" podStartSLOduration=5.351191058 podStartE2EDuration="5.351191058s" podCreationTimestamp="2025-12-05 16:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:05.349403449 +0000 UTC m=+1525.588076961" watchObservedRunningTime="2025-12-05 16:41:05.351191058 +0000 UTC m=+1525.589864570" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.356606 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bg8df_b9642035-7395-4ead-9da4-4d01f497a80b/openstack-network-exporter/0.log" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.356734 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bg8df" event={"ID":"b9642035-7395-4ead-9da4-4d01f497a80b","Type":"ContainerDied","Data":"8ca8d782a42936d354b601b8dabd7a874bafa4a7d2349554876c0e32cc42076f"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.356846 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bg8df" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.375473 4756 generic.go:334] "Generic (PLEG): container finished" podID="541f7de0-3773-4e28-9881-b552367f1c89" containerID="40fa47072fd0322fd7094036691019fba18613d3d6531f5dd1a0acf458462514" exitCode=0 Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.375631 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-647b5b9577-j6wcs" event={"ID":"541f7de0-3773-4e28-9881-b552367f1c89","Type":"ContainerDied","Data":"40fa47072fd0322fd7094036691019fba18613d3d6531f5dd1a0acf458462514"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.384099 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/359e87f3-8fb4-4476-a538-13c948b65885-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.398879 4756 generic.go:334] "Generic (PLEG): container finished" podID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" exitCode=0 Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.399009 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell16b06-account-delete-q6cdw" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.400526 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kdtg5" event={"ID":"40ff87cb-4d82-49d4-a4d9-55e104c9becb","Type":"ContainerDied","Data":"664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b"} Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.408397 4756 scope.go:117] "RemoveContainer" containerID="552b70f0a07dfc0a35589ca2e4b524aad06f3b481cf9fd381f8830f3e4bb7734" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.457651 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement5c6b-account-delete-nhd8d"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.550165 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.575203 4756 scope.go:117] "RemoveContainer" containerID="6fbdbf2325367baff55e8b846433a59f4bbca7569a3296fb749614fe4ac3a285" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.590335 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-vencrypt-tls-certs\") pod \"1f517228-5007-4e3c-a41b-1e4b636446c0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.590400 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl5hw\" (UniqueName: \"kubernetes.io/projected/1f517228-5007-4e3c-a41b-1e4b636446c0-kube-api-access-vl5hw\") pod \"1f517228-5007-4e3c-a41b-1e4b636446c0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.590444 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-nova-novncproxy-tls-certs\") pod \"1f517228-5007-4e3c-a41b-1e4b636446c0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.590514 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-combined-ca-bundle\") pod \"1f517228-5007-4e3c-a41b-1e4b636446c0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.590571 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-config-data\") pod \"1f517228-5007-4e3c-a41b-1e4b636446c0\" (UID: \"1f517228-5007-4e3c-a41b-1e4b636446c0\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.607667 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f517228-5007-4e3c-a41b-1e4b636446c0-kube-api-access-vl5hw" (OuterVolumeSpecName: "kube-api-access-vl5hw") pod "1f517228-5007-4e3c-a41b-1e4b636446c0" (UID: "1f517228-5007-4e3c-a41b-1e4b636446c0"). InnerVolumeSpecName "kube-api-access-vl5hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.622260 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.626562 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-config-data" (OuterVolumeSpecName: "config-data") pod "1f517228-5007-4e3c-a41b-1e4b636446c0" (UID: "1f517228-5007-4e3c-a41b-1e4b636446c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.666266 4756 scope.go:117] "RemoveContainer" containerID="aaa563c0b40764c0748232c6188d4ab9d2d109f170578ae13acab5b936dcc5c5" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.675381 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell16b06-account-delete-q6cdw"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.689334 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "1f517228-5007-4e3c-a41b-1e4b636446c0" (UID: "1f517228-5007-4e3c-a41b-1e4b636446c0"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692126 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjk8k\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-kube-api-access-pjk8k\") pod \"541f7de0-3773-4e28-9881-b552367f1c89\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692180 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-combined-ca-bundle\") pod \"541f7de0-3773-4e28-9881-b552367f1c89\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692245 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-log-httpd\") pod \"541f7de0-3773-4e28-9881-b552367f1c89\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692281 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-run-httpd\") pod \"541f7de0-3773-4e28-9881-b552367f1c89\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692321 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-etc-swift\") pod \"541f7de0-3773-4e28-9881-b552367f1c89\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692359 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-public-tls-certs\") pod \"541f7de0-3773-4e28-9881-b552367f1c89\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692429 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-config-data\") pod \"541f7de0-3773-4e28-9881-b552367f1c89\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692479 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-internal-tls-certs\") pod \"541f7de0-3773-4e28-9881-b552367f1c89\" (UID: \"541f7de0-3773-4e28-9881-b552367f1c89\") " Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692876 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "541f7de0-3773-4e28-9881-b552367f1c89" (UID: "541f7de0-3773-4e28-9881-b552367f1c89"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.692974 4756 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.693081 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl5hw\" (UniqueName: \"kubernetes.io/projected/1f517228-5007-4e3c-a41b-1e4b636446c0-kube-api-access-vl5hw\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.693144 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.693513 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "541f7de0-3773-4e28-9881-b552367f1c89" (UID: "541f7de0-3773-4e28-9881-b552367f1c89"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.715373 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f517228-5007-4e3c-a41b-1e4b636446c0" (UID: "1f517228-5007-4e3c-a41b-1e4b636446c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.727848 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-kube-api-access-pjk8k" (OuterVolumeSpecName: "kube-api-access-pjk8k") pod "541f7de0-3773-4e28-9881-b552367f1c89" (UID: "541f7de0-3773-4e28-9881-b552367f1c89"). InnerVolumeSpecName "kube-api-access-pjk8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.737166 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "541f7de0-3773-4e28-9881-b552367f1c89" (UID: "541f7de0-3773-4e28-9881-b552367f1c89"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.741580 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell16b06-account-delete-q6cdw"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.750636 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.758845 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.773979 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi58bc-account-delete-cdg92"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.782490 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-bg8df"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.790304 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-bg8df"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.794597 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dabf9eb-5097-4d26-b64c-00938b0638f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.794627 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb7dk\" (UniqueName: \"kubernetes.io/projected/1dabf9eb-5097-4d26-b64c-00938b0638f6-kube-api-access-jb7dk\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.794636 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjk8k\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-kube-api-access-pjk8k\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.794644 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.794655 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/541f7de0-3773-4e28-9881-b552367f1c89-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.794663 4756 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/541f7de0-3773-4e28-9881-b552367f1c89-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.794672 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.800569 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vsmjz"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.815885 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vsmjz"] Dec 05 16:41:05 crc kubenswrapper[4756]: I1205 16:41:05.879292 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0bc19-account-delete-xlr4r"] Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.122779 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1dabf9eb-5097-4d26-b64c-00938b0638f6" path="/var/lib/kubelet/pods/1dabf9eb-5097-4d26-b64c-00938b0638f6/volumes" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.123499 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359e87f3-8fb4-4476-a538-13c948b65885" path="/var/lib/kubelet/pods/359e87f3-8fb4-4476-a538-13c948b65885/volumes" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.124458 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" path="/var/lib/kubelet/pods/3f72429c-8e1e-4c2d-8417-c499c22013a9/volumes" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.125170 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa745f2-e987-4c61-84bd-f40224e4a4a0" path="/var/lib/kubelet/pods/6aa745f2-e987-4c61-84bd-f40224e4a4a0/volumes" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.153056 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" path="/var/lib/kubelet/pods/930a573c-21eb-46e3-827b-2ff167eecd4a/volumes" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.153969 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9642035-7395-4ead-9da4-4d01f497a80b" path="/var/lib/kubelet/pods/b9642035-7395-4ead-9da4-4d01f497a80b/volumes" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.154079 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.162:8776/healthcheck\": read tcp 10.217.0.2:37672->10.217.0.162:8776: read: connection reset by peer" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.164475 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c609265d-528c-4c9c-bd1c-b735775d9acc" path="/var/lib/kubelet/pods/c609265d-528c-4c9c-bd1c-b735775d9acc/volumes" Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.319693 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.342777 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.347405 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.347497 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="ovn-northd" Dec 
05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.475537 4756 generic.go:334] "Generic (PLEG): container finished" podID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerID="6d0787e75ae992dd2b3dc29096ca4f2b71204bd07d887523f0de467c4e4688d4" exitCode=0 Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.497155 4756 generic.go:334] "Generic (PLEG): container finished" podID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerID="ee842d9afcdec61635c6c49c87079fe579d522f6aece80d8d0ea7d52afa84d53" exitCode=0 Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.556985 4756 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/glance1a2d-account-delete-vgqdz" secret="" err="secret \"galera-openstack-dockercfg-br626\" not found" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.558612 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "541f7de0-3773-4e28-9881-b552367f1c89" (UID: "541f7de0-3773-4e28-9881-b552367f1c89"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.575433 4756 generic.go:334] "Generic (PLEG): container finished" podID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerID="1c001a9c870ca4bb31c57a0eaad2bee3efc8b47aef90bc4502a5337603659d8b" exitCode=0 Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.584131 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.617661 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "1f517228-5007-4e3c-a41b-1e4b636446c0" (UID: "1f517228-5007-4e3c-a41b-1e4b636446c0"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.618928 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "541f7de0-3773-4e28-9881-b552367f1c89" (UID: "541f7de0-3773-4e28-9881-b552367f1c89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.675653 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-config-data" (OuterVolumeSpecName: "config-data") pod "541f7de0-3773-4e28-9881-b552367f1c89" (UID: "541f7de0-3773-4e28-9881-b552367f1c89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.676807 4756 generic.go:334] "Generic (PLEG): container finished" podID="6e2e416b-60db-4b09-b585-59911e192daf" containerID="7cbefc8071abfa1c8c1d0b2908585ff763fc9701a11ed995f99f21ef8c00a196" exitCode=0 Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.687956 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.688017 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts podName:735bacc2-aeb8-4f2f-8f1a-bf3d092fd163 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:07.188002289 +0000 UTC m=+1527.426675801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts") pod "glance1a2d-account-delete-vgqdz" (UID: "735bacc2-aeb8-4f2f-8f1a-bf3d092fd163") : configmap "openstack-scripts" not found Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.688288 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.688316 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.688330 4756 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f517228-5007-4e3c-a41b-1e4b636446c0-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.697252 4756 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutronbc5a-account-delete-wtsjj" secret="" err="secret \"galera-openstack-dockercfg-br626\" not found" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.705540 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "541f7de0-3773-4e28-9881-b552367f1c89" (UID: "541f7de0-3773-4e28-9881-b552367f1c89"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.720747 4756 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement5c6b-account-delete-nhd8d" secret="" err="secret \"galera-openstack-dockercfg-br626\" not found" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.724322 4756 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/barbicanc3ab-account-delete-ndr77" secret="" err="secret \"galera-openstack-dockercfg-br626\" not found" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.726561 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance1a2d-account-delete-vgqdz" podStartSLOduration=6.72653678 podStartE2EDuration="6.72653678s" podCreationTimestamp="2025-12-05 16:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:06.589431682 +0000 UTC m=+1526.828105194" watchObservedRunningTime="2025-12-05 16:41:06.72653678 +0000 UTC m=+1526.965210292" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.748625 4756 generic.go:334] "Generic (PLEG): container finished" podID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" containerID="41db2bc934dd1d8f5c3deaf39234aa1d852019a880bf3e665b6803cc5b7d2923" exitCode=0 Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.759717 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-647b5b9577-j6wcs" Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.768959 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="14a9c94a34d46580da78a252edc6fc101e1053fd75e54da245fe2728758527ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.776433 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54c6d9b488-h9hmj" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:53204->10.217.0.155:9311: read: connection reset by peer" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.776715 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54c6d9b488-h9hmj" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:53196->10.217.0.155:9311: read: connection reset by peer" Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.785624 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="14a9c94a34d46580da78a252edc6fc101e1053fd75e54da245fe2728758527ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.791256 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541f7de0-3773-4e28-9881-b552367f1c89-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.791348 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.791406 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts podName:3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e nodeName:}" failed. No retries permitted until 2025-12-05 16:41:07.291382707 +0000 UTC m=+1527.530056219 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts") pod "placement5c6b-account-delete-nhd8d" (UID: "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e") : configmap "openstack-scripts" not found Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.791622 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.791651 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts podName:37e233df-e9ab-45c9-a335-f1182a3b8c79 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:07.291643764 +0000 UTC m=+1527.530317276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts") pod "neutronbc5a-account-delete-wtsjj" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79") : configmap "openstack-scripts" not found Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.792014 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.792038 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts podName:9639f2b9-2b61-4ab9-ab94-6c54d373662f nodeName:}" failed. No retries permitted until 2025-12-05 16:41:07.292031305 +0000 UTC m=+1527.530704817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts") pod "barbicanc3ab-account-delete-ndr77" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f") : configmap "openstack-scripts" not found Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.808173 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="14a9c94a34d46580da78a252edc6fc101e1053fd75e54da245fe2728758527ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 16:41:06 crc kubenswrapper[4756]: E1205 16:41:06.808220 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="dc0118af-131e-409a-a18b-0be03f3ce81b" containerName="nova-cell1-conductor-conductor" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.815773 4756 generic.go:334] "Generic (PLEG): container finished" podID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerID="8e8b38530a288e65246e8e282b7de364102d9ead1db22ed0d2d5b5a3b7cad231" exitCode=0 Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.815894 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.839828 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutronbc5a-account-delete-wtsjj" podStartSLOduration=5.839812158 podStartE2EDuration="5.839812158s" podCreationTimestamp="2025-12-05 16:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:06.715987803 +0000 UTC m=+1526.954661315" watchObservedRunningTime="2025-12-05 16:41:06.839812158 +0000 UTC m=+1527.078485670" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.856657 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement5c6b-account-delete-nhd8d" podStartSLOduration=5.856443311 podStartE2EDuration="5.856443311s" podCreationTimestamp="2025-12-05 16:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:06.746009091 +0000 UTC m=+1526.984682603" watchObservedRunningTime="2025-12-05 16:41:06.856443311 +0000 UTC m=+1527.095116823" Dec 05 16:41:06 crc kubenswrapper[4756]: I1205 16:41:06.873148 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbicanc3ab-account-delete-ndr77" podStartSLOduration=5.873130716 podStartE2EDuration="5.873130716s" podCreationTimestamp="2025-12-05 16:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:06.780783809 +0000 UTC m=+1527.019457321" watchObservedRunningTime="2025-12-05 16:41:06.873130716 +0000 UTC m=+1527.111804228" Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.203953 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.204033 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts podName:735bacc2-aeb8-4f2f-8f1a-bf3d092fd163 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:08.204017629 +0000 UTC m=+1528.442691141 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts") pod "glance1a2d-account-delete-vgqdz" (UID: "735bacc2-aeb8-4f2f-8f1a-bf3d092fd163") : configmap "openstack-scripts" not found Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.233702 4756 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.176s" Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.233775 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.233829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9df5252d-16cf-4612-910c-8f37d10b48a7","Type":"ContainerDied","Data":"6d0787e75ae992dd2b3dc29096ca4f2b71204bd07d887523f0de467c4e4688d4"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.233886 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4beba48e-5ac4-4d41-864b-ecfd8b039988","Type":"ContainerDied","Data":"ee842d9afcdec61635c6c49c87079fe579d522f6aece80d8d0ea7d52afa84d53"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234156 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234198 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234230 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance1a2d-account-delete-vgqdz" event={"ID":"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163","Type":"ContainerStarted","Data":"b462dd63e42cfa5cb78440495d7faec00b0f9c211eb3ed5ddcb9403c412b08b7"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234289 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"509573b5-33f4-432c-bb2e-1fc9186ed072","Type":"ContainerDied","Data":"1c001a9c870ca4bb31c57a0eaad2bee3efc8b47aef90bc4502a5337603659d8b"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234329 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-gmdc8"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234342 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nrzfl"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234356 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-gmdc8"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234373 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nrzfl"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.234396 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-857c4c465c-j99gg"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293130 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293201 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-6hl4j"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293223 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-6hl4j"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293241 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-0c76-account-create-update-st5sb"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293255 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0c76-account-create-update-st5sb"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293269 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder2230-account-delete-jfjlh" event={"ID":"6e2e416b-60db-4b09-b585-59911e192daf","Type":"ContainerDied","Data":"7cbefc8071abfa1c8c1d0b2908585ff763fc9701a11ed995f99f21ef8c00a196"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293298 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-qwgf6"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293313 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-qwgf6"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293327 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0bc19-account-delete-xlr4r" event={"ID":"cc4b1e45-87c0-471f-beb9-70bce6933d47","Type":"ContainerStarted","Data":"5252e226fa698ec8d62703bad74f73c22f2f2fca1bbcc716bbaed92366640a9b"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293347 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi58bc-account-delete-cdg92" event={"ID":"2586474d-e368-428a-a86b-1ec95b8006a7","Type":"ContainerStarted","Data":"50313c8fcba529cbbb960ae7922f99f11e92fc811b0e97a85eb316cc9cac36bd"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293362 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronbc5a-account-delete-wtsjj" event={"ID":"37e233df-e9ab-45c9-a335-f1182a3b8c79","Type":"ContainerStarted","Data":"693ab2a387b6a0c295bf6ab3d9d8b6730de073c73bc06fccbb3c0097dd28ae4a"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293376 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5c6b-account-create-update-kb6fq"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293389 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronbc5a-account-delete-wtsjj" event={"ID":"37e233df-e9ab-45c9-a335-f1182a3b8c79","Type":"ContainerStarted","Data":"2bb632277c50d0b113d70b858b3c484f729dd254b7423cf3b9d15aa508e4c585"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293403 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5c6b-account-create-update-kb6fq"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293437 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5c6b-account-delete-nhd8d" event={"ID":"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e","Type":"ContainerStarted","Data":"4d8ef07e1a07f5358ac394401715b8fe4e221ba774cd2b748b15afd94fa0426f"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293452 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement5c6b-account-delete-nhd8d"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293467 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc3ab-account-delete-ndr77" event={"ID":"9639f2b9-2b61-4ab9-ab94-6c54d373662f","Type":"ContainerStarted","Data":"d01e9052eadb9b19c59bb451fdb0fec97a060cfb91308a4e5a61cd10b153475f"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293483 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"881d2b4e-3f2e-4aca-a965-2508edc92a3b","Type":"ContainerDied","Data":"41db2bc934dd1d8f5c3deaf39234aa1d852019a880bf3e665b6803cc5b7d2923"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293504 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"881d2b4e-3f2e-4aca-a965-2508edc92a3b","Type":"ContainerDied","Data":"e44703d92d418215bf4058bbba49c988c5dd02387e4ce366e08a1a22a8daa411"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293517 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44703d92d418215bf4058bbba49c988c5dd02387e4ce366e08a1a22a8daa411" Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293537 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-647b5b9577-j6wcs" event={"ID":"541f7de0-3773-4e28-9881-b552367f1c89","Type":"ContainerDied","Data":"3745ded4e39a2011f659d17a30eb1ac6e00e0cafa518d9a8e7f02ca39ea34a5d"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293554 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f664c2b4-e517-427a-bbdc-8fba6d548a21","Type":"ContainerDied","Data":"8e8b38530a288e65246e8e282b7de364102d9ead1db22ed0d2d5b5a3b7cad231"} Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293574 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hjw2d"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293591 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hjw2d"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293610 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0bc19-account-delete-xlr4r"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293625 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-bc19-account-create-update-rxqln"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293642 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-bc19-account-create-update-rxqln"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293655 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5jhw5"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293669 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5jhw5"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293686 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi58bc-account-delete-cdg92"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.293700 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-58bc-account-create-update-r7d29"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.294091 4756 scope.go:117] "RemoveContainer" containerID="40fa47072fd0322fd7094036691019fba18613d3d6531f5dd1a0acf458462514" Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.295992 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-58bc-account-create-update-r7d29"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.296283 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="ceilometer-central-agent" containerID="cri-o://e3d6463e66e6a8de5357ab6e221c04b2cb882550348104db03b422d53b2565e5" gracePeriod=30 Dec 05 16:41:07 crc 
kubenswrapper[4756]: I1205 16:41:07.297380 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="797faab6-a6b3-48ae-90ca-09d251b2975f" containerName="kube-state-metrics" containerID="cri-o://9dcca87cd81ebc30ed3acfa39592365fba84c1006feaa6a068c8c8e226c1806a" gracePeriod=30 Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.297563 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="032af055-c0a6-4fda-90c8-c34ef9278613" containerName="memcached" containerID="cri-o://0d4580a5393a54bbc3f4d1d132fb4626624aacc2b800fb18a8d15e491de91652" gracePeriod=30 Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.298489 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="proxy-httpd" containerID="cri-o://6299d63206c3fbc25b59d632cb9c2f9bf041eee3b3e097b905b6d1db62b66c82" gracePeriod=30 Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.298559 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="sg-core" containerID="cri-o://e74f52a59e270b006a1fc5714de7ee321f0ef68eebd6a6ed247cd71177dbdaa8" gracePeriod=30 Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.298627 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="ceilometer-notification-agent" containerID="cri-o://bfcc2c269977aa6668ab1f24c749b53e9efa8a90bc0debb824bbc6f137672462" gracePeriod=30 Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.316870 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-857c4c465c-j99gg" podUID="87da04a1-7ee1-47e4-9bc7-1808ea506457" containerName="keystone-api" containerID="cri-o://84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed" gracePeriod=30 Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.325842 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.326180 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts podName:3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e nodeName:}" failed. No retries permitted until 2025-12-05 16:41:08.326158789 +0000 UTC m=+1528.564832301 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts") pod "placement5c6b-account-delete-nhd8d" (UID: "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e") : configmap "openstack-scripts" not found Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.326227 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.326255 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts podName:37e233df-e9ab-45c9-a335-f1182a3b8c79 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:08.326247222 +0000 UTC m=+1528.564920744 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts") pod "neutronbc5a-account-delete-wtsjj" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79") : configmap "openstack-scripts" not found Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.326289 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.326313 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts podName:9639f2b9-2b61-4ab9-ab94-6c54d373662f nodeName:}" failed. No retries permitted until 2025-12-05 16:41:08.326306033 +0000 UTC m=+1528.564979545 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts") pod "barbicanc3ab-account-delete-ndr77" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f") : configmap "openstack-scripts" not found Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.341934 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52f8fd30a92226261f0f4bd124f526c8082068adbdd24e6aad126c08d81c0f83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.344697 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.361452 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52f8fd30a92226261f0f4bd124f526c8082068adbdd24e6aad126c08d81c0f83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.374074 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52f8fd30a92226261f0f4bd124f526c8082068adbdd24e6aad126c08d81c0f83" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 16:41:07 crc kubenswrapper[4756]: E1205 16:41:07.374152 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c9451d43-4e5f-429b-b364-9220afe19523" containerName="nova-scheduler-scheduler" Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.377478 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-647b5b9577-j6wcs"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.388442 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-647b5b9577-j6wcs"] Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.404818 4756 util.go:48] "No ready sandbox for pod can be found. 
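[editor's note] The MountVolume.SetUp failures above share one root cause: the account-delete pods mount an "operator-scripts" volume backed by the openstack/openstack-scripts ConfigMap, which has already been deleted, so the kubelet cannot populate the volume and schedules a retry (durationBeforeRetry 1s here; the delay roughly doubles on later attempts, 2s and 8s further down in this log). A minimal client-go sketch, assuming a kubeconfig path, to confirm the ConfigMap's absence from outside the node:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumption: kubeconfig location; any config with read access
        // to the openstack namespace works.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
        if err != nil {
            panic(err)
        }
        client, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // The kubelet resolves the "operator-scripts" volume from this object;
        // a NotFound here matches the MountVolume.SetUp errors in the log.
        _, err = client.CoreV1().ConfigMaps("openstack").Get(
            context.TODO(), "openstack-scripts", metav1.GetOptions{})
        if err != nil {
            fmt.Println("lookup failed:", err)
            return
        }
        fmt.Println("configmap openstack-scripts exists")
    }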
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.404818 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.423490 4756 scope.go:117] "RemoveContainer" containerID="1b4671c01e0fecf9173531aafb787e57c3c0f19f93680f966bff65b5e243cf8c"
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.432911 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-generated\") pod \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.432974 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-combined-ca-bundle\") pod \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.432992 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-scripts\") pod \"509573b5-33f4-432c-bb2e-1fc9186ed072\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433047 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433119 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-default\") pod \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433153 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-operator-scripts\") pod \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433229 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data-custom\") pod \"509573b5-33f4-432c-bb2e-1fc9186ed072\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433257 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data\") pod \"509573b5-33f4-432c-bb2e-1fc9186ed072\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433285 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/509573b5-33f4-432c-bb2e-1fc9186ed072-etc-machine-id\") pod \"509573b5-33f4-432c-bb2e-1fc9186ed072\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433324 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7sks\" (UniqueName: \"kubernetes.io/projected/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kube-api-access-x7sks\") pod \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433354 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sksn6\" (UniqueName: \"kubernetes.io/projected/509573b5-33f4-432c-bb2e-1fc9186ed072-kube-api-access-sksn6\") pod \"509573b5-33f4-432c-bb2e-1fc9186ed072\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433384 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-galera-tls-certs\") pod \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433431 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-combined-ca-bundle\") pod \"509573b5-33f4-432c-bb2e-1fc9186ed072\" (UID: \"509573b5-33f4-432c-bb2e-1fc9186ed072\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.433488 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kolla-config\") pod \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\" (UID: \"881d2b4e-3f2e-4aca-a965-2508edc92a3b\") "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.446093 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/509573b5-33f4-432c-bb2e-1fc9186ed072-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "509573b5-33f4-432c-bb2e-1fc9186ed072" (UID: "509573b5-33f4-432c-bb2e-1fc9186ed072"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.447289 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "881d2b4e-3f2e-4aca-a965-2508edc92a3b" (UID: "881d2b4e-3f2e-4aca-a965-2508edc92a3b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.448035 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "881d2b4e-3f2e-4aca-a965-2508edc92a3b" (UID: "881d2b4e-3f2e-4aca-a965-2508edc92a3b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.449403 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "881d2b4e-3f2e-4aca-a965-2508edc92a3b" (UID: "881d2b4e-3f2e-4aca-a965-2508edc92a3b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.451201 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "881d2b4e-3f2e-4aca-a965-2508edc92a3b" (UID: "881d2b4e-3f2e-4aca-a965-2508edc92a3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.455066 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-default\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.455094 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.455104 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/509573b5-33f4-432c-bb2e-1fc9186ed072-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.455128 4756 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kolla-config\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.455138 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/881d2b4e-3f2e-4aca-a965-2508edc92a3b-config-data-generated\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.466403 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "509573b5-33f4-432c-bb2e-1fc9186ed072" (UID: "509573b5-33f4-432c-bb2e-1fc9186ed072"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.467252 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.515226 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kube-api-access-x7sks" (OuterVolumeSpecName: "kube-api-access-x7sks") pod "881d2b4e-3f2e-4aca-a965-2508edc92a3b" (UID: "881d2b4e-3f2e-4aca-a965-2508edc92a3b"). InnerVolumeSpecName "kube-api-access-x7sks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.519931 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.521995 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509573b5-33f4-432c-bb2e-1fc9186ed072-kube-api-access-sksn6" (OuterVolumeSpecName: "kube-api-access-sksn6") pod "509573b5-33f4-432c-bb2e-1fc9186ed072" (UID: "509573b5-33f4-432c-bb2e-1fc9186ed072"). InnerVolumeSpecName "kube-api-access-sksn6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.534951 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-scripts" (OuterVolumeSpecName: "scripts") pod "509573b5-33f4-432c-bb2e-1fc9186ed072" (UID: "509573b5-33f4-432c-bb2e-1fc9186ed072"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.557068 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7sks\" (UniqueName: \"kubernetes.io/projected/881d2b4e-3f2e-4aca-a965-2508edc92a3b-kube-api-access-x7sks\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.557113 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sksn6\" (UniqueName: \"kubernetes.io/projected/509573b5-33f4-432c-bb2e-1fc9186ed072-kube-api-access-sksn6\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.557125 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.557136 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.560182 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "881d2b4e-3f2e-4aca-a965-2508edc92a3b" (UID: "881d2b4e-3f2e-4aca-a965-2508edc92a3b"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.592672 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:54830->10.217.0.203:8775: read: connection reset by peer"
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.592994 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:54822->10.217.0.203:8775: read: connection reset by peer"
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.663095 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.818285 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.836564 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "881d2b4e-3f2e-4aca-a965-2508edc92a3b" (UID: "881d2b4e-3f2e-4aca-a965-2508edc92a3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.840763 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0bc19-account-delete-xlr4r" event={"ID":"cc4b1e45-87c0-471f-beb9-70bce6933d47","Type":"ContainerStarted","Data":"5ce2c044a1e7f24fb0d011f65c18d5af43c0921735e6ef864f501fb53bd366b8"}
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.840912 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell0bc19-account-delete-xlr4r" podUID="cc4b1e45-87c0-471f-beb9-70bce6933d47" containerName="mariadb-account-delete" containerID="cri-o://5ce2c044a1e7f24fb0d011f65c18d5af43c0921735e6ef864f501fb53bd366b8" gracePeriod=30
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.849940 4756 generic.go:334] "Generic (PLEG): container finished" podID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerID="6299d63206c3fbc25b59d632cb9c2f9bf041eee3b3e097b905b6d1db62b66c82" exitCode=0
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.849969 4756 generic.go:334] "Generic (PLEG): container finished" podID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerID="e74f52a59e270b006a1fc5714de7ee321f0ef68eebd6a6ed247cd71177dbdaa8" exitCode=2
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.850040 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerDied","Data":"6299d63206c3fbc25b59d632cb9c2f9bf041eee3b3e097b905b6d1db62b66c82"}
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.850088 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerDied","Data":"e74f52a59e270b006a1fc5714de7ee321f0ef68eebd6a6ed247cd71177dbdaa8"}
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.864310 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell0bc19-account-delete-xlr4r" podStartSLOduration=6.864291373 podStartE2EDuration="6.864291373s" podCreationTimestamp="2025-12-05 16:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:07.855042251 +0000 UTC m=+1528.093715773" watchObservedRunningTime="2025-12-05 16:41:07.864291373 +0000 UTC m=+1528.102964885"
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.867762 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5c6b-account-delete-nhd8d" event={"ID":"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e","Type":"ContainerStarted","Data":"7d4e4c4996c6095e55a9528af5c2a29da4aefbcb8aa7c785ffbdee54592a58ca"}
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.867999 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement5c6b-account-delete-nhd8d" podUID="3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e" containerName="mariadb-account-delete" containerID="cri-o://7d4e4c4996c6095e55a9528af5c2a29da4aefbcb8aa7c785ffbdee54592a58ca" gracePeriod=30
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.876231 4756 generic.go:334] "Generic (PLEG): container finished" podID="f0fc7cfb-9160-471d-8aec-457769d46602" containerID="b4e4e8a043bf4f88e1edd5b6dd41710a7149c49c31bb40ac6714307c37009357" exitCode=0
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.876444 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9d6468757-b9ttg" event={"ID":"f0fc7cfb-9160-471d-8aec-457769d46602","Type":"ContainerDied","Data":"b4e4e8a043bf4f88e1edd5b6dd41710a7149c49c31bb40ac6714307c37009357"}
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.899665 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.899703 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.971147 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4beba48e-5ac4-4d41-864b-ecfd8b039988","Type":"ContainerDied","Data":"297838fda664c6d36050dcab620e44ecebc97fe76faa2c79149e53d7073dd6ea"}
Dec 05 16:41:07 crc kubenswrapper[4756]: I1205 16:41:07.971184 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="297838fda664c6d36050dcab620e44ecebc97fe76faa2c79149e53d7073dd6ea"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.007771 4756 generic.go:334] "Generic (PLEG): container finished" podID="797faab6-a6b3-48ae-90ca-09d251b2975f" containerID="9dcca87cd81ebc30ed3acfa39592365fba84c1006feaa6a068c8c8e226c1806a" exitCode=2
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.007848 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"797faab6-a6b3-48ae-90ca-09d251b2975f","Type":"ContainerDied","Data":"9dcca87cd81ebc30ed3acfa39592365fba84c1006feaa6a068c8c8e226c1806a"}
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.040311 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi58bc-account-delete-cdg92" podUID="2586474d-e368-428a-a86b-1ec95b8006a7" containerName="mariadb-account-delete" containerID="cri-o://2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e" gracePeriod=30
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.044573 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "509573b5-33f4-432c-bb2e-1fc9186ed072" (UID: "509573b5-33f4-432c-bb2e-1fc9186ed072"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.061184 4756 generic.go:334] "Generic (PLEG): container finished" podID="84d62427-bd91-4330-be20-152bc3508447" containerID="a005af597ef26ea0b48f5743eb681caeba6662ee90634b36c5befe70f580e2a0" exitCode=0
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.071521 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi58bc-account-delete-cdg92" podStartSLOduration=7.071502083 podStartE2EDuration="7.071502083s" podCreationTimestamp="2025-12-05 16:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 16:41:08.067171365 +0000 UTC m=+1528.305844897" watchObservedRunningTime="2025-12-05 16:41:08.071502083 +0000 UTC m=+1528.310175595"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.119953 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "881d2b4e-3f2e-4aca-a965-2508edc92a3b" (UID: "881d2b4e-3f2e-4aca-a965-2508edc92a3b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.149552 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data" (OuterVolumeSpecName: "config-data") pod "509573b5-33f4-432c-bb2e-1fc9186ed072" (UID: "509573b5-33f4-432c-bb2e-1fc9186ed072"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.157311 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d04ce23-86cb-48ed-9d9b-59bb3b650da2" path="/var/lib/kubelet/pods/0d04ce23-86cb-48ed-9d9b-59bb3b650da2/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.158456 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0de45488-2c4b-408e-810b-896d2db706d8" path="/var/lib/kubelet/pods/0de45488-2c4b-408e-810b-896d2db706d8/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.158984 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e16ba1-ba6e-42bd-a187-a22710026feb" path="/var/lib/kubelet/pods/15e16ba1-ba6e-42bd-a187-a22710026feb/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.159859 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1702bfa9-c7bd-4e9b-996a-dbb838d454ac" path="/var/lib/kubelet/pods/1702bfa9-c7bd-4e9b-996a-dbb838d454ac/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.160439 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f517228-5007-4e3c-a41b-1e4b636446c0" path="/var/lib/kubelet/pods/1f517228-5007-4e3c-a41b-1e4b636446c0/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.163143 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e502c2-2f9c-4390-b3ea-ed099cd2111b" path="/var/lib/kubelet/pods/45e502c2-2f9c-4390-b3ea-ed099cd2111b/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.163710 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541f7de0-3773-4e28-9881-b552367f1c89" path="/var/lib/kubelet/pods/541f7de0-3773-4e28-9881-b552367f1c89/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.164284 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6071e116-5a36-4d4f-800a-a2b52ef342c8" path="/var/lib/kubelet/pods/6071e116-5a36-4d4f-800a-a2b52ef342c8/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.165328 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.165347 4756 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/881d2b4e-3f2e-4aca-a965-2508edc92a3b-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.165356 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509573b5-33f4-432c-bb2e-1fc9186ed072-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.165401 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e154b5-e680-4c49-b449-62b100aac6af" path="/var/lib/kubelet/pods/67e154b5-e680-4c49-b449-62b100aac6af/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.165926 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3091dc-ae5e-4f60-9e11-a07f76cff76f" path="/var/lib/kubelet/pods/7a3091dc-ae5e-4f60-9e11-a07f76cff76f/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.166405 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986034bd-a776-4242-b772-b0db284f6d4f" path="/var/lib/kubelet/pods/986034bd-a776-4242-b772-b0db284f6d4f/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.167405 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e884c151-d5ce-44a5-819d-9ee0c4328cbe" path="/var/lib/kubelet/pods/e884c151-d5ce-44a5-819d-9ee0c4328cbe/volumes"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.168303 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f664c2b4-e517-427a-bbdc-8fba6d548a21","Type":"ContainerDied","Data":"67d89d49612dc132ef74f1a1238e776268aa657fe7cd1d9ef4982b04addfdd0a"}
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.168330 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d89d49612dc132ef74f1a1238e776268aa657fe7cd1d9ef4982b04addfdd0a"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.168342 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668cbcb89b-dvw88" event={"ID":"84d62427-bd91-4330-be20-152bc3508447","Type":"ContainerDied","Data":"a005af597ef26ea0b48f5743eb681caeba6662ee90634b36c5befe70f580e2a0"}
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.168354 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9df5252d-16cf-4612-910c-8f37d10b48a7","Type":"ContainerDied","Data":"ef0f39ff2baef02e17cf770a2dbd86eae23197baa277532416645111fc401104"}
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.168367 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef0f39ff2baef02e17cf770a2dbd86eae23197baa277532416645111fc401104"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.179717 4756 generic.go:334] "Generic (PLEG): container finished" podID="735bacc2-aeb8-4f2f-8f1a-bf3d092fd163" containerID="b462dd63e42cfa5cb78440495d7faec00b0f9c211eb3ed5ddcb9403c412b08b7" exitCode=0
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.179826 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance1a2d-account-delete-vgqdz" event={"ID":"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163","Type":"ContainerDied","Data":"b462dd63e42cfa5cb78440495d7faec00b0f9c211eb3ed5ddcb9403c412b08b7"}
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.236715 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"509573b5-33f4-432c-bb2e-1fc9186ed072","Type":"ContainerDied","Data":"922472c149da7ca1519db8bd8b314ce74a0247f88ad115a5512bb0e4a2f0e9c5"}
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.236816 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.260326 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.260607 4756 generic.go:334] "Generic (PLEG): container finished" podID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerID="f08e0d462ffd45dfa7950e9164ec47234418bfc47d5218d2a85f0732c1d92391" exitCode=0
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.260664 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.260716 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" event={"ID":"13fa1e9e-f893-4641-a55f-d0c209d43917","Type":"ContainerDied","Data":"f08e0d462ffd45dfa7950e9164ec47234418bfc47d5218d2a85f0732c1d92391"}
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.267544 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.267754 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts podName:735bacc2-aeb8-4f2f-8f1a-bf3d092fd163 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:10.267727833 +0000 UTC m=+1530.506401405 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts") pod "glance1a2d-account-delete-vgqdz" (UID: "735bacc2-aeb8-4f2f-8f1a-bf3d092fd163") : configmap "openstack-scripts" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.272590 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.273066 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.273129 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server"
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.273300 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.278850 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.278909 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.296703 4756 generic.go:334] "Generic (PLEG): container finished" podID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerID="fe8069b814e870c4e578733a24b86485d43263b291fd3dcc6e53ead33749cc50" exitCode=0
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.296797 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76057872-026a-4ecb-b628-75f7f6fd46e9","Type":"ContainerDied","Data":"fe8069b814e870c4e578733a24b86485d43263b291fd3dcc6e53ead33749cc50"}
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.308548 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444 is running failed: container process not found" containerID="7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.313548 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444 is running failed: container process not found" containerID="7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.313806 4756 generic.go:334] "Generic (PLEG): container finished" podID="dc0118af-131e-409a-a18b-0be03f3ce81b" containerID="14a9c94a34d46580da78a252edc6fc101e1053fd75e54da245fe2728758527ee" exitCode=0
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.313856 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dc0118af-131e-409a-a18b-0be03f3ce81b","Type":"ContainerDied","Data":"14a9c94a34d46580da78a252edc6fc101e1053fd75e54da245fe2728758527ee"}
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.320593 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444 is running failed: container process not found" containerID="7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.320743 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="187070cf-06f5-4c25-851f-2e1f294c3b7c" containerName="nova-cell0-conductor-conductor"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.333936 4756 generic.go:334] "Generic (PLEG): container finished" podID="897f63f9-bf8d-438b-b35a-4195963786e3" containerID="04391e895ac7c022c16d69dfd03bf0f8cab399a37640b4aebf07a28dfbf58d41" exitCode=0
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.334006 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"897f63f9-bf8d-438b-b35a-4195963786e3","Type":"ContainerDied","Data":"04391e895ac7c022c16d69dfd03bf0f8cab399a37640b4aebf07a28dfbf58d41"}
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.349238 4756 generic.go:334] "Generic (PLEG): container finished" podID="2606acc0-f425-4595-8f5a-df1f009175ee" containerID="bfa23ea411e7c12b52777bbf724439990757136fd7d85b535a806320b41f695b" exitCode=0
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.349432 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54c6d9b488-h9hmj" event={"ID":"2606acc0-f425-4595-8f5a-df1f009175ee","Type":"ContainerDied","Data":"bfa23ea411e7c12b52777bbf724439990757136fd7d85b535a806320b41f695b"}
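[editor's note] The ExecSync/"Probe errored" pairs above are readiness probes racing container shutdown: the kubelet still execs the probe command (pgrep -r DRST for the nova services, the OVS readiness scripts for ovn-controller-ovs) inside containers CRI-O is already stopping or has reaped, so the exec is rejected and the probe errors instead of returning a clean failure. For reference, an exec readiness probe of this shape as declared with the Go API types; the command is taken from the log, while the period is an assumption the log does not show:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // Exec readiness probe: the kubelet runs this command inside the
        // container and treats exit code 0 as "ready".
        probe := corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                Exec: &corev1.ExecAction{
                    Command: []string{"/usr/bin/pgrep", "-r", "DRST", "nova-scheduler"},
                },
            },
            PeriodSeconds: 10, // assumption; not visible in the log
        }
        fmt.Println(probe.Exec.Command)
    }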
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.349935 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.350007 4756 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbicanc3ab-account-delete-ndr77" secret="" err="secret \"galera-openstack-dockercfg-br626\" not found"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.350556 4756 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutronbc5a-account-delete-wtsjj" secret="" err="secret \"galera-openstack-dockercfg-br626\" not found"
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.368777 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.368949 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data podName:331486fe-c403-4516-9568-862200ec0672 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:16.368924763 +0000 UTC m=+1536.607598275 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data") pod "rabbitmq-cell1-server-0" (UID: "331486fe-c403-4516-9568-862200ec0672") : configmap "rabbitmq-cell1-config-data" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.369020 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.369084 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts podName:37e233df-e9ab-45c9-a335-f1182a3b8c79 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:10.369065027 +0000 UTC m=+1530.607738619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts") pod "neutronbc5a-account-delete-wtsjj" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79") : configmap "openstack-scripts" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.368973 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.369223 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.369317 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts podName:9639f2b9-2b61-4ab9-ab94-6c54d373662f nodeName:}" failed. No retries permitted until 2025-12-05 16:41:10.369306163 +0000 UTC m=+1530.607979745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts") pod "barbicanc3ab-account-delete-ndr77" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f") : configmap "openstack-scripts" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: E1205 16:41:08.370592 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts podName:3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e nodeName:}" failed. No retries permitted until 2025-12-05 16:41:10.370554497 +0000 UTC m=+1530.609228009 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts") pod "placement5c6b-account-delete-nhd8d" (UID: "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e") : configmap "openstack-scripts" not found
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.411812 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="55609ecb-b28c-488e-b80f-9add6d256016" containerName="galera" containerID="cri-o://158cecb488ef422eac05d2c4948de5a6dfc1b4ec80df672c0fc11d44da5db969" gracePeriod=29
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.479359 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.497879 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.515207 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.525849 4756 scope.go:117] "RemoveContainer" containerID="a4403245fba51bc317b8d7897b375b1d195288a8e13b1478f5ed78542abf0293"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.530063 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.531478 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54c6d9b488-h9hmj"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.531559 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-9d6468757-b9ttg"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.531752 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.535233 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.535990 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.540013 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-668cbcb89b-dvw88"
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.542164 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581006 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data-custom\") pod \"f0fc7cfb-9160-471d-8aec-457769d46602\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581057 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-combined-ca-bundle\") pod \"84d62427-bd91-4330-be20-152bc3508447\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581081 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67sp5\" (UniqueName: \"kubernetes.io/projected/f0fc7cfb-9160-471d-8aec-457769d46602-kube-api-access-67sp5\") pod \"f0fc7cfb-9160-471d-8aec-457769d46602\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581104 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76sjv\" (UniqueName: \"kubernetes.io/projected/f664c2b4-e517-427a-bbdc-8fba6d548a21-kube-api-access-76sjv\") pod \"f664c2b4-e517-427a-bbdc-8fba6d548a21\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581130 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84d62427-bd91-4330-be20-152bc3508447-logs\") pod \"84d62427-bd91-4330-be20-152bc3508447\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581151 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9df5252d-16cf-4612-910c-8f37d10b48a7-etc-machine-id\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581178 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-scripts\") pod \"4beba48e-5ac4-4d41-864b-ecfd8b039988\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581205 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-internal-tls-certs\") pod \"2606acc0-f425-4595-8f5a-df1f009175ee\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581228 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxhq9\" (UniqueName: \"kubernetes.io/projected/2606acc0-f425-4595-8f5a-df1f009175ee-kube-api-access-vxhq9\") pod \"2606acc0-f425-4595-8f5a-df1f009175ee\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581257 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-scripts\") pod \"f664c2b4-e517-427a-bbdc-8fba6d548a21\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581273 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2606acc0-f425-4595-8f5a-df1f009175ee-logs\") pod \"2606acc0-f425-4595-8f5a-df1f009175ee\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581291 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-combined-ca-bundle\") pod \"f0fc7cfb-9160-471d-8aec-457769d46602\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581308 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-config-data\") pod \"f664c2b4-e517-427a-bbdc-8fba6d548a21\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581333 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrfjj\" (UniqueName: \"kubernetes.io/projected/84d62427-bd91-4330-be20-152bc3508447-kube-api-access-wrfjj\") pod \"84d62427-bd91-4330-be20-152bc3508447\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581371 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-config-data\") pod \"84d62427-bd91-4330-be20-152bc3508447\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581396 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"4beba48e-5ac4-4d41-864b-ecfd8b039988\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581446 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-config-data\") pod \"4beba48e-5ac4-4d41-864b-ecfd8b039988\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581465 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-combined-ca-bundle\") pod \"2606acc0-f425-4595-8f5a-df1f009175ee\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581483 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-httpd-run\") pod \"4beba48e-5ac4-4d41-864b-ecfd8b039988\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") "
Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581503 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-combined-ca-bundle\") pod \"f664c2b4-e517-427a-bbdc-8fba6d548a21\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581531 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-combined-ca-bundle\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581559 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-logs\") pod \"f664c2b4-e517-427a-bbdc-8fba6d548a21\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581577 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-internal-tls-certs\") pod \"f664c2b4-e517-427a-bbdc-8fba6d548a21\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581599 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-public-tls-certs\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581616 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-combined-ca-bundle\") pod \"4beba48e-5ac4-4d41-864b-ecfd8b039988\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581645 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data\") pod \"2606acc0-f425-4595-8f5a-df1f009175ee\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581671 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data-custom\") pod \"2606acc0-f425-4595-8f5a-df1f009175ee\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581689 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-certs\") pod \"797faab6-a6b3-48ae-90ca-09d251b2975f\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581705 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncmqs\" (UniqueName: \"kubernetes.io/projected/9df5252d-16cf-4612-910c-8f37d10b48a7-kube-api-access-ncmqs\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581725 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f0fc7cfb-9160-471d-8aec-457769d46602-logs\") pod \"f0fc7cfb-9160-471d-8aec-457769d46602\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581740 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-logs\") pod \"4beba48e-5ac4-4d41-864b-ecfd8b039988\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581771 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-scripts\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581792 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data-custom\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581808 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-internal-tls-certs\") pod \"84d62427-bd91-4330-be20-152bc3508447\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581840 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxphb\" (UniqueName: \"kubernetes.io/projected/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-api-access-sxphb\") pod \"797faab6-a6b3-48ae-90ca-09d251b2975f\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581861 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-public-tls-certs\") pod \"2606acc0-f425-4595-8f5a-df1f009175ee\" (UID: \"2606acc0-f425-4595-8f5a-df1f009175ee\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581875 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-public-tls-certs\") pod \"4beba48e-5ac4-4d41-864b-ecfd8b039988\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581890 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"f664c2b4-e517-427a-bbdc-8fba6d548a21\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581906 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-public-tls-certs\") pod \"84d62427-bd91-4330-be20-152bc3508447\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581935 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-combined-ca-bundle\") pod \"797faab6-a6b3-48ae-90ca-09d251b2975f\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581959 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data\") pod \"f0fc7cfb-9160-471d-8aec-457769d46602\" (UID: \"f0fc7cfb-9160-471d-8aec-457769d46602\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.581979 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-config\") pod \"797faab6-a6b3-48ae-90ca-09d251b2975f\" (UID: \"797faab6-a6b3-48ae-90ca-09d251b2975f\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.582012 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-internal-tls-certs\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.582036 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.582058 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-httpd-run\") pod \"f664c2b4-e517-427a-bbdc-8fba6d548a21\" (UID: \"f664c2b4-e517-427a-bbdc-8fba6d548a21\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.582077 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-scripts\") pod \"84d62427-bd91-4330-be20-152bc3508447\" (UID: \"84d62427-bd91-4330-be20-152bc3508447\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.582092 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt77c\" (UniqueName: \"kubernetes.io/projected/4beba48e-5ac4-4d41-864b-ecfd8b039988-kube-api-access-tt77c\") pod \"4beba48e-5ac4-4d41-864b-ecfd8b039988\" (UID: \"4beba48e-5ac4-4d41-864b-ecfd8b039988\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.582110 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9df5252d-16cf-4612-910c-8f37d10b48a7-logs\") pod \"9df5252d-16cf-4612-910c-8f37d10b48a7\" (UID: \"9df5252d-16cf-4612-910c-8f37d10b48a7\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.583815 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9df5252d-16cf-4612-910c-8f37d10b48a7-logs" (OuterVolumeSpecName: "logs") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.592119 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-logs" (OuterVolumeSpecName: "logs") pod "4beba48e-5ac4-4d41-864b-ecfd8b039988" (UID: "4beba48e-5ac4-4d41-864b-ecfd8b039988"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.594454 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-logs" (OuterVolumeSpecName: "logs") pod "f664c2b4-e517-427a-bbdc-8fba6d548a21" (UID: "f664c2b4-e517-427a-bbdc-8fba6d548a21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.599992 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4beba48e-5ac4-4d41-864b-ecfd8b039988" (UID: "4beba48e-5ac4-4d41-864b-ecfd8b039988"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.600372 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0fc7cfb-9160-471d-8aec-457769d46602-logs" (OuterVolumeSpecName: "logs") pod "f0fc7cfb-9160-471d-8aec-457769d46602" (UID: "f0fc7cfb-9160-471d-8aec-457769d46602"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.610954 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2606acc0-f425-4595-8f5a-df1f009175ee-logs" (OuterVolumeSpecName: "logs") pod "2606acc0-f425-4595-8f5a-df1f009175ee" (UID: "2606acc0-f425-4595-8f5a-df1f009175ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.614837 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f664c2b4-e517-427a-bbdc-8fba6d548a21" (UID: "f664c2b4-e517-427a-bbdc-8fba6d548a21"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.618569 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2606acc0-f425-4595-8f5a-df1f009175ee" (UID: "2606acc0-f425-4595-8f5a-df1f009175ee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.620559 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df5252d-16cf-4612-910c-8f37d10b48a7-kube-api-access-ncmqs" (OuterVolumeSpecName: "kube-api-access-ncmqs") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "kube-api-access-ncmqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.627595 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "f664c2b4-e517-427a-bbdc-8fba6d548a21" (UID: "f664c2b4-e517-427a-bbdc-8fba6d548a21"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.627837 4756 scope.go:117] "RemoveContainer" containerID="1c001a9c870ca4bb31c57a0eaad2bee3efc8b47aef90bc4502a5337603659d8b" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.628342 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-scripts" (OuterVolumeSpecName: "scripts") pod "84d62427-bd91-4330-be20-152bc3508447" (UID: "84d62427-bd91-4330-be20-152bc3508447"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.635176 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d62427-bd91-4330-be20-152bc3508447-logs" (OuterVolumeSpecName: "logs") pod "84d62427-bd91-4330-be20-152bc3508447" (UID: "84d62427-bd91-4330-be20-152bc3508447"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.635281 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9df5252d-16cf-4612-910c-8f37d10b48a7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.672469 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-scripts" (OuterVolumeSpecName: "scripts") pod "4beba48e-5ac4-4d41-864b-ecfd8b039988" (UID: "4beba48e-5ac4-4d41-864b-ecfd8b039988"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.674216 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2606acc0-f425-4595-8f5a-df1f009175ee-kube-api-access-vxhq9" (OuterVolumeSpecName: "kube-api-access-vxhq9") pod "2606acc0-f425-4595-8f5a-df1f009175ee" (UID: "2606acc0-f425-4595-8f5a-df1f009175ee"). InnerVolumeSpecName "kube-api-access-vxhq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.677203 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4beba48e-5ac4-4d41-864b-ecfd8b039988-kube-api-access-tt77c" (OuterVolumeSpecName: "kube-api-access-tt77c") pod "4beba48e-5ac4-4d41-864b-ecfd8b039988" (UID: "4beba48e-5ac4-4d41-864b-ecfd8b039988"). InnerVolumeSpecName "kube-api-access-tt77c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.705039 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f664c2b4-e517-427a-bbdc-8fba6d548a21-kube-api-access-76sjv" (OuterVolumeSpecName: "kube-api-access-76sjv") pod "f664c2b4-e517-427a-bbdc-8fba6d548a21" (UID: "f664c2b4-e517-427a-bbdc-8fba6d548a21"). InnerVolumeSpecName "kube-api-access-76sjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.705102 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="11d0ad48-c987-4415-8536-b150536534a5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.727267 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76sjv\" (UniqueName: \"kubernetes.io/projected/f664c2b4-e517-427a-bbdc-8fba6d548a21-kube-api-access-76sjv\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.727685 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84d62427-bd91-4330-be20-152bc3508447-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.727909 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9df5252d-16cf-4612-910c-8f37d10b48a7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.727977 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728000 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxhq9\" (UniqueName: \"kubernetes.io/projected/2606acc0-f425-4595-8f5a-df1f009175ee-kube-api-access-vxhq9\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728041 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2606acc0-f425-4595-8f5a-df1f009175ee-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728055 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728076 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728091 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728132 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncmqs\" (UniqueName: \"kubernetes.io/projected/9df5252d-16cf-4612-910c-8f37d10b48a7-kube-api-access-ncmqs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728145 4756 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0fc7cfb-9160-471d-8aec-457769d46602-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728163 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4beba48e-5ac4-4d41-864b-ecfd8b039988-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728219 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728241 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f664c2b4-e517-427a-bbdc-8fba6d548a21-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728278 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt77c\" (UniqueName: \"kubernetes.io/projected/4beba48e-5ac4-4d41-864b-ecfd8b039988-kube-api-access-tt77c\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728295 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.728310 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9df5252d-16cf-4612-910c-8f37d10b48a7-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.729399 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-scripts" (OuterVolumeSpecName: "scripts") pod "f664c2b4-e517-427a-bbdc-8fba6d548a21" (UID: "f664c2b4-e517-427a-bbdc-8fba6d548a21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.734577 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d62427-bd91-4330-be20-152bc3508447-kube-api-access-wrfjj" (OuterVolumeSpecName: "kube-api-access-wrfjj") pod "84d62427-bd91-4330-be20-152bc3508447" (UID: "84d62427-bd91-4330-be20-152bc3508447"). InnerVolumeSpecName "kube-api-access-wrfjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.734674 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-api-access-sxphb" (OuterVolumeSpecName: "kube-api-access-sxphb") pod "797faab6-a6b3-48ae-90ca-09d251b2975f" (UID: "797faab6-a6b3-48ae-90ca-09d251b2975f"). InnerVolumeSpecName "kube-api-access-sxphb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.749838 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fc7cfb-9160-471d-8aec-457769d46602-kube-api-access-67sp5" (OuterVolumeSpecName: "kube-api-access-67sp5") pod "f0fc7cfb-9160-471d-8aec-457769d46602" (UID: "f0fc7cfb-9160-471d-8aec-457769d46602"). InnerVolumeSpecName "kube-api-access-67sp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.756739 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f0fc7cfb-9160-471d-8aec-457769d46602" (UID: "f0fc7cfb-9160-471d-8aec-457769d46602"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.756852 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.767296 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "4beba48e-5ac4-4d41-864b-ecfd8b039988" (UID: "4beba48e-5ac4-4d41-864b-ecfd8b039988"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.768126 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-scripts" (OuterVolumeSpecName: "scripts") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.829668 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.829726 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrfjj\" (UniqueName: \"kubernetes.io/projected/84d62427-bd91-4330-be20-152bc3508447-kube-api-access-wrfjj\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.829752 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.829763 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.829772 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.829781 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxphb\" (UniqueName: \"kubernetes.io/projected/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-api-access-sxphb\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.829790 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.829799 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67sp5\" (UniqueName: \"kubernetes.io/projected/f0fc7cfb-9160-471d-8aec-457769d46602-kube-api-access-67sp5\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.848179 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.903067 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data" (OuterVolumeSpecName: "config-data") pod "2606acc0-f425-4595-8f5a-df1f009175ee" (UID: "2606acc0-f425-4595-8f5a-df1f009175ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.913591 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.918223 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.921203 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4beba48e-5ac4-4d41-864b-ecfd8b039988" (UID: "4beba48e-5ac4-4d41-864b-ecfd8b039988"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.930753 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtmpg\" (UniqueName: \"kubernetes.io/projected/13fa1e9e-f893-4641-a55f-d0c209d43917-kube-api-access-wtmpg\") pod \"13fa1e9e-f893-4641-a55f-d0c209d43917\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.930850 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data-custom\") pod \"13fa1e9e-f893-4641-a55f-d0c209d43917\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.930879 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data\") pod \"13fa1e9e-f893-4641-a55f-d0c209d43917\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.930980 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13fa1e9e-f893-4641-a55f-d0c209d43917-logs\") pod \"13fa1e9e-f893-4641-a55f-d0c209d43917\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.931118 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-combined-ca-bundle\") pod \"13fa1e9e-f893-4641-a55f-d0c209d43917\" (UID: \"13fa1e9e-f893-4641-a55f-d0c209d43917\") " Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.931527 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.931546 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.931555 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.931566 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.931609 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0fc7cfb-9160-471d-8aec-457769d46602" (UID: "f0fc7cfb-9160-471d-8aec-457769d46602"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.932348 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13fa1e9e-f893-4641-a55f-d0c209d43917-logs" (OuterVolumeSpecName: "logs") pod "13fa1e9e-f893-4641-a55f-d0c209d43917" (UID: "13fa1e9e-f893-4641-a55f-d0c209d43917"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.944745 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2606acc0-f425-4595-8f5a-df1f009175ee" (UID: "2606acc0-f425-4595-8f5a-df1f009175ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.945322 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13fa1e9e-f893-4641-a55f-d0c209d43917" (UID: "13fa1e9e-f893-4641-a55f-d0c209d43917"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.945998 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f664c2b4-e517-427a-bbdc-8fba6d548a21" (UID: "f664c2b4-e517-427a-bbdc-8fba6d548a21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.946879 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13fa1e9e-f893-4641-a55f-d0c209d43917-kube-api-access-wtmpg" (OuterVolumeSpecName: "kube-api-access-wtmpg") pod "13fa1e9e-f893-4641-a55f-d0c209d43917" (UID: "13fa1e9e-f893-4641-a55f-d0c209d43917"). InnerVolumeSpecName "kube-api-access-wtmpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.973151 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data" (OuterVolumeSpecName: "config-data") pod "f0fc7cfb-9160-471d-8aec-457769d46602" (UID: "f0fc7cfb-9160-471d-8aec-457769d46602"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:08 crc kubenswrapper[4756]: I1205 16:41:08.973110 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.033787 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.033825 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.033836 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13fa1e9e-f893-4641-a55f-d0c209d43917-logs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.033847 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.033856 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc7cfb-9160-471d-8aec-457769d46602-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.033865 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.033874 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtmpg\" (UniqueName: \"kubernetes.io/projected/13fa1e9e-f893-4641-a55f-d0c209d43917-kube-api-access-wtmpg\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.033883 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.044951 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-config-data" (OuterVolumeSpecName: "config-data") pod "84d62427-bd91-4330-be20-152bc3508447" (UID: "84d62427-bd91-4330-be20-152bc3508447"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.049173 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f664c2b4-e517-427a-bbdc-8fba6d548a21" (UID: "f664c2b4-e517-427a-bbdc-8fba6d548a21"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.136588 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.136620 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: E1205 16:41:09.137271 4756 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 05 16:41:09 crc kubenswrapper[4756]: E1205 16:41:09.137318 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data podName:11d0ad48-c987-4415-8536-b150536534a5 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:17.137303935 +0000 UTC m=+1537.375977447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data") pod "rabbitmq-server-0" (UID: "11d0ad48-c987-4415-8536-b150536534a5") : configmap "rabbitmq-config-data" not found Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.146849 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.172715 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4beba48e-5ac4-4d41-864b-ecfd8b039988" (UID: "4beba48e-5ac4-4d41-864b-ecfd8b039988"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.209362 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "797faab6-a6b3-48ae-90ca-09d251b2975f" (UID: "797faab6-a6b3-48ae-90ca-09d251b2975f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.210811 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.239356 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.239394 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.239404 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.239469 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.256402 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "797faab6-a6b3-48ae-90ca-09d251b2975f" (UID: "797faab6-a6b3-48ae-90ca-09d251b2975f"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.326102 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2606acc0-f425-4595-8f5a-df1f009175ee" (UID: "2606acc0-f425-4595-8f5a-df1f009175ee"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.340724 4756 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.340757 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.345142 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13fa1e9e-f893-4641-a55f-d0c209d43917" (UID: "13fa1e9e-f893-4641-a55f-d0c209d43917"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.369715 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-config-data" (OuterVolumeSpecName: "config-data") pod "4beba48e-5ac4-4d41-864b-ecfd8b039988" (UID: "4beba48e-5ac4-4d41-864b-ecfd8b039988"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.379042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54c6d9b488-h9hmj" event={"ID":"2606acc0-f425-4595-8f5a-df1f009175ee","Type":"ContainerDied","Data":"9bf353f74a5196bea33d9de00268bbf27578984ef2ad72a57669263aca8c5dbb"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.379101 4756 scope.go:117] "RemoveContainer" containerID="bfa23ea411e7c12b52777bbf724439990757136fd7d85b535a806320b41f695b" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.379245 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54c6d9b488-h9hmj" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.392356 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-config-data" (OuterVolumeSpecName: "config-data") pod "f664c2b4-e517-427a-bbdc-8fba6d548a21" (UID: "f664c2b4-e517-427a-bbdc-8fba6d548a21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.393654 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi58bc-account-delete-cdg92" event={"ID":"2586474d-e368-428a-a86b-1ec95b8006a7","Type":"ContainerStarted","Data":"2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.397267 4756 generic.go:334] "Generic (PLEG): container finished" podID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerID="e3d6463e66e6a8de5357ab6e221c04b2cb882550348104db03b422d53b2565e5" exitCode=0 Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.397327 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerDied","Data":"e3d6463e66e6a8de5357ab6e221c04b2cb882550348104db03b422d53b2565e5"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.399729 4756 generic.go:334] "Generic (PLEG): container finished" podID="032af055-c0a6-4fda-90c8-c34ef9278613" containerID="0d4580a5393a54bbc3f4d1d132fb4626624aacc2b800fb18a8d15e491de91652" exitCode=0 Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.399809 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"032af055-c0a6-4fda-90c8-c34ef9278613","Type":"ContainerDied","Data":"0d4580a5393a54bbc3f4d1d132fb4626624aacc2b800fb18a8d15e491de91652"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.399880 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"032af055-c0a6-4fda-90c8-c34ef9278613","Type":"ContainerDied","Data":"a0382226d9555c95b26872032e39d2b01e30bfae3f59087df67631d329270a69"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.399900 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0382226d9555c95b26872032e39d2b01e30bfae3f59087df67631d329270a69" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.402790 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"897f63f9-bf8d-438b-b35a-4195963786e3","Type":"ContainerDied","Data":"e004ca7a7bc225259c07efb09913e11e606e4a57045d3e66620ff0267e37e848"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.402826 4756 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e004ca7a7bc225259c07efb09913e11e606e4a57045d3e66620ff0267e37e848" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.404488 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder2230-account-delete-jfjlh" event={"ID":"6e2e416b-60db-4b09-b585-59911e192daf","Type":"ContainerDied","Data":"eaf8147f089331777c3150465522528cfcb38ec25fe36ed9336bb4c5205ca2bc"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.404541 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf8147f089331777c3150465522528cfcb38ec25fe36ed9336bb4c5205ca2bc" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.407818 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76057872-026a-4ecb-b628-75f7f6fd46e9","Type":"ContainerDied","Data":"0f2f060c5d7b9d5be07fb6e2bfc8eb42ffc2bbf85eb55036f2e81e10cdb559b5"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.407855 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f2f060c5d7b9d5be07fb6e2bfc8eb42ffc2bbf85eb55036f2e81e10cdb559b5" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.409265 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data" (OuterVolumeSpecName: "config-data") pod "13fa1e9e-f893-4641-a55f-d0c209d43917" (UID: "13fa1e9e-f893-4641-a55f-d0c209d43917"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.423062 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2606acc0-f425-4595-8f5a-df1f009175ee" (UID: "2606acc0-f425-4595-8f5a-df1f009175ee"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.427522 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9d6468757-b9ttg" event={"ID":"f0fc7cfb-9160-471d-8aec-457769d46602","Type":"ContainerDied","Data":"db960e3dfedc47f213662aed65e9f25449ed394349e91d48c0df3cbb42c99bbd"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.427660 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-9d6468757-b9ttg" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.439479 4756 generic.go:334] "Generic (PLEG): container finished" podID="c9451d43-4e5f-429b-b364-9220afe19523" containerID="52f8fd30a92226261f0f4bd124f526c8082068adbdd24e6aad126c08d81c0f83" exitCode=0 Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.439544 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c9451d43-4e5f-429b-b364-9220afe19523","Type":"ContainerDied","Data":"52f8fd30a92226261f0f4bd124f526c8082068adbdd24e6aad126c08d81c0f83"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.439570 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c9451d43-4e5f-429b-b364-9220afe19523","Type":"ContainerDied","Data":"7906d615576514a0ae9f355934c433cf342a6577016c205c7bf710d3a0b10651"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.439581 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7906d615576514a0ae9f355934c433cf342a6577016c205c7bf710d3a0b10651" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.441357 4756 generic.go:334] "Generic (PLEG): container finished" podID="187070cf-06f5-4c25-851f-2e1f294c3b7c" containerID="7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444" exitCode=0 Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.441427 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"187070cf-06f5-4c25-851f-2e1f294c3b7c","Type":"ContainerDied","Data":"7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.441451 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"187070cf-06f5-4c25-851f-2e1f294c3b7c","Type":"ContainerDied","Data":"7dc8d8d57bd5e9e0828f87c61352839520b2055e234f1c49a1cea5e9ccf13e96"} Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.441498 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dc8d8d57bd5e9e0828f87c61352839520b2055e234f1c49a1cea5e9ccf13e96" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.442716 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.442741 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f664c2b4-e517-427a-bbdc-8fba6d548a21-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.442755 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4beba48e-5ac4-4d41-864b-ecfd8b039988-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.442767 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fa1e9e-f893-4641-a55f-d0c209d43917-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.442776 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2606acc0-f425-4595-8f5a-df1f009175ee-public-tls-certs\") on node \"crc\" DevicePath \"\"" 
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.442890 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.443152 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fd4b854cb-rm9k8" event={"ID":"13fa1e9e-f893-4641-a55f-d0c209d43917","Type":"ContainerDied","Data":"066946f6bad4e2e1416b640fa9541ea8a8e415d35ee0bb8d2783c6616335c822"}
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.445979 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"797faab6-a6b3-48ae-90ca-09d251b2975f","Type":"ContainerDied","Data":"4f8cb0a49c9c285b0fb9bb22d0db0aa157e24e040f9a11bccff79fe508d8ba21"}
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.446339 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.451377 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668cbcb89b-dvw88" event={"ID":"84d62427-bd91-4330-be20-152bc3508447","Type":"ContainerDied","Data":"731818aad98a6a2a6899b20d4dca2d9aed160a17a7cdc3867438296fbe19a6a6"}
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.451634 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-668cbcb89b-dvw88"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.451709 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data" (OuterVolumeSpecName: "config-data") pod "9df5252d-16cf-4612-910c-8f37d10b48a7" (UID: "9df5252d-16cf-4612-910c-8f37d10b48a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.453522 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.453648 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dc0118af-131e-409a-a18b-0be03f3ce81b","Type":"ContainerDied","Data":"faac77f71518e440dd1ffc3f6e6c38fa17bdd9b4d2b40ff8b14a1d554bd9d538"}
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.453674 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.453682 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faac77f71518e440dd1ffc3f6e6c38fa17bdd9b4d2b40ff8b14a1d554bd9d538"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.453630 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.454718 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "84d62427-bd91-4330-be20-152bc3508447" (UID: "84d62427-bd91-4330-be20-152bc3508447"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.462155 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84d62427-bd91-4330-be20-152bc3508447" (UID: "84d62427-bd91-4330-be20-152bc3508447"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.482863 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "797faab6-a6b3-48ae-90ca-09d251b2975f" (UID: "797faab6-a6b3-48ae-90ca-09d251b2975f"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.496691 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "84d62427-bd91-4330-be20-152bc3508447" (UID: "84d62427-bd91-4330-be20-152bc3508447"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.545794 4756 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/797faab6-a6b3-48ae-90ca-09d251b2975f-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.545847 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.545863 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.545875 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5252d-16cf-4612-910c-8f37d10b48a7-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.545887 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d62427-bd91-4330-be20-152bc3508447-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.580548 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.603703 4756 scope.go:117] "RemoveContainer" containerID="25d37ab5592969027f78161ad8c13da0276bef9030f7a390f77c1e2d2d113f8c"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.604007 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.628756 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.639665 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.644229 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.648681 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.656097 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.657331 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.667454 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.669040 4756 scope.go:117] "RemoveContainer" containerID="b4e4e8a043bf4f88e1edd5b6dd41710a7149c49c31bb40ac6714307c37009357"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.683115 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.695800 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder2230-account-delete-jfjlh"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.704598 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6fd4b854cb-rm9k8"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.713841 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6fd4b854cb-rm9k8"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.721944 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-9d6468757-b9ttg"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.729816 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-9d6468757-b9ttg"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.735984 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.745036 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.748708 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-combined-ca-bundle\") pod \"897f63f9-bf8d-438b-b35a-4195963786e3\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.749119 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgbg6\" (UniqueName: \"kubernetes.io/projected/897f63f9-bf8d-438b-b35a-4195963786e3-kube-api-access-fgbg6\") pod \"897f63f9-bf8d-438b-b35a-4195963786e3\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750169 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-memcached-tls-certs\") pod \"032af055-c0a6-4fda-90c8-c34ef9278613\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750195 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crs8q\" (UniqueName: \"kubernetes.io/projected/c9451d43-4e5f-429b-b364-9220afe19523-kube-api-access-crs8q\") pod \"c9451d43-4e5f-429b-b364-9220afe19523\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750557 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv5ww\" (UniqueName: \"kubernetes.io/projected/76057872-026a-4ecb-b628-75f7f6fd46e9-kube-api-access-rv5ww\") pod \"76057872-026a-4ecb-b628-75f7f6fd46e9\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750588 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-config-data\") pod \"76057872-026a-4ecb-b628-75f7f6fd46e9\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750624 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-combined-ca-bundle\") pod \"76057872-026a-4ecb-b628-75f7f6fd46e9\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750671 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjpn5\" (UniqueName: \"kubernetes.io/projected/187070cf-06f5-4c25-851f-2e1f294c3b7c-kube-api-access-jjpn5\") pod \"187070cf-06f5-4c25-851f-2e1f294c3b7c\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750699 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-combined-ca-bundle\") pod \"187070cf-06f5-4c25-851f-2e1f294c3b7c\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750721 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/897f63f9-bf8d-438b-b35a-4195963786e3-logs\") pod \"897f63f9-bf8d-438b-b35a-4195963786e3\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750742 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-config-data\") pod \"032af055-c0a6-4fda-90c8-c34ef9278613\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750763 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-config-data\") pod \"dc0118af-131e-409a-a18b-0be03f3ce81b\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.750786 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-kolla-config\") pod \"032af055-c0a6-4fda-90c8-c34ef9278613\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751571 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-config-data\") pod \"897f63f9-bf8d-438b-b35a-4195963786e3\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751603 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-internal-tls-certs\") pod \"897f63f9-bf8d-438b-b35a-4195963786e3\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751650 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2e416b-60db-4b09-b585-59911e192daf-operator-scripts\") pod \"6e2e416b-60db-4b09-b585-59911e192daf\" (UID: \"6e2e416b-60db-4b09-b585-59911e192daf\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751671 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-public-tls-certs\") pod \"897f63f9-bf8d-438b-b35a-4195963786e3\" (UID: \"897f63f9-bf8d-438b-b35a-4195963786e3\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751688 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-combined-ca-bundle\") pod \"c9451d43-4e5f-429b-b364-9220afe19523\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751709 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-combined-ca-bundle\") pod \"032af055-c0a6-4fda-90c8-c34ef9278613\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751728 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnsv6\" (UniqueName: \"kubernetes.io/projected/6e2e416b-60db-4b09-b585-59911e192daf-kube-api-access-wnsv6\") pod \"6e2e416b-60db-4b09-b585-59911e192daf\" (UID: \"6e2e416b-60db-4b09-b585-59911e192daf\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751749 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vm8t\" (UniqueName: \"kubernetes.io/projected/032af055-c0a6-4fda-90c8-c34ef9278613-kube-api-access-7vm8t\") pod \"032af055-c0a6-4fda-90c8-c34ef9278613\" (UID: \"032af055-c0a6-4fda-90c8-c34ef9278613\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.751803 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76057872-026a-4ecb-b628-75f7f6fd46e9-logs\") pod \"76057872-026a-4ecb-b628-75f7f6fd46e9\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.752540 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-config-data\") pod \"187070cf-06f5-4c25-851f-2e1f294c3b7c\" (UID: \"187070cf-06f5-4c25-851f-2e1f294c3b7c\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.752561 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-combined-ca-bundle\") pod \"dc0118af-131e-409a-a18b-0be03f3ce81b\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.752590 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-nova-metadata-tls-certs\") pod \"76057872-026a-4ecb-b628-75f7f6fd46e9\" (UID: \"76057872-026a-4ecb-b628-75f7f6fd46e9\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.752630 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-config-data\") pod \"c9451d43-4e5f-429b-b364-9220afe19523\" (UID: \"c9451d43-4e5f-429b-b364-9220afe19523\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.752463 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76057872-026a-4ecb-b628-75f7f6fd46e9-logs" (OuterVolumeSpecName: "logs") pod "76057872-026a-4ecb-b628-75f7f6fd46e9" (UID: "76057872-026a-4ecb-b628-75f7f6fd46e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.753229 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/897f63f9-bf8d-438b-b35a-4195963786e3-logs" (OuterVolumeSpecName: "logs") pod "897f63f9-bf8d-438b-b35a-4195963786e3" (UID: "897f63f9-bf8d-438b-b35a-4195963786e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.756866 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-config-data" (OuterVolumeSpecName: "config-data") pod "032af055-c0a6-4fda-90c8-c34ef9278613" (UID: "032af055-c0a6-4fda-90c8-c34ef9278613"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.759363 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2e416b-60db-4b09-b585-59911e192daf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e2e416b-60db-4b09-b585-59911e192daf" (UID: "6e2e416b-60db-4b09-b585-59911e192daf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.759574 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187070cf-06f5-4c25-851f-2e1f294c3b7c-kube-api-access-jjpn5" (OuterVolumeSpecName: "kube-api-access-jjpn5") pod "187070cf-06f5-4c25-851f-2e1f294c3b7c" (UID: "187070cf-06f5-4c25-851f-2e1f294c3b7c"). InnerVolumeSpecName "kube-api-access-jjpn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.761148 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/897f63f9-bf8d-438b-b35a-4195963786e3-kube-api-access-fgbg6" (OuterVolumeSpecName: "kube-api-access-fgbg6") pod "897f63f9-bf8d-438b-b35a-4195963786e3" (UID: "897f63f9-bf8d-438b-b35a-4195963786e3"). InnerVolumeSpecName "kube-api-access-fgbg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.766046 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "032af055-c0a6-4fda-90c8-c34ef9278613" (UID: "032af055-c0a6-4fda-90c8-c34ef9278613"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.774590 4756 scope.go:117] "RemoveContainer" containerID="481bb590f2466b1ec99a35fcf296aa1fc3174e0cc5b933c289f817ad96130df5"
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.782191 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76057872-026a-4ecb-b628-75f7f6fd46e9-kube-api-access-rv5ww" (OuterVolumeSpecName: "kube-api-access-rv5ww") pod "76057872-026a-4ecb-b628-75f7f6fd46e9" (UID: "76057872-026a-4ecb-b628-75f7f6fd46e9"). InnerVolumeSpecName "kube-api-access-rv5ww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.804404 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-config-data" (OuterVolumeSpecName: "config-data") pod "dc0118af-131e-409a-a18b-0be03f3ce81b" (UID: "dc0118af-131e-409a-a18b-0be03f3ce81b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.805734 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032af055-c0a6-4fda-90c8-c34ef9278613-kube-api-access-7vm8t" (OuterVolumeSpecName: "kube-api-access-7vm8t") pod "032af055-c0a6-4fda-90c8-c34ef9278613" (UID: "032af055-c0a6-4fda-90c8-c34ef9278613"). InnerVolumeSpecName "kube-api-access-7vm8t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.821911 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2e416b-60db-4b09-b585-59911e192daf-kube-api-access-wnsv6" (OuterVolumeSpecName: "kube-api-access-wnsv6") pod "6e2e416b-60db-4b09-b585-59911e192daf" (UID: "6e2e416b-60db-4b09-b585-59911e192daf"). InnerVolumeSpecName "kube-api-access-wnsv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.847871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76057872-026a-4ecb-b628-75f7f6fd46e9" (UID: "76057872-026a-4ecb-b628-75f7f6fd46e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdmzl\" (UniqueName: \"kubernetes.io/projected/dc0118af-131e-409a-a18b-0be03f3ce81b-kube-api-access-fdmzl\") pod \"dc0118af-131e-409a-a18b-0be03f3ce81b\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") "
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857897 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjpn5\" (UniqueName: \"kubernetes.io/projected/187070cf-06f5-4c25-851f-2e1f294c3b7c-kube-api-access-jjpn5\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857910 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/897f63f9-bf8d-438b-b35a-4195963786e3-logs\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857919 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857929 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857937 4756 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/032af055-c0a6-4fda-90c8-c34ef9278613-kolla-config\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857945 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2e416b-60db-4b09-b585-59911e192daf-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857954 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnsv6\" (UniqueName: \"kubernetes.io/projected/6e2e416b-60db-4b09-b585-59911e192daf-kube-api-access-wnsv6\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857962 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vm8t\" (UniqueName: \"kubernetes.io/projected/032af055-c0a6-4fda-90c8-c34ef9278613-kube-api-access-7vm8t\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857969 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76057872-026a-4ecb-b628-75f7f6fd46e9-logs\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857978 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgbg6\" (UniqueName: \"kubernetes.io/projected/897f63f9-bf8d-438b-b35a-4195963786e3-kube-api-access-fgbg6\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857986 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv5ww\" (UniqueName: \"kubernetes.io/projected/76057872-026a-4ecb-b628-75f7f6fd46e9-kube-api-access-rv5ww\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.857994 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.865409 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "032af055-c0a6-4fda-90c8-c34ef9278613" (UID: "032af055-c0a6-4fda-90c8-c34ef9278613"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.866905 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9451d43-4e5f-429b-b364-9220afe19523-kube-api-access-crs8q" (OuterVolumeSpecName: "kube-api-access-crs8q") pod "c9451d43-4e5f-429b-b364-9220afe19523" (UID: "c9451d43-4e5f-429b-b364-9220afe19523"). InnerVolumeSpecName "kube-api-access-crs8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.878634 4756 scope.go:117] "RemoveContainer" containerID="f08e0d462ffd45dfa7950e9164ec47234418bfc47d5218d2a85f0732c1d92391" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.964825 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0118af-131e-409a-a18b-0be03f3ce81b-kube-api-access-fdmzl" (OuterVolumeSpecName: "kube-api-access-fdmzl") pod "dc0118af-131e-409a-a18b-0be03f3ce81b" (UID: "dc0118af-131e-409a-a18b-0be03f3ce81b"). InnerVolumeSpecName "kube-api-access-fdmzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.969448 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdmzl\" (UniqueName: \"kubernetes.io/projected/dc0118af-131e-409a-a18b-0be03f3ce81b-kube-api-access-fdmzl\") pod \"dc0118af-131e-409a-a18b-0be03f3ce81b\" (UID: \"dc0118af-131e-409a-a18b-0be03f3ce81b\") " Dec 05 16:41:09 crc kubenswrapper[4756]: W1205 16:41:09.969735 4756 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/dc0118af-131e-409a-a18b-0be03f3ce81b/volumes/kubernetes.io~projected/kube-api-access-fdmzl Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.969852 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0118af-131e-409a-a18b-0be03f3ce81b-kube-api-access-fdmzl" (OuterVolumeSpecName: "kube-api-access-fdmzl") pod "dc0118af-131e-409a-a18b-0be03f3ce81b" (UID: "dc0118af-131e-409a-a18b-0be03f3ce81b"). InnerVolumeSpecName "kube-api-access-fdmzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.971094 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.971126 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdmzl\" (UniqueName: \"kubernetes.io/projected/dc0118af-131e-409a-a18b-0be03f3ce81b-kube-api-access-fdmzl\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.971145 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crs8q\" (UniqueName: \"kubernetes.io/projected/c9451d43-4e5f-429b-b364-9220afe19523-kube-api-access-crs8q\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:09 crc kubenswrapper[4756]: I1205 16:41:09.991066 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-config-data" (OuterVolumeSpecName: "config-data") pod "c9451d43-4e5f-429b-b364-9220afe19523" (UID: "c9451d43-4e5f-429b-b364-9220afe19523"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.006837 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54c6d9b488-h9hmj"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.016805 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "187070cf-06f5-4c25-851f-2e1f294c3b7c" (UID: "187070cf-06f5-4c25-851f-2e1f294c3b7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.030451 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-54c6d9b488-h9hmj"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.045763 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc0118af-131e-409a-a18b-0be03f3ce81b" (UID: "dc0118af-131e-409a-a18b-0be03f3ce81b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.074143 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0118af-131e-409a-a18b-0be03f3ce81b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.074217 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.074233 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.092011 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-config-data" (OuterVolumeSpecName: "config-data") pod "187070cf-06f5-4c25-851f-2e1f294c3b7c" (UID: "187070cf-06f5-4c25-851f-2e1f294c3b7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.092302 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "897f63f9-bf8d-438b-b35a-4195963786e3" (UID: "897f63f9-bf8d-438b-b35a-4195963786e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.092931 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-config-data" (OuterVolumeSpecName: "config-data") pod "897f63f9-bf8d-438b-b35a-4195963786e3" (UID: "897f63f9-bf8d-438b-b35a-4195963786e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.097528 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-config-data" (OuterVolumeSpecName: "config-data") pod "76057872-026a-4ecb-b628-75f7f6fd46e9" (UID: "76057872-026a-4ecb-b628-75f7f6fd46e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.097829 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9451d43-4e5f-429b-b364-9220afe19523" (UID: "c9451d43-4e5f-429b-b364-9220afe19523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.111199 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "76057872-026a-4ecb-b628-75f7f6fd46e9" (UID: "76057872-026a-4ecb-b628-75f7f6fd46e9"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.111866 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" path="/var/lib/kubelet/pods/13fa1e9e-f893-4641-a55f-d0c209d43917/volumes" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.118769 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "897f63f9-bf8d-438b-b35a-4195963786e3" (UID: "897f63f9-bf8d-438b-b35a-4195963786e3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.129598 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "032af055-c0a6-4fda-90c8-c34ef9278613" (UID: "032af055-c0a6-4fda-90c8-c34ef9278613"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.136935 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" path="/var/lib/kubelet/pods/2606acc0-f425-4595-8f5a-df1f009175ee/volumes" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.149147 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "897f63f9-bf8d-438b-b35a-4195963786e3" (UID: "897f63f9-bf8d-438b-b35a-4195963786e3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.156398 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" path="/var/lib/kubelet/pods/4beba48e-5ac4-4d41-864b-ecfd8b039988/volumes" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.157629 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" path="/var/lib/kubelet/pods/509573b5-33f4-432c-bb2e-1fc9186ed072/volumes" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.158960 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" path="/var/lib/kubelet/pods/881d2b4e-3f2e-4aca-a965-2508edc92a3b/volumes" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.160242 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" path="/var/lib/kubelet/pods/9df5252d-16cf-4612-910c-8f37d10b48a7/volumes" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.161046 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" path="/var/lib/kubelet/pods/f0fc7cfb-9160-471d-8aec-457769d46602/volumes" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.174066 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" path="/var/lib/kubelet/pods/f664c2b4-e517-427a-bbdc-8fba6d548a21/volumes" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.190864 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.191538 4756 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af055-c0a6-4fda-90c8-c34ef9278613-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.192597 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.200659 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.201716 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-668cbcb89b-dvw88"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.201745 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-668cbcb89b-dvw88"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.201763 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.201774 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.202266 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc 
kubenswrapper[4756]: I1205 16:41:10.202296 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/897f63f9-bf8d-438b-b35a-4195963786e3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.202309 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9451d43-4e5f-429b-b364-9220afe19523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.202322 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187070cf-06f5-4c25-851f-2e1f294c3b7c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.202335 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/76057872-026a-4ecb-b628-75f7f6fd46e9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.203444 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="1f517228-5007-4e3c-a41b-1e4b636446c0" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.196:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.227079 4756 scope.go:117] "RemoveContainer" containerID="04daa8a7964206ec50253f25a3104d1d3037a3b4b32afa5015cf06cab5b9fd7d" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.250929 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.258725 4756 util.go:48] "No ready sandbox for pod can be found. 
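The readiness probe failure a few entries above is an HTTPS GET against https://10.217.0.196:6080/vnc_lite.html that timed out before response headers arrived; the wording in the probeResult output comes from Go's net/http client timeout. A self-contained sketch that reproduces the same error text against an unresponsive endpoint (assumptions: the 1-second timeout, which matches the probe spec default, and the skipped certificate verification, which matches how kubelet HTTPS probes behave):

    package main

    import (
        "crypto/tls"
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{
            // Probe timeoutSeconds defaults to 1 in Kubernetes; model that here.
            Timeout: 1 * time.Second,
            Transport: &http.Transport{
                // HTTPS probes do not verify the serving certificate.
                TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
            },
        }
        // The URL from the logged probe; unreachable from outside the cluster.
        resp, err := client.Get("https://10.217.0.196:6080/vnc_lite.html")
        if err != nil {
            // On a timeout, the error text ends with
            // "(Client.Timeout exceeded while awaiting headers)".
            fmt.Println("probe failed:", err)
            return
        }
        defer resp.Body.Close()
        fmt.Println("probe succeeded:", resp.Status)
    }

Run against a listener that accepts connections but never writes headers, err.Error() carries the same "(Client.Timeout exceeded while awaiting headers)" suffix logged by prober.go above.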
Need to start a new one" pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.270186 4756 scope.go:117] "RemoveContainer" containerID="9dcca87cd81ebc30ed3acfa39592365fba84c1006feaa6a068c8c8e226c1806a" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.305899 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.305979 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/331486fe-c403-4516-9568-862200ec0672-erlang-cookie-secret\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306061 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-plugins-conf\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306101 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-plugins\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306128 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts\") pod \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\" (UID: \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306152 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn42p\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-kube-api-access-pn42p\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306243 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/331486fe-c403-4516-9568-862200ec0672-pod-info\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306270 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-server-conf\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306338 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-erlang-cookie\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306379 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306409 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8865\" (UniqueName: \"kubernetes.io/projected/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-kube-api-access-l8865\") pod \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\" (UID: \"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306453 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-tls\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306476 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-confd\") pod \"331486fe-c403-4516-9568-862200ec0672\" (UID: \"331486fe-c403-4516-9568-862200ec0672\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306783 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.306940 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.307042 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.307568 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.308193 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "735bacc2-aeb8-4f2f-8f1a-bf3d092fd163" (UID: "735bacc2-aeb8-4f2f-8f1a-bf3d092fd163"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.311771 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331486fe-c403-4516-9568-862200ec0672-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.311803 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-kube-api-access-pn42p" (OuterVolumeSpecName: "kube-api-access-pn42p") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "kube-api-access-pn42p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.314238 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.317193 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-kube-api-access-l8865" (OuterVolumeSpecName: "kube-api-access-l8865") pod "735bacc2-aeb8-4f2f-8f1a-bf3d092fd163" (UID: "735bacc2-aeb8-4f2f-8f1a-bf3d092fd163"). InnerVolumeSpecName "kube-api-access-l8865". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.317451 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.328264 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data" (OuterVolumeSpecName: "config-data") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.334486 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/331486fe-c403-4516-9568-862200ec0672-pod-info" (OuterVolumeSpecName: "pod-info") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.340124 4756 scope.go:117] "RemoveContainer" containerID="a005af597ef26ea0b48f5743eb681caeba6662ee90634b36c5befe70f580e2a0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.357436 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-server-conf" (OuterVolumeSpecName: "server-conf") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408592 4756 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/331486fe-c403-4516-9568-862200ec0672-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408654 4756 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408667 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/331486fe-c403-4516-9568-862200ec0672-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408732 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408744 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8865\" (UniqueName: \"kubernetes.io/projected/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-kube-api-access-l8865\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408753 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408761 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408773 4756 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/331486fe-c403-4516-9568-862200ec0672-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408781 4756 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/331486fe-c403-4516-9568-862200ec0672-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408874 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.408890 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn42p\" (UniqueName: 
\"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-kube-api-access-pn42p\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.410138 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.410222 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts podName:3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e nodeName:}" failed. No retries permitted until 2025-12-05 16:41:14.410201722 +0000 UTC m=+1534.648875234 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts") pod "placement5c6b-account-delete-nhd8d" (UID: "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e") : configmap "openstack-scripts" not found Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.411567 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.411608 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts podName:9639f2b9-2b61-4ab9-ab94-6c54d373662f nodeName:}" failed. No retries permitted until 2025-12-05 16:41:14.41159866 +0000 UTC m=+1534.650272172 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts") pod "barbicanc3ab-account-delete-ndr77" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f") : configmap "openstack-scripts" not found Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.411646 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.411699 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts podName:37e233df-e9ab-45c9-a335-f1182a3b8c79 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:14.411690733 +0000 UTC m=+1534.650364245 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts") pod "neutronbc5a-account-delete-wtsjj" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79") : configmap "openstack-scripts" not found Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.427354 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.451565 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "331486fe-c403-4516-9568-862200ec0672" (UID: "331486fe-c403-4516-9568-862200ec0672"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.469497 4756 generic.go:334] "Generic (PLEG): container finished" podID="55609ecb-b28c-488e-b80f-9add6d256016" containerID="158cecb488ef422eac05d2c4948de5a6dfc1b4ec80df672c0fc11d44da5db969" exitCode=0 Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.469547 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55609ecb-b28c-488e-b80f-9add6d256016","Type":"ContainerDied","Data":"158cecb488ef422eac05d2c4948de5a6dfc1b4ec80df672c0fc11d44da5db969"} Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.469656 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.477091 4756 generic.go:334] "Generic (PLEG): container finished" podID="11d0ad48-c987-4415-8536-b150536534a5" containerID="b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff" exitCode=0 Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.477162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11d0ad48-c987-4415-8536-b150536534a5","Type":"ContainerDied","Data":"b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff"} Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.477194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11d0ad48-c987-4415-8536-b150536534a5","Type":"ContainerDied","Data":"90e66243c1e0c6167189a5d79298fcd076235bdecc2da77cd2ead4fbdbb9fcf7"} Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.477261 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.477713 4756 scope.go:117] "RemoveContainer" containerID="a99c90cd22f1473b85f31a90f8f7912ef7f574ee4089fbd8fd4b2995f484d61c" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.485585 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance1a2d-account-delete-vgqdz" event={"ID":"735bacc2-aeb8-4f2f-8f1a-bf3d092fd163","Type":"ContainerDied","Data":"fd8fbeb5aa4e68791452eb3cdf0f6975a07c7438ce715874e542353447b5aba3"} Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.485653 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd8fbeb5aa4e68791452eb3cdf0f6975a07c7438ce715874e542353447b5aba3" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.485789 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance1a2d-account-delete-vgqdz" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.499989 4756 generic.go:334] "Generic (PLEG): container finished" podID="331486fe-c403-4516-9568-862200ec0672" containerID="fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15" exitCode=0 Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.500092 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"331486fe-c403-4516-9568-862200ec0672","Type":"ContainerDied","Data":"fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15"} Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.500149 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"331486fe-c403-4516-9568-862200ec0672","Type":"ContainerDied","Data":"183eaa345ab9e899042dbd94af160bfe145d6f2ea24a9c9fc7ac9d7c89938422"} Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.500248 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.507277 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.507805 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.508141 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.508527 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.508790 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509054 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509448 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11d0ad48-c987-4415-8536-b150536534a5-erlang-cookie-secret\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509494 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder2230-account-delete-jfjlh" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509494 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-tls\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509842 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-server-conf\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509869 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-erlang-cookie\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509916 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-confd\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509973 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.509992 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-plugins-conf\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.510044 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.510068 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-plugins\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.510102 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11d0ad48-c987-4415-8536-b150536534a5-pod-info\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: \"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.510124 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2trz\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-kube-api-access-m2trz\") pod \"11d0ad48-c987-4415-8536-b150536534a5\" (UID: 
\"11d0ad48-c987-4415-8536-b150536534a5\") " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.510540 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.510653 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/331486fe-c403-4516-9568-862200ec0672-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.510998 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.513784 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-kube-api-access-m2trz" (OuterVolumeSpecName: "kube-api-access-m2trz") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "kube-api-access-m2trz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.515643 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.516100 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.517806 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.519326 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.519810 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/11d0ad48-c987-4415-8536-b150536534a5-pod-info" (OuterVolumeSpecName: "pod-info") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.524288 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d0ad48-c987-4415-8536-b150536534a5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.531783 4756 scope.go:117] "RemoveContainer" containerID="b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.565066 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-server-conf" (OuterVolumeSpecName: "server-conf") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.571708 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data" (OuterVolumeSpecName: "config-data") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.576550 4756 scope.go:117] "RemoveContainer" containerID="2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.577283 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.603465 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611470 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611504 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611517 4756 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11d0ad48-c987-4415-8536-b150536534a5-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611529 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2trz\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-kube-api-access-m2trz\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611541 4756 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11d0ad48-c987-4415-8536-b150536534a5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611553 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611564 4756 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611576 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611588 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.611599 4756 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11d0ad48-c987-4415-8536-b150536534a5-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.621007 4756 scope.go:117] "RemoveContainer" containerID="b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff" Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.621481 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff\": container with ID starting with b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff not found: ID does not exist" containerID="b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.621517 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff"} err="failed to get container status \"b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff\": rpc error: code = NotFound desc = could not find container \"b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff\": container with ID starting with b92e191defe003995a105571535e742081d3f399571f5918d9704a31621eb3ff not found: ID does not exist" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.621539 4756 scope.go:117] "RemoveContainer" containerID="2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b" Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.621814 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b\": container with ID starting with 2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b not found: ID does not exist" containerID="2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.621829 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b"} err="failed to get container status \"2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b\": rpc error: code = NotFound desc = could not find container \"2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b\": container with ID starting with 2ddfe79aa135b617b0c2f40a253431db40792217d9ff92e87cd83d2c7077ee6b not found: ID does not exist" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.621840 4756 scope.go:117] "RemoveContainer" containerID="fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.623719 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.638832 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "11d0ad48-c987-4415-8536-b150536534a5" (UID: "11d0ad48-c987-4415-8536-b150536534a5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.652708 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.658807 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.664161 4756 scope.go:117] "RemoveContainer" containerID="dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.664342 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.668212 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.670549 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.678455 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.687691 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.690445 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.697548 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.702981 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.705656 4756 scope.go:117] "RemoveContainer" containerID="fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15" Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.706078 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15\": container with ID starting with fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15 not found: ID does not exist" containerID="fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.706111 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15"} err="failed to get container status \"fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15\": rpc error: code = NotFound desc = could not find container \"fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15\": container with ID starting with fa659750fdca4260aabdf7cc3b64385ac78f7103ed2236c3231de5b75e383a15 not found: ID does not exist" Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.706136 4756 scope.go:117] "RemoveContainer" containerID="dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc" Dec 05 16:41:10 crc kubenswrapper[4756]: E1205 16:41:10.706329 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc\": container with ID 
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.706350 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc"} err="failed to get container status \"dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc\": rpc error: code = NotFound desc = could not find container \"dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc\": container with ID starting with dbfb13eaaa696e55e78ab35fd710ba80f5f15ace22edb546c16e9257aac5efcc not found: ID does not exist"
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.710472 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.720034 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.722178 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11d0ad48-c987-4415-8536-b150536534a5-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.722213 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.781556 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ts8rq"]
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.848355 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ts8rq"]
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.869199 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.895472 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1a2d-account-create-update-5jzww"]
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.914787 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance1a2d-account-delete-vgqdz"]
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.919782 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1a2d-account-create-update-5jzww"]
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.952208 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance1a2d-account-delete-vgqdz"]
Dec 05 16:41:10 crc kubenswrapper[4756]: I1205 16:41:10.992721 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.002957 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.032883 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n58v\" (UniqueName: \"kubernetes.io/projected/55609ecb-b28c-488e-b80f-9add6d256016-kube-api-access-7n58v\") pod \"55609ecb-b28c-488e-b80f-9add6d256016\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") "
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.032916 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-combined-ca-bundle\") pod \"55609ecb-b28c-488e-b80f-9add6d256016\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") "
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.032958 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-kolla-config\") pod \"55609ecb-b28c-488e-b80f-9add6d256016\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") "
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.033032 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-operator-scripts\") pod \"55609ecb-b28c-488e-b80f-9add6d256016\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") "
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.033145 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-config-data-default\") pod \"55609ecb-b28c-488e-b80f-9add6d256016\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") "
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.033169 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-galera-tls-certs\") pod \"55609ecb-b28c-488e-b80f-9add6d256016\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") "
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.033214 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"55609ecb-b28c-488e-b80f-9add6d256016\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") "
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.033234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55609ecb-b28c-488e-b80f-9add6d256016-config-data-generated\") pod \"55609ecb-b28c-488e-b80f-9add6d256016\" (UID: \"55609ecb-b28c-488e-b80f-9add6d256016\") "
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.034273 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55609ecb-b28c-488e-b80f-9add6d256016-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "55609ecb-b28c-488e-b80f-9add6d256016" (UID: "55609ecb-b28c-488e-b80f-9add6d256016"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.035189 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55609ecb-b28c-488e-b80f-9add6d256016" (UID: "55609ecb-b28c-488e-b80f-9add6d256016"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.036521 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "55609ecb-b28c-488e-b80f-9add6d256016" (UID: "55609ecb-b28c-488e-b80f-9add6d256016"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.037484 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "55609ecb-b28c-488e-b80f-9add6d256016" (UID: "55609ecb-b28c-488e-b80f-9add6d256016"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.041938 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55609ecb-b28c-488e-b80f-9add6d256016-kube-api-access-7n58v" (OuterVolumeSpecName: "kube-api-access-7n58v") pod "55609ecb-b28c-488e-b80f-9add6d256016" (UID: "55609ecb-b28c-488e-b80f-9add6d256016"). InnerVolumeSpecName "kube-api-access-7n58v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.045309 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "55609ecb-b28c-488e-b80f-9add6d256016" (UID: "55609ecb-b28c-488e-b80f-9add6d256016"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.050560 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-csgm4"]
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.067038 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-csgm4"]
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.069476 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55609ecb-b28c-488e-b80f-9add6d256016" (UID: "55609ecb-b28c-488e-b80f-9add6d256016"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.076522 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder2230-account-delete-jfjlh"]
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.092558 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2230-account-create-update-lj8h4"]
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.102955 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder2230-account-delete-jfjlh"]
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.110525 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2230-account-create-update-lj8h4"]
Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.113137 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "55609ecb-b28c-488e-b80f-9add6d256016" (UID: "55609ecb-b28c-488e-b80f-9add6d256016"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.138089 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n58v\" (UniqueName: \"kubernetes.io/projected/55609ecb-b28c-488e-b80f-9add6d256016-kube-api-access-7n58v\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.138120 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.138133 4756 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.138146 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.138157 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55609ecb-b28c-488e-b80f-9add6d256016-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.138169 4756 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55609ecb-b28c-488e-b80f-9add6d256016-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.138180 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55609ecb-b28c-488e-b80f-9add6d256016-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.138203 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.156639 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.239272 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.305933 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-468zf"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.309340 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.314173 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-468zf"] Dec 05 16:41:11 crc kubenswrapper[4756]: E1205 16:41:11.317988 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444 is running failed: container process not found" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:11 crc kubenswrapper[4756]: E1205 16:41:11.318500 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444 is running failed: container process not found" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:11 crc kubenswrapper[4756]: E1205 16:41:11.319030 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444 is running failed: container process not found" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 05 16:41:11 crc kubenswrapper[4756]: E1205 16:41:11.319067 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="ovn-northd" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.332028 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanc3ab-account-delete-ndr77"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.332346 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbicanc3ab-account-delete-ndr77" podUID="9639f2b9-2b61-4ab9-ab94-6c54d373662f" containerName="mariadb-account-delete" containerID="cri-o://d01e9052eadb9b19c59bb451fdb0fec97a060cfb91308a4e5a61cd10b153475f" gracePeriod=30 Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.341443 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c3ab-account-create-update-s6vq9"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.360373 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c3ab-account-create-update-s6vq9"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.406351 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wj2j9"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.414393 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wj2j9"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.442286 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-credential-keys\") pod 
\"87da04a1-7ee1-47e4-9bc7-1808ea506457\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.442339 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-config-data\") pod \"87da04a1-7ee1-47e4-9bc7-1808ea506457\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.442386 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-scripts\") pod \"87da04a1-7ee1-47e4-9bc7-1808ea506457\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.442459 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vct9x\" (UniqueName: \"kubernetes.io/projected/87da04a1-7ee1-47e4-9bc7-1808ea506457-kube-api-access-vct9x\") pod \"87da04a1-7ee1-47e4-9bc7-1808ea506457\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.442498 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-combined-ca-bundle\") pod \"87da04a1-7ee1-47e4-9bc7-1808ea506457\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.442553 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-public-tls-certs\") pod \"87da04a1-7ee1-47e4-9bc7-1808ea506457\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.442665 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-internal-tls-certs\") pod \"87da04a1-7ee1-47e4-9bc7-1808ea506457\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.442729 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-fernet-keys\") pod \"87da04a1-7ee1-47e4-9bc7-1808ea506457\" (UID: \"87da04a1-7ee1-47e4-9bc7-1808ea506457\") " Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.454696 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-scripts" (OuterVolumeSpecName: "scripts") pod "87da04a1-7ee1-47e4-9bc7-1808ea506457" (UID: "87da04a1-7ee1-47e4-9bc7-1808ea506457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.456557 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87da04a1-7ee1-47e4-9bc7-1808ea506457-kube-api-access-vct9x" (OuterVolumeSpecName: "kube-api-access-vct9x") pod "87da04a1-7ee1-47e4-9bc7-1808ea506457" (UID: "87da04a1-7ee1-47e4-9bc7-1808ea506457"). InnerVolumeSpecName "kube-api-access-vct9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.461681 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "87da04a1-7ee1-47e4-9bc7-1808ea506457" (UID: "87da04a1-7ee1-47e4-9bc7-1808ea506457"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.465639 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "87da04a1-7ee1-47e4-9bc7-1808ea506457" (UID: "87da04a1-7ee1-47e4-9bc7-1808ea506457"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.482475 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bc5a-account-create-update-rvl5h"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.504584 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-config-data" (OuterVolumeSpecName: "config-data") pod "87da04a1-7ee1-47e4-9bc7-1808ea506457" (UID: "87da04a1-7ee1-47e4-9bc7-1808ea506457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.519686 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bc5a-account-create-update-rvl5h"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.529620 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87da04a1-7ee1-47e4-9bc7-1808ea506457" (UID: "87da04a1-7ee1-47e4-9bc7-1808ea506457"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.535514 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "87da04a1-7ee1-47e4-9bc7-1808ea506457" (UID: "87da04a1-7ee1-47e4-9bc7-1808ea506457"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.546518 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.546555 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vct9x\" (UniqueName: \"kubernetes.io/projected/87da04a1-7ee1-47e4-9bc7-1808ea506457-kube-api-access-vct9x\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.546572 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.546583 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.546596 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.546608 4756 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.546619 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.546711 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "87da04a1-7ee1-47e4-9bc7-1808ea506457" (UID: "87da04a1-7ee1-47e4-9bc7-1808ea506457"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.548901 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronbc5a-account-delete-wtsjj"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.549116 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutronbc5a-account-delete-wtsjj" podUID="37e233df-e9ab-45c9-a335-f1182a3b8c79" containerName="mariadb-account-delete" containerID="cri-o://693ab2a387b6a0c295bf6ab3d9d8b6730de073c73bc06fccbb3c0097dd28ae4a" gracePeriod=30 Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.578994 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55609ecb-b28c-488e-b80f-9add6d256016","Type":"ContainerDied","Data":"d8643f6472346503691bfeafbc54939d454b7c083a9706fcbc56a3fb5bc072a1"} Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.579045 4756 scope.go:117] "RemoveContainer" containerID="158cecb488ef422eac05d2c4948de5a6dfc1b4ec80df672c0fc11d44da5db969" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.579164 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.613049 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683/ovn-northd/0.log" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.613122 4756 generic.go:334] "Generic (PLEG): container finished" podID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" exitCode=139 Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.613236 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683","Type":"ContainerDied","Data":"1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444"} Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.648447 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87da04a1-7ee1-47e4-9bc7-1808ea506457-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.677615 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.683651 4756 generic.go:334] "Generic (PLEG): container finished" podID="87da04a1-7ee1-47e4-9bc7-1808ea506457" containerID="84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed" exitCode=0 Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.683698 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-857c4c465c-j99gg" event={"ID":"87da04a1-7ee1-47e4-9bc7-1808ea506457","Type":"ContainerDied","Data":"84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed"} Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.683726 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.683745 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-857c4c465c-j99gg" event={"ID":"87da04a1-7ee1-47e4-9bc7-1808ea506457","Type":"ContainerDied","Data":"8c4bc2a41badc74bb7d0d18e5fdbc0ea44112e88cc1b30de0ce6e857c2a33ef3"} Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.683813 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-857c4c465c-j99gg" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.715619 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-857c4c465c-j99gg"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.725153 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-857c4c465c-j99gg"] Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.798590 4756 scope.go:117] "RemoveContainer" containerID="25e41d82e6c78a57098d4c59694df774a54b05fa51cbfe74ad482fe43a5c7e54" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.832525 4756 scope.go:117] "RemoveContainer" containerID="84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.895229 4756 scope.go:117] "RemoveContainer" containerID="84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed" Dec 05 16:41:11 crc kubenswrapper[4756]: E1205 16:41:11.895607 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed\": container with ID starting with 84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed not found: ID does not exist" containerID="84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed" Dec 05 16:41:11 crc kubenswrapper[4756]: I1205 16:41:11.895632 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed"} err="failed to get container status \"84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed\": rpc error: code = NotFound desc = could not find container \"84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed\": container with ID starting with 84853475a4ebc4610f6d0ed52745339c050a856a8d40b0bcfb30e50f936692ed not found: ID does not exist" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.061737 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032af055-c0a6-4fda-90c8-c34ef9278613" path="/var/lib/kubelet/pods/032af055-c0a6-4fda-90c8-c34ef9278613/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.063096 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d0ad48-c987-4415-8536-b150536534a5" path="/var/lib/kubelet/pods/11d0ad48-c987-4415-8536-b150536534a5/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.063823 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187070cf-06f5-4c25-851f-2e1f294c3b7c" path="/var/lib/kubelet/pods/187070cf-06f5-4c25-851f-2e1f294c3b7c/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.065222 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331486fe-c403-4516-9568-862200ec0672" path="/var/lib/kubelet/pods/331486fe-c403-4516-9568-862200ec0672/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.065989 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fe1100-e673-4067-84f1-30bbfc2e4a0c" path="/var/lib/kubelet/pods/36fe1100-e673-4067-84f1-30bbfc2e4a0c/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.066672 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a485c78-6a69-4e0d-a3ab-0048b19ed955" path="/var/lib/kubelet/pods/3a485c78-6a69-4e0d-a3ab-0048b19ed955/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.072134 4756 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb981f4-baed-45b9-856b-cda56ded1142" path="/var/lib/kubelet/pods/3eb981f4-baed-45b9-856b-cda56ded1142/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.073012 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4319acb7-6d0a-4954-9d76-e698dde5536b" path="/var/lib/kubelet/pods/4319acb7-6d0a-4954-9d76-e698dde5536b/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.073792 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55609ecb-b28c-488e-b80f-9add6d256016" path="/var/lib/kubelet/pods/55609ecb-b28c-488e-b80f-9add6d256016/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.075112 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="669dd164-f2eb-4cc7-bc41-c12a82bb74b7" path="/var/lib/kubelet/pods/669dd164-f2eb-4cc7-bc41-c12a82bb74b7/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.075775 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d72ac3-7f79-4644-9195-ebc6ed861edd" path="/var/lib/kubelet/pods/68d72ac3-7f79-4644-9195-ebc6ed861edd/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.076392 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2e416b-60db-4b09-b585-59911e192daf" path="/var/lib/kubelet/pods/6e2e416b-60db-4b09-b585-59911e192daf/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.077877 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735bacc2-aeb8-4f2f-8f1a-bf3d092fd163" path="/var/lib/kubelet/pods/735bacc2-aeb8-4f2f-8f1a-bf3d092fd163/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.078565 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" path="/var/lib/kubelet/pods/76057872-026a-4ecb-b628-75f7f6fd46e9/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.079215 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797faab6-a6b3-48ae-90ca-09d251b2975f" path="/var/lib/kubelet/pods/797faab6-a6b3-48ae-90ca-09d251b2975f/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.080468 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7af33e-bc54-4dde-9b49-ea742c331563" path="/var/lib/kubelet/pods/7e7af33e-bc54-4dde-9b49-ea742c331563/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.081396 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d62427-bd91-4330-be20-152bc3508447" path="/var/lib/kubelet/pods/84d62427-bd91-4330-be20-152bc3508447/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.082236 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87da04a1-7ee1-47e4-9bc7-1808ea506457" path="/var/lib/kubelet/pods/87da04a1-7ee1-47e4-9bc7-1808ea506457/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.083575 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" path="/var/lib/kubelet/pods/897f63f9-bf8d-438b-b35a-4195963786e3/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.084296 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9451d43-4e5f-429b-b364-9220afe19523" path="/var/lib/kubelet/pods/c9451d43-4e5f-429b-b364-9220afe19523/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.085041 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="dc0118af-131e-409a-a18b-0be03f3ce81b" path="/var/lib/kubelet/pods/dc0118af-131e-409a-a18b-0be03f3ce81b/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.086289 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13540dc-b5e4-4449-9abb-160846406121" path="/var/lib/kubelet/pods/f13540dc-b5e4-4449-9abb-160846406121/volumes" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.166458 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683/ovn-northd/0.log" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.166530 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.258501 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-combined-ca-bundle\") pod \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.258565 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-scripts\") pod \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.258616 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-northd-tls-certs\") pod \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.258643 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-metrics-certs-tls-certs\") pod \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.258691 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-rundir\") pod \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.258724 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfcfm\" (UniqueName: \"kubernetes.io/projected/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-kube-api-access-gfcfm\") pod \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.258816 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-config\") pod \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\" (UID: \"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683\") " Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.259409 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-scripts" (OuterVolumeSpecName: "scripts") pod "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" (UID: 
"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.259665 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" (UID: "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.259727 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-config" (OuterVolumeSpecName: "config") pod "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" (UID: "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.280801 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-kube-api-access-gfcfm" (OuterVolumeSpecName: "kube-api-access-gfcfm") pod "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" (UID: "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683"). InnerVolumeSpecName "kube-api-access-gfcfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.281845 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" (UID: "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.327309 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" (UID: "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.330328 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" (UID: "b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.360753 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.360793 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.360802 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.360814 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.360825 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.360838 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfcfm\" (UniqueName: \"kubernetes.io/projected/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-kube-api-access-gfcfm\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.360847 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.699911 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683/ovn-northd/0.log" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.699999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683","Type":"ContainerDied","Data":"6347d4e49846db798b8a977c98253fae53ae88bcb853fe4e28c7333747302452"} Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.700039 4756 scope.go:117] "RemoveContainer" containerID="3d84600ac906281768c60200071669b5bea31ab8b67b8f4f35c9c9885c5ed8d8" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.700193 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.734943 4756 scope.go:117] "RemoveContainer" containerID="1ad5c1e2511e818eb90b9c9bbfc0f48afbd46e3e9b27a610240331e0b4b66444" Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.738136 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 16:41:12 crc kubenswrapper[4756]: I1205 16:41:12.750234 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 16:41:13 crc kubenswrapper[4756]: I1205 16:41:13.041705 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.162:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 16:41:13 crc kubenswrapper[4756]: E1205 16:41:13.247668 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:13 crc kubenswrapper[4756]: E1205 16:41:13.248061 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:13 crc kubenswrapper[4756]: E1205 16:41:13.248280 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:13 crc kubenswrapper[4756]: E1205 16:41:13.248306 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" Dec 05 16:41:13 crc kubenswrapper[4756]: E1205 16:41:13.249201 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:13 crc kubenswrapper[4756]: E1205 16:41:13.251324 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:13 crc kubenswrapper[4756]: E1205 16:41:13.252627 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:13 crc kubenswrapper[4756]: E1205 16:41:13.252656 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" Dec 05 16:41:14 crc kubenswrapper[4756]: I1205 16:41:14.058533 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" path="/var/lib/kubelet/pods/b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683/volumes" Dec 05 16:41:14 crc kubenswrapper[4756]: I1205 16:41:14.362934 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": dial tcp 10.217.0.203:8775: i/o timeout" Dec 05 16:41:14 crc kubenswrapper[4756]: I1205 16:41:14.363366 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 16:41:14 crc kubenswrapper[4756]: E1205 16:41:14.500634 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:14 crc kubenswrapper[4756]: E1205 16:41:14.500718 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts podName:37e233df-e9ab-45c9-a335-f1182a3b8c79 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:22.500698689 +0000 UTC m=+1542.739372201 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts") pod "neutronbc5a-account-delete-wtsjj" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79") : configmap "openstack-scripts" not found Dec 05 16:41:14 crc kubenswrapper[4756]: E1205 16:41:14.500640 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:14 crc kubenswrapper[4756]: E1205 16:41:14.501109 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts podName:3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e nodeName:}" failed. No retries permitted until 2025-12-05 16:41:22.501083339 +0000 UTC m=+1542.739756871 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts") pod "placement5c6b-account-delete-nhd8d" (UID: "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e") : configmap "openstack-scripts" not found Dec 05 16:41:14 crc kubenswrapper[4756]: E1205 16:41:14.500638 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:14 crc kubenswrapper[4756]: E1205 16:41:14.501301 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts podName:9639f2b9-2b61-4ab9-ab94-6c54d373662f nodeName:}" failed. No retries permitted until 2025-12-05 16:41:22.501289005 +0000 UTC m=+1542.739962557 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts") pod "barbicanc3ab-account-delete-ndr77" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f") : configmap "openstack-scripts" not found Dec 05 16:41:14 crc kubenswrapper[4756]: E1205 16:41:14.648837 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3cf4c6f_f34b_47ea_86f9_88666a59d44a.slice/crio-conmon-bfcc2c269977aa6668ab1f24c749b53e9efa8a90bc0debb824bbc6f137672462.scope\": RecentStats: unable to find data in memory cache]" Dec 05 16:41:14 crc kubenswrapper[4756]: I1205 16:41:14.737681 4756 generic.go:334] "Generic (PLEG): container finished" podID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerID="bfcc2c269977aa6668ab1f24c749b53e9efa8a90bc0debb824bbc6f137672462" exitCode=0 Dec 05 16:41:14 crc kubenswrapper[4756]: I1205 16:41:14.737787 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerDied","Data":"bfcc2c269977aa6668ab1f24c749b53e9efa8a90bc0debb824bbc6f137672462"} Dec 05 16:41:14 crc kubenswrapper[4756]: I1205 16:41:14.836957 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="331486fe-c403-4516-9568-862200ec0672" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: i/o timeout" Dec 05 16:41:14 crc kubenswrapper[4756]: I1205 16:41:14.956715 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.108572 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-ceilometer-tls-certs\") pod \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.108636 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-run-httpd\") pod \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.108665 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-config-data\") pod \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.108696 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjbdg\" (UniqueName: \"kubernetes.io/projected/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-kube-api-access-fjbdg\") pod \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.108770 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-log-httpd\") pod \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.108796 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-combined-ca-bundle\") pod \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.108844 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-sg-core-conf-yaml\") pod \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.108882 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-scripts\") pod \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\" (UID: \"a3cf4c6f-f34b-47ea-86f9-88666a59d44a\") " Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.109077 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a3cf4c6f-f34b-47ea-86f9-88666a59d44a" (UID: "a3cf4c6f-f34b-47ea-86f9-88666a59d44a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.109317 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.109344 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a3cf4c6f-f34b-47ea-86f9-88666a59d44a" (UID: "a3cf4c6f-f34b-47ea-86f9-88666a59d44a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.116617 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-kube-api-access-fjbdg" (OuterVolumeSpecName: "kube-api-access-fjbdg") pod "a3cf4c6f-f34b-47ea-86f9-88666a59d44a" (UID: "a3cf4c6f-f34b-47ea-86f9-88666a59d44a"). InnerVolumeSpecName "kube-api-access-fjbdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.119660 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-scripts" (OuterVolumeSpecName: "scripts") pod "a3cf4c6f-f34b-47ea-86f9-88666a59d44a" (UID: "a3cf4c6f-f34b-47ea-86f9-88666a59d44a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.137684 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a3cf4c6f-f34b-47ea-86f9-88666a59d44a" (UID: "a3cf4c6f-f34b-47ea-86f9-88666a59d44a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.155803 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a3cf4c6f-f34b-47ea-86f9-88666a59d44a" (UID: "a3cf4c6f-f34b-47ea-86f9-88666a59d44a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.181131 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3cf4c6f-f34b-47ea-86f9-88666a59d44a" (UID: "a3cf4c6f-f34b-47ea-86f9-88666a59d44a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.206269 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-config-data" (OuterVolumeSpecName: "config-data") pod "a3cf4c6f-f34b-47ea-86f9-88666a59d44a" (UID: "a3cf4c6f-f34b-47ea-86f9-88666a59d44a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.211214 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.211249 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.211262 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjbdg\" (UniqueName: \"kubernetes.io/projected/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-kube-api-access-fjbdg\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.211274 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.211284 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.211296 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.211306 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3cf4c6f-f34b-47ea-86f9-88666a59d44a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.748704 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3cf4c6f-f34b-47ea-86f9-88666a59d44a","Type":"ContainerDied","Data":"319f4b60b71d3161e7bb47af4784820828daf92e7af680ffa21449636e7ee3b8"} Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.749037 4756 scope.go:117] "RemoveContainer" containerID="6299d63206c3fbc25b59d632cb9c2f9bf041eee3b3e097b905b6d1db62b66c82" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.749148 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.781694 4756 scope.go:117] "RemoveContainer" containerID="e74f52a59e270b006a1fc5714de7ee321f0ef68eebd6a6ed247cd71177dbdaa8" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.801325 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.809869 4756 scope.go:117] "RemoveContainer" containerID="bfcc2c269977aa6668ab1f24c749b53e9efa8a90bc0debb824bbc6f137672462" Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.815309 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 16:41:15 crc kubenswrapper[4756]: I1205 16:41:15.832771 4756 scope.go:117] "RemoveContainer" containerID="e3d6463e66e6a8de5357ab6e221c04b2cb882550348104db03b422d53b2565e5" Dec 05 16:41:16 crc kubenswrapper[4756]: I1205 16:41:16.059313 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" path="/var/lib/kubelet/pods/a3cf4c6f-f34b-47ea-86f9-88666a59d44a/volumes" Dec 05 16:41:18 crc kubenswrapper[4756]: E1205 16:41:18.248621 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:18 crc kubenswrapper[4756]: E1205 16:41:18.249691 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:18 crc kubenswrapper[4756]: E1205 16:41:18.249844 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:18 crc kubenswrapper[4756]: E1205 16:41:18.250082 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:18 crc kubenswrapper[4756]: E1205 16:41:18.250117 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" Dec 05 16:41:18 crc kubenswrapper[4756]: E1205 16:41:18.251236 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:18 crc kubenswrapper[4756]: E1205 16:41:18.253112 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:18 crc kubenswrapper[4756]: E1205 16:41:18.253152 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" Dec 05 16:41:21 crc kubenswrapper[4756]: I1205 16:41:21.809043 4756 generic.go:334] "Generic (PLEG): container finished" podID="926e1895-6638-4861-be46-f06e94e4b4ae" containerID="f64d7df36b795e3694f688c94ab0720253fe8c0c08bc8d7f803bb4019f348b81" exitCode=0 Dec 05 16:41:21 crc kubenswrapper[4756]: I1205 16:41:21.809087 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b545bc8c5-xcv4s" event={"ID":"926e1895-6638-4861-be46-f06e94e4b4ae","Type":"ContainerDied","Data":"f64d7df36b795e3694f688c94ab0720253fe8c0c08bc8d7f803bb4019f348b81"} Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.011937 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.123407 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-ovndb-tls-certs\") pod \"926e1895-6638-4861-be46-f06e94e4b4ae\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.123578 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-internal-tls-certs\") pod \"926e1895-6638-4861-be46-f06e94e4b4ae\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.123613 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-public-tls-certs\") pod \"926e1895-6638-4861-be46-f06e94e4b4ae\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.123641 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-httpd-config\") pod \"926e1895-6638-4861-be46-f06e94e4b4ae\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.123672 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-config\") pod \"926e1895-6638-4861-be46-f06e94e4b4ae\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " Dec 05 
16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.123701 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-combined-ca-bundle\") pod \"926e1895-6638-4861-be46-f06e94e4b4ae\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.123721 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfgg9\" (UniqueName: \"kubernetes.io/projected/926e1895-6638-4861-be46-f06e94e4b4ae-kube-api-access-bfgg9\") pod \"926e1895-6638-4861-be46-f06e94e4b4ae\" (UID: \"926e1895-6638-4861-be46-f06e94e4b4ae\") " Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.136581 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/926e1895-6638-4861-be46-f06e94e4b4ae-kube-api-access-bfgg9" (OuterVolumeSpecName: "kube-api-access-bfgg9") pod "926e1895-6638-4861-be46-f06e94e4b4ae" (UID: "926e1895-6638-4861-be46-f06e94e4b4ae"). InnerVolumeSpecName "kube-api-access-bfgg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.136670 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "926e1895-6638-4861-be46-f06e94e4b4ae" (UID: "926e1895-6638-4861-be46-f06e94e4b4ae"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.165064 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "926e1895-6638-4861-be46-f06e94e4b4ae" (UID: "926e1895-6638-4861-be46-f06e94e4b4ae"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.172707 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "926e1895-6638-4861-be46-f06e94e4b4ae" (UID: "926e1895-6638-4861-be46-f06e94e4b4ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.184062 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "926e1895-6638-4861-be46-f06e94e4b4ae" (UID: "926e1895-6638-4861-be46-f06e94e4b4ae"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.188668 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-config" (OuterVolumeSpecName: "config") pod "926e1895-6638-4861-be46-f06e94e4b4ae" (UID: "926e1895-6638-4861-be46-f06e94e4b4ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.201229 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "926e1895-6638-4861-be46-f06e94e4b4ae" (UID: "926e1895-6638-4861-be46-f06e94e4b4ae"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.225900 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.226080 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.226185 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.226254 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-config\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.226358 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.226470 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfgg9\" (UniqueName: \"kubernetes.io/projected/926e1895-6638-4861-be46-f06e94e4b4ae-kube-api-access-bfgg9\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.226559 4756 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/926e1895-6638-4861-be46-f06e94e4b4ae-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:22 crc kubenswrapper[4756]: E1205 16:41:22.530323 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:22 crc kubenswrapper[4756]: E1205 16:41:22.530390 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts podName:9639f2b9-2b61-4ab9-ab94-6c54d373662f nodeName:}" failed. No retries permitted until 2025-12-05 16:41:38.530372915 +0000 UTC m=+1558.769046427 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts") pod "barbicanc3ab-account-delete-ndr77" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f") : configmap "openstack-scripts" not found Dec 05 16:41:22 crc kubenswrapper[4756]: E1205 16:41:22.530730 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:22 crc kubenswrapper[4756]: E1205 16:41:22.530756 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts podName:3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e nodeName:}" failed. No retries permitted until 2025-12-05 16:41:38.530747435 +0000 UTC m=+1558.769420947 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts") pod "placement5c6b-account-delete-nhd8d" (UID: "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e") : configmap "openstack-scripts" not found Dec 05 16:41:22 crc kubenswrapper[4756]: E1205 16:41:22.530780 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:22 crc kubenswrapper[4756]: E1205 16:41:22.530795 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts podName:37e233df-e9ab-45c9-a335-f1182a3b8c79 nodeName:}" failed. No retries permitted until 2025-12-05 16:41:38.530789896 +0000 UTC m=+1558.769463408 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts") pod "neutronbc5a-account-delete-wtsjj" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79") : configmap "openstack-scripts" not found Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.829465 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b545bc8c5-xcv4s" event={"ID":"926e1895-6638-4861-be46-f06e94e4b4ae","Type":"ContainerDied","Data":"841e19590514a1169a3fd1e5b25f0fa69ff80f78a27e877a3961e22fc23c67a1"} Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.829517 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b545bc8c5-xcv4s" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.829534 4756 scope.go:117] "RemoveContainer" containerID="c7af4d58a9c44e723d1f4b250fb4689fe9457696570e5e229b0f57ec7ad64a29" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.857563 4756 scope.go:117] "RemoveContainer" containerID="f64d7df36b795e3694f688c94ab0720253fe8c0c08bc8d7f803bb4019f348b81" Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.874236 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b545bc8c5-xcv4s"] Dec 05 16:41:22 crc kubenswrapper[4756]: I1205 16:41:22.890315 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b545bc8c5-xcv4s"] Dec 05 16:41:23 crc kubenswrapper[4756]: E1205 16:41:23.247384 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:23 crc kubenswrapper[4756]: E1205 16:41:23.247847 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:23 crc kubenswrapper[4756]: E1205 16:41:23.248315 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:23 crc kubenswrapper[4756]: E1205 16:41:23.248356 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" Dec 05 16:41:23 crc kubenswrapper[4756]: E1205 16:41:23.249153 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:23 crc kubenswrapper[4756]: E1205 16:41:23.250802 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:23 crc kubenswrapper[4756]: E1205 16:41:23.252494 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:23 crc kubenswrapper[4756]: E1205 16:41:23.252539 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" Dec 05 16:41:24 crc kubenswrapper[4756]: I1205 16:41:24.057771 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" path="/var/lib/kubelet/pods/926e1895-6638-4861-be46-f06e94e4b4ae/volumes" Dec 05 16:41:28 crc kubenswrapper[4756]: E1205 16:41:28.247971 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:28 crc kubenswrapper[4756]: E1205 16:41:28.249035 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:28 crc kubenswrapper[4756]: E1205 16:41:28.249340 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:28 crc kubenswrapper[4756]: E1205 16:41:28.249808 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:28 crc kubenswrapper[4756]: E1205 16:41:28.249886 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" Dec 05 16:41:28 crc kubenswrapper[4756]: E1205 16:41:28.251866 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:28 crc kubenswrapper[4756]: E1205 16:41:28.253282 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:28 crc kubenswrapper[4756]: E1205 16:41:28.253334 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.883660 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dlxfd"] Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885476 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885520 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885545 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" containerName="barbican-worker-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885555 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" containerName="barbican-worker-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885567 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" containerName="neutron-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885578 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" containerName="neutron-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885593 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="ceilometer-central-agent" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885599 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="ceilometer-central-agent" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885611 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" containerName="mysql-bootstrap" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885618 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" containerName="mysql-bootstrap" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885629 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="ovsdbserver-nb" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885641 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="ovsdbserver-nb" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885668 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="ovsdbserver-sb" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885674 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="ovsdbserver-sb" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885685 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55609ecb-b28c-488e-b80f-9add6d256016" containerName="mysql-bootstrap" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885691 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="55609ecb-b28c-488e-b80f-9add6d256016" containerName="mysql-bootstrap" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885705 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885711 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885723 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9642035-7395-4ead-9da4-4d01f497a80b" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885729 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9642035-7395-4ead-9da4-4d01f497a80b" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885739 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerName="glance-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885746 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerName="glance-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885757 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885762 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885773 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d0ad48-c987-4415-8536-b150536534a5" containerName="rabbitmq" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885779 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d0ad48-c987-4415-8536-b150536534a5" containerName="rabbitmq" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885792 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="sg-core" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885798 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="sg-core" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885807 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885813 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885821 4756 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" containerName="galera" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885828 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" containerName="galera" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885835 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerName="barbican-keystone-listener-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885840 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerName="barbican-keystone-listener-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885847 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="ceilometer-notification-agent" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885853 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="ceilometer-notification-agent" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885861 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885867 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885876 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2e416b-60db-4b09-b585-59911e192daf" containerName="mariadb-account-delete" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885882 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2e416b-60db-4b09-b585-59911e192daf" containerName="mariadb-account-delete" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885898 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="ovn-northd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885910 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="ovn-northd" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885921 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331486fe-c403-4516-9568-862200ec0672" containerName="rabbitmq" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885930 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="331486fe-c403-4516-9568-862200ec0672" containerName="rabbitmq" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885941 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735bacc2-aeb8-4f2f-8f1a-bf3d092fd163" containerName="mariadb-account-delete" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885947 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="735bacc2-aeb8-4f2f-8f1a-bf3d092fd163" containerName="mariadb-account-delete" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885959 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d0ad48-c987-4415-8536-b150536534a5" containerName="setup-container" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885966 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d0ad48-c987-4415-8536-b150536534a5" containerName="setup-container" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 
16:41:32.885978 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerName="barbican-keystone-listener" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885985 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerName="barbican-keystone-listener" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.885992 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187070cf-06f5-4c25-851f-2e1f294c3b7c" containerName="nova-cell0-conductor-conductor" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.885998 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="187070cf-06f5-4c25-851f-2e1f294c3b7c" containerName="nova-cell0-conductor-conductor" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886008 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886015 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886024 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" containerName="barbican-worker" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886029 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" containerName="barbican-worker" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886039 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerName="cinder-scheduler" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886044 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerName="cinder-scheduler" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886056 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331486fe-c403-4516-9568-862200ec0672" containerName="setup-container" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886061 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="331486fe-c403-4516-9568-862200ec0672" containerName="setup-container" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886072 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerName="glance-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886077 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerName="glance-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886085 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="proxy-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886091 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="proxy-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886100 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886106 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886114 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c609265d-528c-4c9c-bd1c-b735775d9acc" containerName="dnsmasq-dns" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886120 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c609265d-528c-4c9c-bd1c-b735775d9acc" containerName="dnsmasq-dns" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886131 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0118af-131e-409a-a18b-0be03f3ce81b" containerName="nova-cell1-conductor-conductor" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886137 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0118af-131e-409a-a18b-0be03f3ce81b" containerName="nova-cell1-conductor-conductor" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886149 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886154 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-api" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886164 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d62427-bd91-4330-be20-152bc3508447" containerName="placement-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886170 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d62427-bd91-4330-be20-152bc3508447" containerName="placement-api" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886179 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797faab6-a6b3-48ae-90ca-09d251b2975f" containerName="kube-state-metrics" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886186 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="797faab6-a6b3-48ae-90ca-09d251b2975f" containerName="kube-state-metrics" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886197 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerName="probe" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886203 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerName="probe" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886212 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886218 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886225 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c609265d-528c-4c9c-bd1c-b735775d9acc" containerName="init" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886231 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c609265d-528c-4c9c-bd1c-b735775d9acc" containerName="init" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886237 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87da04a1-7ee1-47e4-9bc7-1808ea506457" containerName="keystone-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886243 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="87da04a1-7ee1-47e4-9bc7-1808ea506457" containerName="keystone-api" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886254 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886260 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886267 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerName="glance-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886273 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerName="glance-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886284 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d62427-bd91-4330-be20-152bc3508447" containerName="placement-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886290 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d62427-bd91-4330-be20-152bc3508447" containerName="placement-log" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886297 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-server" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886303 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-server" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886310 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032af055-c0a6-4fda-90c8-c34ef9278613" containerName="memcached" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886316 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="032af055-c0a6-4fda-90c8-c34ef9278613" containerName="memcached" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886324 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359e87f3-8fb4-4476-a538-13c948b65885" containerName="ovn-controller" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886330 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="359e87f3-8fb4-4476-a538-13c948b65885" containerName="ovn-controller" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886343 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-metadata" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886348 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-metadata" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886360 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerName="glance-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886365 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerName="glance-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886376 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9451d43-4e5f-429b-b364-9220afe19523" containerName="nova-scheduler-scheduler" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886382 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c9451d43-4e5f-429b-b364-9220afe19523" containerName="nova-scheduler-scheduler" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886390 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f517228-5007-4e3c-a41b-1e4b636446c0" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886396 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f517228-5007-4e3c-a41b-1e4b636446c0" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886404 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55609ecb-b28c-488e-b80f-9add6d256016" containerName="galera" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886432 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="55609ecb-b28c-488e-b80f-9add6d256016" containerName="galera" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886438 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" containerName="neutron-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886444 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" containerName="neutron-api" Dec 05 16:41:32 crc kubenswrapper[4756]: E1205 16:41:32.886453 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886459 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886720 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerName="glance-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886729 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="881d2b4e-3f2e-4aca-a965-2508edc92a3b" containerName="galera" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886736 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886746 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerName="barbican-keystone-listener-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886752 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886759 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886769 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="331486fe-c403-4516-9568-862200ec0672" containerName="rabbitmq" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886779 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2606acc0-f425-4595-8f5a-df1f009175ee" containerName="barbican-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886789 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="ovsdbserver-nb" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886796 4756 
memory_manager.go:354] "RemoveStaleState removing state" podUID="797faab6-a6b3-48ae-90ca-09d251b2975f" containerName="kube-state-metrics" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886808 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="ceilometer-notification-agent" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886817 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2e416b-60db-4b09-b585-59911e192daf" containerName="mariadb-account-delete" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886825 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="ovn-northd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886834 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="359e87f3-8fb4-4476-a538-13c948b65885" containerName="ovn-controller" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886843 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886848 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="87da04a1-7ee1-47e4-9bc7-1808ea506457" containerName="keystone-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886859 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9451d43-4e5f-429b-b364-9220afe19523" containerName="nova-scheduler-scheduler" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886868 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerName="glance-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886875 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="76057872-026a-4ecb-b628-75f7f6fd46e9" containerName="nova-metadata-metadata" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886887 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0118af-131e-409a-a18b-0be03f3ce81b" containerName="nova-cell1-conductor-conductor" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886900 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886911 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c609265d-528c-4c9c-bd1c-b735775d9acc" containerName="dnsmasq-dns" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886917 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" containerName="neutron-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886925 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="ovsdbserver-sb" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886933 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b8eaf1-d7f3-4b6a-97a5-86a6c53ff683" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886940 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="897f63f9-bf8d-438b-b35a-4195963786e3" containerName="nova-api-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886949 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b9642035-7395-4ead-9da4-4d01f497a80b" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886957 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" containerName="barbican-worker-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886963 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="032af055-c0a6-4fda-90c8-c34ef9278613" containerName="memcached" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886970 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fc7cfb-9160-471d-8aec-457769d46602" containerName="barbican-worker" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886979 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="55609ecb-b28c-488e-b80f-9add6d256016" containerName="galera" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886988 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d62427-bd91-4330-be20-152bc3508447" containerName="placement-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.886996 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="735bacc2-aeb8-4f2f-8f1a-bf3d092fd163" containerName="mariadb-account-delete" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887005 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fa1e9e-f893-4641-a55f-d0c209d43917" containerName="barbican-keystone-listener" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887013 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="541f7de0-3773-4e28-9881-b552367f1c89" containerName="proxy-server" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887023 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="187070cf-06f5-4c25-851f-2e1f294c3b7c" containerName="nova-cell0-conductor-conductor" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887032 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerName="probe" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887041 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d62427-bd91-4330-be20-152bc3508447" containerName="placement-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887050 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="926e1895-6638-4861-be46-f06e94e4b4ae" containerName="neutron-api" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887056 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d0ad48-c987-4415-8536-b150536534a5" containerName="rabbitmq" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887065 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4beba48e-5ac4-4d41-864b-ecfd8b039988" containerName="glance-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887073 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df5252d-16cf-4612-910c-8f37d10b48a7" containerName="cinder-api-log" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887080 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f664c2b4-e517-427a-bbdc-8fba6d548a21" containerName="glance-httpd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887096 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="proxy-httpd" Dec 05 16:41:32 crc 
kubenswrapper[4756]: I1205 16:41:32.887107 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="ceilometer-central-agent" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887118 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cf4c6f-f34b-47ea-86f9-88666a59d44a" containerName="sg-core" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887127 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f72429c-8e1e-4c2d-8417-c499c22013a9" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887137 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="509573b5-33f4-432c-bb2e-1fc9186ed072" containerName="cinder-scheduler" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887147 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f517228-5007-4e3c-a41b-1e4b636446c0" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.887155 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="930a573c-21eb-46e3-827b-2ff167eecd4a" containerName="openstack-network-exporter" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.888856 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.902541 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlxfd"] Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.992770 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-utilities\") pod \"redhat-marketplace-dlxfd\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.992839 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-catalog-content\") pod \"redhat-marketplace-dlxfd\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:32 crc kubenswrapper[4756]: I1205 16:41:32.992878 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7lcm\" (UniqueName: \"kubernetes.io/projected/deb9672a-8f10-4836-9ddf-cb4f47674257-kube-api-access-p7lcm\") pod \"redhat-marketplace-dlxfd\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.094567 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-utilities\") pod \"redhat-marketplace-dlxfd\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.094619 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-catalog-content\") pod \"redhat-marketplace-dlxfd\" (UID: 
\"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.094642 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7lcm\" (UniqueName: \"kubernetes.io/projected/deb9672a-8f10-4836-9ddf-cb4f47674257-kube-api-access-p7lcm\") pod \"redhat-marketplace-dlxfd\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.095532 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-utilities\") pod \"redhat-marketplace-dlxfd\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.095851 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-catalog-content\") pod \"redhat-marketplace-dlxfd\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.116324 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7lcm\" (UniqueName: \"kubernetes.io/projected/deb9672a-8f10-4836-9ddf-cb4f47674257-kube-api-access-p7lcm\") pod \"redhat-marketplace-dlxfd\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") " pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.288352 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:33 crc kubenswrapper[4756]: E1205 16:41:33.289944 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3 is running failed: container process not found" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:33 crc kubenswrapper[4756]: E1205 16:41:33.290109 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:33 crc kubenswrapper[4756]: E1205 16:41:33.290510 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3 is running failed: container process not found" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:33 crc kubenswrapper[4756]: E1205 16:41:33.290809 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:33 crc kubenswrapper[4756]: E1205 16:41:33.291470 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3 is running failed: container process not found" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 05 16:41:33 crc kubenswrapper[4756]: E1205 16:41:33.291507 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 05 16:41:33 crc kubenswrapper[4756]: E1205 16:41:33.291556 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" Dec 05 16:41:33 crc kubenswrapper[4756]: E1205 16:41:33.291600 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if 
PID of 664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-kdtg5" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.789208 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.843352 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlxfd"] Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.909060 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.909493 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") pod \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.909523 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-cache\") pod \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.909557 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-225xh\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-kube-api-access-225xh\") pod \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.909632 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-lock\") pod \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\" (UID: \"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40\") " Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.910498 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-lock" (OuterVolumeSpecName: "lock") pod "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.910631 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-cache" (OuterVolumeSpecName: "cache") pod "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.916926 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-kube-api-access-225xh" (OuterVolumeSpecName: "kube-api-access-225xh") pod "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40"). InnerVolumeSpecName "kube-api-access-225xh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.919458 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.919503 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" (UID: "1da6dea0-ecf9-4bec-9c85-2fddcc0fae40"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.921882 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlxfd" event={"ID":"deb9672a-8f10-4836-9ddf-cb4f47674257","Type":"ContainerStarted","Data":"862e0280c297dfe8a98622808975300a193d7ab1ea449eba14adaed6bc8f988d"} Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.931295 4756 generic.go:334] "Generic (PLEG): container finished" podID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerID="247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4" exitCode=137 Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.931374 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4"} Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.931410 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1da6dea0-ecf9-4bec-9c85-2fddcc0fae40","Type":"ContainerDied","Data":"b185d3cc3e40a67cd7e7744ce5c71c3f1698e265dfcf8d008203d209df117427"} Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.931451 4756 scope.go:117] "RemoveContainer" containerID="247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.931569 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.935651 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kdtg5_40ff87cb-4d82-49d4-a4d9-55e104c9becb/ovs-vswitchd/0.log" Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.939459 4756 generic.go:334] "Generic (PLEG): container finished" podID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" exitCode=137 Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.939510 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kdtg5" event={"ID":"40ff87cb-4d82-49d4-a4d9-55e104c9becb","Type":"ContainerDied","Data":"15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3"} Dec 05 16:41:33 crc kubenswrapper[4756]: I1205 16:41:33.978391 4756 scope.go:117] "RemoveContainer" containerID="9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.001118 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.003849 4756 scope.go:117] "RemoveContainer" containerID="9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.007047 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.011166 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.011192 4756 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.011202 4756 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-cache\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.011211 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-225xh\" (UniqueName: \"kubernetes.io/projected/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-kube-api-access-225xh\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.011221 4756 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40-lock\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.023340 4756 scope.go:117] "RemoveContainer" containerID="5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.027115 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.047658 4756 scope.go:117] "RemoveContainer" containerID="4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.059409 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" path="/var/lib/kubelet/pods/1da6dea0-ecf9-4bec-9c85-2fddcc0fae40/volumes" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.112595 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.140115 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kdtg5_40ff87cb-4d82-49d4-a4d9-55e104c9becb/ovs-vswitchd/0.log" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.140948 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.145112 4756 scope.go:117] "RemoveContainer" containerID="05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.168491 4756 scope.go:117] "RemoveContainer" containerID="821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.196014 4756 scope.go:117] "RemoveContainer" containerID="1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.216278 4756 scope.go:117] "RemoveContainer" containerID="e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.235533 4756 scope.go:117] "RemoveContainer" containerID="76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.251639 4756 scope.go:117] "RemoveContainer" containerID="ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.271690 4756 scope.go:117] "RemoveContainer" containerID="0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.286672 4756 scope.go:117] "RemoveContainer" containerID="fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.304558 4756 scope.go:117] "RemoveContainer" containerID="58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.315940 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-lib\") pod \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316109 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxptp\" (UniqueName: \"kubernetes.io/projected/40ff87cb-4d82-49d4-a4d9-55e104c9becb-kube-api-access-kxptp\") pod \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316274 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-log\") pod \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316306 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40ff87cb-4d82-49d4-a4d9-55e104c9becb-scripts\") pod \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316444 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-run\") pod \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316542 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-etc-ovs\") pod \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\" (UID: \"40ff87cb-4d82-49d4-a4d9-55e104c9becb\") " Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316543 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-lib" (OuterVolumeSpecName: "var-lib") pod "40ff87cb-4d82-49d4-a4d9-55e104c9becb" (UID: "40ff87cb-4d82-49d4-a4d9-55e104c9becb"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316567 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-log" (OuterVolumeSpecName: "var-log") pod "40ff87cb-4d82-49d4-a4d9-55e104c9becb" (UID: "40ff87cb-4d82-49d4-a4d9-55e104c9becb"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316811 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-run" (OuterVolumeSpecName: "var-run") pod "40ff87cb-4d82-49d4-a4d9-55e104c9becb" (UID: "40ff87cb-4d82-49d4-a4d9-55e104c9becb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.316908 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "40ff87cb-4d82-49d4-a4d9-55e104c9becb" (UID: "40ff87cb-4d82-49d4-a4d9-55e104c9becb"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.317349 4756 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.317366 4756 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.317375 4756 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-lib\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.317384 4756 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/40ff87cb-4d82-49d4-a4d9-55e104c9becb-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.318056 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ff87cb-4d82-49d4-a4d9-55e104c9becb-scripts" (OuterVolumeSpecName: "scripts") pod "40ff87cb-4d82-49d4-a4d9-55e104c9becb" (UID: "40ff87cb-4d82-49d4-a4d9-55e104c9becb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.320434 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ff87cb-4d82-49d4-a4d9-55e104c9becb-kube-api-access-kxptp" (OuterVolumeSpecName: "kube-api-access-kxptp") pod "40ff87cb-4d82-49d4-a4d9-55e104c9becb" (UID: "40ff87cb-4d82-49d4-a4d9-55e104c9becb"). InnerVolumeSpecName "kube-api-access-kxptp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.323610 4756 scope.go:117] "RemoveContainer" containerID="dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.343152 4756 scope.go:117] "RemoveContainer" containerID="247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.343649 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4\": container with ID starting with 247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4 not found: ID does not exist" containerID="247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.343687 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4"} err="failed to get container status \"247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4\": rpc error: code = NotFound desc = could not find container \"247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4\": container with ID starting with 247cd9f8e619b5b90e0efa25570e05c275258c09a3806318d1576d31ee9c54d4 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.343722 4756 scope.go:117] "RemoveContainer" containerID="9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.344107 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7\": container with ID starting with 9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7 not found: ID does not exist" containerID="9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.344136 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7"} err="failed to get container status \"9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7\": rpc error: code = NotFound desc = could not find container \"9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7\": container with ID starting with 9e3e1415751beadec3c5e98d32474f2d2dec1760de9605948fb52aab896fc9c7 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.344171 4756 scope.go:117] "RemoveContainer" containerID="9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.344558 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded\": container with ID starting with 9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded not found: ID does not exist" containerID="9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.344582 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded"} err="failed to get container status \"9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded\": rpc error: code = NotFound desc = could not find container \"9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded\": container with ID starting with 9c291384610a06cc375042a98a47615fdbff66e21123c781d2769ab2bcadeded not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.344600 4756 scope.go:117] "RemoveContainer" containerID="5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.344952 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45\": container with ID starting with 5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45 not found: ID does not exist" containerID="5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.344975 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45"} err="failed to get container status \"5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45\": rpc error: code = NotFound desc = could not find container \"5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45\": container with ID starting with 5e1f40cea05fa071537e16acd6b09db6a6409dc74b0cd1fa563ac99186148a45 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.344993 4756 scope.go:117] "RemoveContainer" containerID="4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.345220 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345\": container with ID starting with 4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345 not found: ID does not exist" containerID="4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.345244 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345"} err="failed to get container status \"4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345\": rpc error: code = NotFound desc = could not find container \"4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345\": container with ID starting with 4389071ecd156da0b5f4845fe6c02738ea9fcc211d54b131fe326e1966f73345 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.345265 4756 scope.go:117] "RemoveContainer" containerID="05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.345520 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1\": container with ID starting with 05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1 not found: ID does not exist" 
containerID="05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.345542 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1"} err="failed to get container status \"05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1\": rpc error: code = NotFound desc = could not find container \"05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1\": container with ID starting with 05437fa0b343d36ef2ae01e088f3fdcbef68a2d04ddfa1eaa69a14db0d05f9b1 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.345589 4756 scope.go:117] "RemoveContainer" containerID="821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.345826 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102\": container with ID starting with 821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102 not found: ID does not exist" containerID="821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.345853 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102"} err="failed to get container status \"821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102\": rpc error: code = NotFound desc = could not find container \"821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102\": container with ID starting with 821acf37d22b8f71cb55bb2d57dcd64430e023776cecd598ec2c206cea1a4102 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.345872 4756 scope.go:117] "RemoveContainer" containerID="1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.346201 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1\": container with ID starting with 1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1 not found: ID does not exist" containerID="1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.346228 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1"} err="failed to get container status \"1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1\": rpc error: code = NotFound desc = could not find container \"1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1\": container with ID starting with 1e0ca2f0b49755dfd0c9132cf3368e74ff1b4a7228ed3e94f5207e344afe1db1 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.346247 4756 scope.go:117] "RemoveContainer" containerID="e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.346603 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a\": container with ID starting with e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a not found: ID does not exist" containerID="e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.346635 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a"} err="failed to get container status \"e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a\": rpc error: code = NotFound desc = could not find container \"e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a\": container with ID starting with e20786692279aaf2d8f121cc96987f9f799c1af4d0ee5262b2c6e9ee5557839a not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.346655 4756 scope.go:117] "RemoveContainer" containerID="76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.346976 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46\": container with ID starting with 76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46 not found: ID does not exist" containerID="76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.347002 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46"} err="failed to get container status \"76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46\": rpc error: code = NotFound desc = could not find container \"76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46\": container with ID starting with 76099a82df8143f2cc4ed5aad230dfa5b3acbe45156918636453426258978e46 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.347022 4756 scope.go:117] "RemoveContainer" containerID="ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.347294 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2\": container with ID starting with ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2 not found: ID does not exist" containerID="ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.347322 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2"} err="failed to get container status \"ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2\": rpc error: code = NotFound desc = could not find container \"ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2\": container with ID starting with ebf30d748f8665f39bb4c4c38feb695f0528281f593a1cb8bba7b8c8fd0a8fc2 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.347339 4756 scope.go:117] "RemoveContainer" containerID="0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5" Dec 05 16:41:34 crc 
kubenswrapper[4756]: E1205 16:41:34.347660 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5\": container with ID starting with 0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5 not found: ID does not exist" containerID="0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.347686 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5"} err="failed to get container status \"0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5\": rpc error: code = NotFound desc = could not find container \"0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5\": container with ID starting with 0c6cfb9d5dd454d68dc8f1516ae3fbaa64a717012996cf3d859df476dcfab0a5 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.347705 4756 scope.go:117] "RemoveContainer" containerID="fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.347992 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75\": container with ID starting with fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75 not found: ID does not exist" containerID="fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.348084 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75"} err="failed to get container status \"fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75\": rpc error: code = NotFound desc = could not find container \"fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75\": container with ID starting with fb543cb304c9d5844c9b5a67d5c174d88a21239b421f7c141d51de2980c31e75 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.348165 4756 scope.go:117] "RemoveContainer" containerID="58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.348781 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f\": container with ID starting with 58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f not found: ID does not exist" containerID="58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.348904 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f"} err="failed to get container status \"58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f\": rpc error: code = NotFound desc = could not find container \"58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f\": container with ID starting with 58129759a9bc200a7d116b2fc2c28784959095ead6831251c418d834f5b0a55f not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: 
I1205 16:41:34.348969 4756 scope.go:117] "RemoveContainer" containerID="dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4" Dec 05 16:41:34 crc kubenswrapper[4756]: E1205 16:41:34.349444 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4\": container with ID starting with dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4 not found: ID does not exist" containerID="dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.349530 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4"} err="failed to get container status \"dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4\": rpc error: code = NotFound desc = could not find container \"dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4\": container with ID starting with dda1d79f2b59ff6c5c2d0266e0acd62ebf58cab920b8afccadd520b9892680b4 not found: ID does not exist" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.419041 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxptp\" (UniqueName: \"kubernetes.io/projected/40ff87cb-4d82-49d4-a4d9-55e104c9becb-kube-api-access-kxptp\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.419363 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40ff87cb-4d82-49d4-a4d9-55e104c9becb-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.947856 4756 generic.go:334] "Generic (PLEG): container finished" podID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerID="3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08" exitCode=0 Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.947942 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlxfd" event={"ID":"deb9672a-8f10-4836-9ddf-cb4f47674257","Type":"ContainerDied","Data":"3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08"} Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.951214 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kdtg5_40ff87cb-4d82-49d4-a4d9-55e104c9becb/ovs-vswitchd/0.log" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.951902 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kdtg5" event={"ID":"40ff87cb-4d82-49d4-a4d9-55e104c9becb","Type":"ContainerDied","Data":"31a9a3d50797eb9461b26695c87bceb47e0963457bdf176c58f4841b6b4d2a33"} Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.951947 4756 scope.go:117] "RemoveContainer" containerID="15ed1a30eec906bcf7cd0aae8c31d3f0fcdcf61ff466d03836be17d99a7a65f3" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.952060 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-kdtg5" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.982469 4756 scope.go:117] "RemoveContainer" containerID="664beb5f8614ed6f7120f47018836c5faaa8047ac0e68aad8fe76a28dd609e9b" Dec 05 16:41:34 crc kubenswrapper[4756]: I1205 16:41:34.996594 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-kdtg5"] Dec 05 16:41:35 crc kubenswrapper[4756]: I1205 16:41:35.006169 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-kdtg5"] Dec 05 16:41:35 crc kubenswrapper[4756]: I1205 16:41:35.019681 4756 scope.go:117] "RemoveContainer" containerID="906970e8d880890b549aab00084b13dfefdd6a84f0b2cc8df5e80914b5a2230a" Dec 05 16:41:35 crc kubenswrapper[4756]: E1205 16:41:35.089186 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40ff87cb_4d82_49d4_a4d9_55e104c9becb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40ff87cb_4d82_49d4_a4d9_55e104c9becb.slice/crio-31a9a3d50797eb9461b26695c87bceb47e0963457bdf176c58f4841b6b4d2a33\": RecentStats: unable to find data in memory cache]" Dec 05 16:41:35 crc kubenswrapper[4756]: I1205 16:41:35.966338 4756 generic.go:334] "Generic (PLEG): container finished" podID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerID="1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344" exitCode=0 Dec 05 16:41:35 crc kubenswrapper[4756]: I1205 16:41:35.966461 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlxfd" event={"ID":"deb9672a-8f10-4836-9ddf-cb4f47674257","Type":"ContainerDied","Data":"1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344"} Dec 05 16:41:36 crc kubenswrapper[4756]: I1205 16:41:36.058854 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" path="/var/lib/kubelet/pods/40ff87cb-4d82-49d4-a4d9-55e104c9becb/volumes" Dec 05 16:41:36 crc kubenswrapper[4756]: I1205 16:41:36.980949 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlxfd" event={"ID":"deb9672a-8f10-4836-9ddf-cb4f47674257","Type":"ContainerStarted","Data":"0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789"} Dec 05 16:41:37 crc kubenswrapper[4756]: I1205 16:41:37.010478 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dlxfd" podStartSLOduration=3.607634825 podStartE2EDuration="5.010456935s" podCreationTimestamp="2025-12-05 16:41:32 +0000 UTC" firstStartedPulling="2025-12-05 16:41:34.950442745 +0000 UTC m=+1555.189116257" lastFinishedPulling="2025-12-05 16:41:36.353264855 +0000 UTC m=+1556.591938367" observedRunningTime="2025-12-05 16:41:37.004831362 +0000 UTC m=+1557.243504884" watchObservedRunningTime="2025-12-05 16:41:37.010456935 +0000 UTC m=+1557.249130467" Dec 05 16:41:37 crc kubenswrapper[4756]: I1205 16:41:37.999568 4756 generic.go:334] "Generic (PLEG): container finished" podID="3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e" containerID="7d4e4c4996c6095e55a9528af5c2a29da4aefbcb8aa7c785ffbdee54592a58ca" exitCode=137 Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:37.999689 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5c6b-account-delete-nhd8d" 
event={"ID":"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e","Type":"ContainerDied","Data":"7d4e4c4996c6095e55a9528af5c2a29da4aefbcb8aa7c785ffbdee54592a58ca"} Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.002975 4756 generic.go:334] "Generic (PLEG): container finished" podID="cc4b1e45-87c0-471f-beb9-70bce6933d47" containerID="5ce2c044a1e7f24fb0d011f65c18d5af43c0921735e6ef864f501fb53bd366b8" exitCode=137 Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.003058 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0bc19-account-delete-xlr4r" event={"ID":"cc4b1e45-87c0-471f-beb9-70bce6933d47","Type":"ContainerDied","Data":"5ce2c044a1e7f24fb0d011f65c18d5af43c0921735e6ef864f501fb53bd366b8"} Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.315955 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.394460 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.488921 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v59d\" (UniqueName: \"kubernetes.io/projected/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-kube-api-access-6v59d\") pod \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\" (UID: \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\") " Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.489195 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts\") pod \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\" (UID: \"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e\") " Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.490041 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e" (UID: "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.495803 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-kube-api-access-6v59d" (OuterVolumeSpecName: "kube-api-access-6v59d") pod "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e" (UID: "3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e"). InnerVolumeSpecName "kube-api-access-6v59d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.594277 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmbq6\" (UniqueName: \"kubernetes.io/projected/2586474d-e368-428a-a86b-1ec95b8006a7-kube-api-access-dmbq6\") pod \"2586474d-e368-428a-a86b-1ec95b8006a7\" (UID: \"2586474d-e368-428a-a86b-1ec95b8006a7\") " Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.594450 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2586474d-e368-428a-a86b-1ec95b8006a7-operator-scripts\") pod \"2586474d-e368-428a-a86b-1ec95b8006a7\" (UID: \"2586474d-e368-428a-a86b-1ec95b8006a7\") " Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.594822 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v59d\" (UniqueName: \"kubernetes.io/projected/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-kube-api-access-6v59d\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.594850 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:38 crc kubenswrapper[4756]: E1205 16:41:38.594917 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:38 crc kubenswrapper[4756]: E1205 16:41:38.594971 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts podName:9639f2b9-2b61-4ab9-ab94-6c54d373662f nodeName:}" failed. No retries permitted until 2025-12-05 16:42:10.59495384 +0000 UTC m=+1590.833627352 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts") pod "barbicanc3ab-account-delete-ndr77" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f") : configmap "openstack-scripts" not found Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.595337 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2586474d-e368-428a-a86b-1ec95b8006a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2586474d-e368-428a-a86b-1ec95b8006a7" (UID: "2586474d-e368-428a-a86b-1ec95b8006a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:38 crc kubenswrapper[4756]: E1205 16:41:38.595409 4756 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 05 16:41:38 crc kubenswrapper[4756]: E1205 16:41:38.595462 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts podName:37e233df-e9ab-45c9-a335-f1182a3b8c79 nodeName:}" failed. No retries permitted until 2025-12-05 16:42:10.595452984 +0000 UTC m=+1590.834126496 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts") pod "neutronbc5a-account-delete-wtsjj" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79") : configmap "openstack-scripts" not found
Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.597949 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2586474d-e368-428a-a86b-1ec95b8006a7-kube-api-access-dmbq6" (OuterVolumeSpecName: "kube-api-access-dmbq6") pod "2586474d-e368-428a-a86b-1ec95b8006a7" (UID: "2586474d-e368-428a-a86b-1ec95b8006a7"). InnerVolumeSpecName "kube-api-access-dmbq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.653635 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0bc19-account-delete-xlr4r"
Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.696408 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kflx7\" (UniqueName: \"kubernetes.io/projected/cc4b1e45-87c0-471f-beb9-70bce6933d47-kube-api-access-kflx7\") pod \"cc4b1e45-87c0-471f-beb9-70bce6933d47\" (UID: \"cc4b1e45-87c0-471f-beb9-70bce6933d47\") "
Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.696551 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc4b1e45-87c0-471f-beb9-70bce6933d47-operator-scripts\") pod \"cc4b1e45-87c0-471f-beb9-70bce6933d47\" (UID: \"cc4b1e45-87c0-471f-beb9-70bce6933d47\") "
Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.696800 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2586474d-e368-428a-a86b-1ec95b8006a7-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.696822 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmbq6\" (UniqueName: \"kubernetes.io/projected/2586474d-e368-428a-a86b-1ec95b8006a7-kube-api-access-dmbq6\") on node \"crc\" DevicePath \"\""
Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.696997 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc4b1e45-87c0-471f-beb9-70bce6933d47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc4b1e45-87c0-471f-beb9-70bce6933d47" (UID: "cc4b1e45-87c0-471f-beb9-70bce6933d47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.700154 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4b1e45-87c0-471f-beb9-70bce6933d47-kube-api-access-kflx7" (OuterVolumeSpecName: "kube-api-access-kflx7") pod "cc4b1e45-87c0-471f-beb9-70bce6933d47" (UID: "cc4b1e45-87c0-471f-beb9-70bce6933d47"). InnerVolumeSpecName "kube-api-access-kflx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
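
The "No retries permitted until ... (durationBeforeRetry 32s)" entries above come from per-operation exponential backoff: every consecutive MountVolume.SetUp failure for the same volume roughly doubles the wait before the next attempt, so a configmap that stays missing ("openstack-scripts" not found) is retried ever more slowly instead of hot-looping. A sketch of that bookkeeping; the 500ms initial delay and the two-minute cap are assumed defaults, not values taken from this log:

```go
package main

import (
	"fmt"
	"time"
)

// backoff tracks the retry state kept per pending volume operation.
type backoff struct {
	delay      time.Duration // current durationBeforeRetry
	retryAfter time.Time     // "no retries permitted until" this instant
}

// fail doubles the delay on every consecutive error, starting at 500ms and
// capped at roughly two minutes (assumed constants).
func (b *backoff) fail(now time.Time) {
	if b.delay == 0 {
		b.delay = 500 * time.Millisecond
	} else {
		b.delay *= 2
	}
	if cap := 2 * time.Minute; b.delay > cap {
		b.delay = cap
	}
	b.retryAfter = now.Add(b.delay)
}

func main() {
	var b backoff
	now := time.Now()
	for i := 0; i < 7; i++ { // seven consecutive "configmap not found" failures
		b.fail(now)
	}
	// Prints a 32s delay, matching "(durationBeforeRetry 32s)" in the log.
	fmt.Printf("no retries permitted until %s (durationBeforeRetry %s)\n",
		b.retryAfter.Format(time.RFC3339), b.delay)
}
```
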
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.797624 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc4b1e45-87c0-471f-beb9-70bce6933d47-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:38 crc kubenswrapper[4756]: I1205 16:41:38.797654 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kflx7\" (UniqueName: \"kubernetes.io/projected/cc4b1e45-87c0-471f-beb9-70bce6933d47-kube-api-access-kflx7\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.016871 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5c6b-account-delete-nhd8d" event={"ID":"3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e","Type":"ContainerDied","Data":"4d8ef07e1a07f5358ac394401715b8fe4e221ba774cd2b748b15afd94fa0426f"} Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.016928 4756 scope.go:117] "RemoveContainer" containerID="7d4e4c4996c6095e55a9528af5c2a29da4aefbcb8aa7c785ffbdee54592a58ca" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.017043 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement5c6b-account-delete-nhd8d" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.022684 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0bc19-account-delete-xlr4r" event={"ID":"cc4b1e45-87c0-471f-beb9-70bce6933d47","Type":"ContainerDied","Data":"5252e226fa698ec8d62703bad74f73c22f2f2fca1bbcc716bbaed92366640a9b"} Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.022745 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0bc19-account-delete-xlr4r" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.025159 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi58bc-account-delete-cdg92" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.025157 4756 generic.go:334] "Generic (PLEG): container finished" podID="2586474d-e368-428a-a86b-1ec95b8006a7" containerID="2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e" exitCode=137 Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.025181 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi58bc-account-delete-cdg92" event={"ID":"2586474d-e368-428a-a86b-1ec95b8006a7","Type":"ContainerDied","Data":"2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e"} Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.025404 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi58bc-account-delete-cdg92" event={"ID":"2586474d-e368-428a-a86b-1ec95b8006a7","Type":"ContainerDied","Data":"50313c8fcba529cbbb960ae7922f99f11e92fc811b0e97a85eb316cc9cac36bd"} Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.049458 4756 scope.go:117] "RemoveContainer" containerID="5ce2c044a1e7f24fb0d011f65c18d5af43c0921735e6ef864f501fb53bd366b8" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.052868 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement5c6b-account-delete-nhd8d"] Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.065182 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement5c6b-account-delete-nhd8d"] Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.080778 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi58bc-account-delete-cdg92"] Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.080963 4756 scope.go:117] "RemoveContainer" containerID="2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.092305 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi58bc-account-delete-cdg92"] Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.099239 4756 scope.go:117] "RemoveContainer" containerID="2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e" Dec 05 16:41:39 crc kubenswrapper[4756]: E1205 16:41:39.100874 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e\": container with ID starting with 2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e not found: ID does not exist" containerID="2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.100914 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e"} err="failed to get container status \"2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e\": rpc error: code = NotFound desc = could not find container \"2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e\": container with ID starting with 2ef947f383c65754f97816358b40e2b97ba1e5bdbdd958871bbce8c452cb3b0e not found: ID does not exist" Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.104411 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0bc19-account-delete-xlr4r"] Dec 05 16:41:39 crc kubenswrapper[4756]: I1205 16:41:39.110786 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/novacell0bc19-account-delete-xlr4r"] Dec 05 16:41:40 crc kubenswrapper[4756]: I1205 16:41:40.059486 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2586474d-e368-428a-a86b-1ec95b8006a7" path="/var/lib/kubelet/pods/2586474d-e368-428a-a86b-1ec95b8006a7/volumes" Dec 05 16:41:40 crc kubenswrapper[4756]: I1205 16:41:40.059961 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e" path="/var/lib/kubelet/pods/3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e/volumes" Dec 05 16:41:40 crc kubenswrapper[4756]: I1205 16:41:40.060445 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4b1e45-87c0-471f-beb9-70bce6933d47" path="/var/lib/kubelet/pods/cc4b1e45-87c0-471f-beb9-70bce6933d47/volumes" Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.055629 4756 generic.go:334] "Generic (PLEG): container finished" podID="37e233df-e9ab-45c9-a335-f1182a3b8c79" containerID="693ab2a387b6a0c295bf6ab3d9d8b6730de073c73bc06fccbb3c0097dd28ae4a" exitCode=137 Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.057398 4756 generic.go:334] "Generic (PLEG): container finished" podID="9639f2b9-2b61-4ab9-ab94-6c54d373662f" containerID="d01e9052eadb9b19c59bb451fdb0fec97a060cfb91308a4e5a61cd10b153475f" exitCode=137 Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.059451 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronbc5a-account-delete-wtsjj" event={"ID":"37e233df-e9ab-45c9-a335-f1182a3b8c79","Type":"ContainerDied","Data":"693ab2a387b6a0c295bf6ab3d9d8b6730de073c73bc06fccbb3c0097dd28ae4a"} Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.059482 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc3ab-account-delete-ndr77" event={"ID":"9639f2b9-2b61-4ab9-ab94-6c54d373662f","Type":"ContainerDied","Data":"d01e9052eadb9b19c59bb451fdb0fec97a060cfb91308a4e5a61cd10b153475f"} Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.471005 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.550566 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts\") pod \"37e233df-e9ab-45c9-a335-f1182a3b8c79\" (UID: \"37e233df-e9ab-45c9-a335-f1182a3b8c79\") " Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.551364 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37e233df-e9ab-45c9-a335-f1182a3b8c79" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.652027 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkpmp\" (UniqueName: \"kubernetes.io/projected/37e233df-e9ab-45c9-a335-f1182a3b8c79-kube-api-access-fkpmp\") pod \"37e233df-e9ab-45c9-a335-f1182a3b8c79\" (UID: \"37e233df-e9ab-45c9-a335-f1182a3b8c79\") " Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.652329 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e233df-e9ab-45c9-a335-f1182a3b8c79-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.663698 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e233df-e9ab-45c9-a335-f1182a3b8c79-kube-api-access-fkpmp" (OuterVolumeSpecName: "kube-api-access-fkpmp") pod "37e233df-e9ab-45c9-a335-f1182a3b8c79" (UID: "37e233df-e9ab-45c9-a335-f1182a3b8c79"). InnerVolumeSpecName "kube-api-access-fkpmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.754940 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkpmp\" (UniqueName: \"kubernetes.io/projected/37e233df-e9ab-45c9-a335-f1182a3b8c79-kube-api-access-fkpmp\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.793314 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.958141 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts\") pod \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\" (UID: \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\") " Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.958268 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvhcn\" (UniqueName: \"kubernetes.io/projected/9639f2b9-2b61-4ab9-ab94-6c54d373662f-kube-api-access-tvhcn\") pod \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\" (UID: \"9639f2b9-2b61-4ab9-ab94-6c54d373662f\") " Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.958637 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9639f2b9-2b61-4ab9-ab94-6c54d373662f" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:41:42 crc kubenswrapper[4756]: I1205 16:41:42.960533 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9639f2b9-2b61-4ab9-ab94-6c54d373662f-kube-api-access-tvhcn" (OuterVolumeSpecName: "kube-api-access-tvhcn") pod "9639f2b9-2b61-4ab9-ab94-6c54d373662f" (UID: "9639f2b9-2b61-4ab9-ab94-6c54d373662f"). InnerVolumeSpecName "kube-api-access-tvhcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.060188 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvhcn\" (UniqueName: \"kubernetes.io/projected/9639f2b9-2b61-4ab9-ab94-6c54d373662f-kube-api-access-tvhcn\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.060216 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9639f2b9-2b61-4ab9-ab94-6c54d373662f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.067141 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanc3ab-account-delete-ndr77" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.067122 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc3ab-account-delete-ndr77" event={"ID":"9639f2b9-2b61-4ab9-ab94-6c54d373662f","Type":"ContainerDied","Data":"c941a18a891c805ed027d232a36e4d9d229670f42069ad4c59c6b942b30bd1af"} Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.067215 4756 scope.go:117] "RemoveContainer" containerID="d01e9052eadb9b19c59bb451fdb0fec97a060cfb91308a4e5a61cd10b153475f" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.070307 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronbc5a-account-delete-wtsjj" event={"ID":"37e233df-e9ab-45c9-a335-f1182a3b8c79","Type":"ContainerDied","Data":"2bb632277c50d0b113d70b858b3c484f729dd254b7423cf3b9d15aa508e4c585"} Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.070385 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronbc5a-account-delete-wtsjj" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.110163 4756 scope.go:117] "RemoveContainer" containerID="693ab2a387b6a0c295bf6ab3d9d8b6730de073c73bc06fccbb3c0097dd28ae4a" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.111464 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanc3ab-account-delete-ndr77"] Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.118493 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicanc3ab-account-delete-ndr77"] Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.125232 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronbc5a-account-delete-wtsjj"] Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.145646 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutronbc5a-account-delete-wtsjj"] Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.289576 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.289927 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:43 crc kubenswrapper[4756]: I1205 16:41:43.333189 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:44 crc kubenswrapper[4756]: I1205 16:41:44.057923 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e233df-e9ab-45c9-a335-f1182a3b8c79" path="/var/lib/kubelet/pods/37e233df-e9ab-45c9-a335-f1182a3b8c79/volumes" Dec 05 16:41:44 crc kubenswrapper[4756]: I1205 16:41:44.058518 4756 
Dec 05 16:41:44 crc kubenswrapper[4756]: I1205 16:41:44.118403 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dlxfd"
Dec 05 16:41:44 crc kubenswrapper[4756]: I1205 16:41:44.157630 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlxfd"]
Dec 05 16:41:46 crc kubenswrapper[4756]: I1205 16:41:46.104068 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dlxfd" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerName="registry-server" containerID="cri-o://0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789" gracePeriod=2
Dec 05 16:41:46 crc kubenswrapper[4756]: I1205 16:41:46.999180 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlxfd"
Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.014157 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-utilities\") pod \"deb9672a-8f10-4836-9ddf-cb4f47674257\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") "
Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.014235 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7lcm\" (UniqueName: \"kubernetes.io/projected/deb9672a-8f10-4836-9ddf-cb4f47674257-kube-api-access-p7lcm\") pod \"deb9672a-8f10-4836-9ddf-cb4f47674257\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") "
Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.014267 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-catalog-content\") pod \"deb9672a-8f10-4836-9ddf-cb4f47674257\" (UID: \"deb9672a-8f10-4836-9ddf-cb4f47674257\") "
Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.019245 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-utilities" (OuterVolumeSpecName: "utilities") pod "deb9672a-8f10-4836-9ddf-cb4f47674257" (UID: "deb9672a-8f10-4836-9ddf-cb4f47674257"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.030665 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb9672a-8f10-4836-9ddf-cb4f47674257-kube-api-access-p7lcm" (OuterVolumeSpecName: "kube-api-access-p7lcm") pod "deb9672a-8f10-4836-9ddf-cb4f47674257" (UID: "deb9672a-8f10-4836-9ddf-cb4f47674257"). InnerVolumeSpecName "kube-api-access-p7lcm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.042626 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deb9672a-8f10-4836-9ddf-cb4f47674257" (UID: "deb9672a-8f10-4836-9ddf-cb4f47674257"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
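
"Killing container with a grace period ... gracePeriod=2" above is the standard two-phase stop: the runtime sends SIGTERM, waits up to the grace period for the process to exit, and only then escalates to SIGKILL. A self-contained sketch of that sequencing, with the actual signal delivery stubbed out:

```go
package main

import (
	"fmt"
	"time"
)

// stopContainer mimics the two-phase kill: SIGTERM first, then SIGKILL if the
// container has not exited once the grace period elapses.
func stopContainer(id string, grace time.Duration, exited <-chan struct{}) {
	fmt.Printf("sending SIGTERM to %s (grace %s)\n", id[:12], grace)
	select {
	case <-exited:
		fmt.Println("container exited within the grace period")
	case <-time.After(grace):
		fmt.Printf("grace period expired, sending SIGKILL to %s\n", id[:12])
	}
}

func main() {
	exited := make(chan struct{})
	go func() { // simulate registry-server shutting down promptly on SIGTERM
		time.Sleep(100 * time.Millisecond)
		close(exited)
	}()
	stopContainer("0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789",
		2*time.Second, exited)
}
```
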
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.116402 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.116452 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7lcm\" (UniqueName: \"kubernetes.io/projected/deb9672a-8f10-4836-9ddf-cb4f47674257-kube-api-access-p7lcm\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.116465 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9672a-8f10-4836-9ddf-cb4f47674257-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.120527 4756 generic.go:334] "Generic (PLEG): container finished" podID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerID="0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789" exitCode=0 Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.120565 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlxfd" event={"ID":"deb9672a-8f10-4836-9ddf-cb4f47674257","Type":"ContainerDied","Data":"0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789"} Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.120598 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlxfd" event={"ID":"deb9672a-8f10-4836-9ddf-cb4f47674257","Type":"ContainerDied","Data":"862e0280c297dfe8a98622808975300a193d7ab1ea449eba14adaed6bc8f988d"} Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.120619 4756 scope.go:117] "RemoveContainer" containerID="0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.120619 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlxfd" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.144830 4756 scope.go:117] "RemoveContainer" containerID="1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.190341 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlxfd"] Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.194698 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlxfd"] Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.198688 4756 scope.go:117] "RemoveContainer" containerID="3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.225382 4756 scope.go:117] "RemoveContainer" containerID="0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789" Dec 05 16:41:47 crc kubenswrapper[4756]: E1205 16:41:47.226298 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789\": container with ID starting with 0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789 not found: ID does not exist" containerID="0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.226362 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789"} err="failed to get container status \"0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789\": rpc error: code = NotFound desc = could not find container \"0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789\": container with ID starting with 0553069605af85e6fd239ae2b71b889953cd025cbd793435850e39cef8ef1789 not found: ID does not exist" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.226396 4756 scope.go:117] "RemoveContainer" containerID="1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344" Dec 05 16:41:47 crc kubenswrapper[4756]: E1205 16:41:47.226920 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344\": container with ID starting with 1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344 not found: ID does not exist" containerID="1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.227021 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344"} err="failed to get container status \"1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344\": rpc error: code = NotFound desc = could not find container \"1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344\": container with ID starting with 1197ee1ba72354c23f47f610efdff905deabd24c505c1115de3497d83894e344 not found: ID does not exist" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.227093 4756 scope.go:117] "RemoveContainer" containerID="3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08" Dec 05 16:41:47 crc kubenswrapper[4756]: E1205 16:41:47.227443 4756 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08\": container with ID starting with 3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08 not found: ID does not exist" containerID="3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08" Dec 05 16:41:47 crc kubenswrapper[4756]: I1205 16:41:47.227478 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08"} err="failed to get container status \"3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08\": rpc error: code = NotFound desc = could not find container \"3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08\": container with ID starting with 3d1d27288c158cf92845dfc075394689381afda8cf43d822241e019857c37d08 not found: ID does not exist" Dec 05 16:41:48 crc kubenswrapper[4756]: I1205 16:41:48.067159 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" path="/var/lib/kubelet/pods/deb9672a-8f10-4836-9ddf-cb4f47674257/volumes" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.394480 4756 scope.go:117] "RemoveContainer" containerID="ffe12582c087f36dcf1ccc4b2569e203bbdd0e871d37f6f302894e91b850d9b4" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.421660 4756 scope.go:117] "RemoveContainer" containerID="2371130b7e39c0f952bc7b9336044db2e136ebedd7f9a015ed4eb6b6db00bd32" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.477403 4756 scope.go:117] "RemoveContainer" containerID="41db2bc934dd1d8f5c3deaf39234aa1d852019a880bf3e665b6803cc5b7d2923" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.501085 4756 scope.go:117] "RemoveContainer" containerID="30f7d07e5b70e1f2d721b39fa20de30dcc105da791b7ef1790d9c73e0f8de457" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.528917 4756 scope.go:117] "RemoveContainer" containerID="732c7bc429bae861433213544fb19f3e8c9cfa3b827a04b2bdee69fe3f36c4a2" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.552367 4756 scope.go:117] "RemoveContainer" containerID="57244970037b08a10bfce5d09153a60d6d4d0743e6dcb260784dd10814796f07" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.578844 4756 scope.go:117] "RemoveContainer" containerID="772b002d40245f8da0b5197a01d0e6ef213b4ad63d84975dc5b67c4131b51e39" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.611204 4756 scope.go:117] "RemoveContainer" containerID="0d4580a5393a54bbc3f4d1d132fb4626624aacc2b800fb18a8d15e491de91652" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.634740 4756 scope.go:117] "RemoveContainer" containerID="d11db3e82fa52daef7384c7f1d126a686291ca3143db5866bd59dc3cfdbcbffd" Dec 05 16:41:54 crc kubenswrapper[4756]: I1205 16:41:54.655142 4756 scope.go:117] "RemoveContainer" containerID="db2c602af5ae592e04d0c4437ed7ec9413a4df6cdb793118baacd952ab4a6512" Dec 05 16:42:22 crc kubenswrapper[4756]: I1205 16:42:22.295559 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:42:22 crc kubenswrapper[4756]: I1205 16:42:22.296158 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:42:52 crc kubenswrapper[4756]: I1205 16:42:52.296061 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:42:52 crc kubenswrapper[4756]: I1205 16:42:52.296653 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.187018 4756 scope.go:117] "RemoveContainer" containerID="622296c3958d5d7a5843eb02cf0aa27222ca463b3056cce2ed68e7255ddd8f4b" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.215529 4756 scope.go:117] "RemoveContainer" containerID="fb8ac0de6e9797d2428d2cbdde22697f86aebba8e11f73aecb75abd983f806ec" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.256186 4756 scope.go:117] "RemoveContainer" containerID="bc3fc7814a08ae99eb6fd057cff3a1942e88941dafc3edd5ec0acce3c68e60f0" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.278189 4756 scope.go:117] "RemoveContainer" containerID="97ff50d7a20d09b9ee422d55325eb627b6df125b1520eb972aebd163564bccb6" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.308466 4756 scope.go:117] "RemoveContainer" containerID="08a4ee5938edef2174f98c8d125df245ebb94504859012da9b9eee42e3708693" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.332070 4756 scope.go:117] "RemoveContainer" containerID="fe04514fb7986a3917fe465e5c61beebbacdaf0ec3432b12272d4701aa5b6d79" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.351388 4756 scope.go:117] "RemoveContainer" containerID="7c27b557cb4a78e84801969e7ecd598a80c88010e59d4e2e8a5bcae978626017" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.387638 4756 scope.go:117] "RemoveContainer" containerID="b80c949a7e62ec7be4eed3cd8082a46205bea1efe05fb041a6260912f1b9e6aa" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.404112 4756 scope.go:117] "RemoveContainer" containerID="935c0bc93eae0dc34c3b477543cb9ee593f85515f0bc2d07d1730561667fd233" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.432721 4756 scope.go:117] "RemoveContainer" containerID="f3c6cab0ce8d40ac7ad080640579b50975b38fa2de3b98eba3f1ed047e946240" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.458633 4756 scope.go:117] "RemoveContainer" containerID="ec5c3ff55792fd4b79770c090cc741ae3724e644f600d45e694289adcf92843e" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.498913 4756 scope.go:117] "RemoveContainer" containerID="b2c09374299b8b9e5c8557a6143462bf42ba4b6ac57d54c21e967011eebb395e" Dec 05 16:42:55 crc kubenswrapper[4756]: I1205 16:42:55.528872 4756 scope.go:117] "RemoveContainer" containerID="01bd2e3d0a47c24809786f02ba0d948624a24bcd0220147dcc5a098acfd29827" Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.295977 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.296530 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.296575 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.297193 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.297246 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" gracePeriod=600 Dec 05 16:43:22 crc kubenswrapper[4756]: E1205 16:43:22.418467 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.996256 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" exitCode=0 Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.996328 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"} Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.996713 4756 scope.go:117] "RemoveContainer" containerID="4d4a53385c8aee53151ea619736690865a1fdc90406e455c625f67e894ce004c" Dec 05 16:43:22 crc kubenswrapper[4756]: I1205 16:43:22.997647 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:43:22 crc kubenswrapper[4756]: E1205 16:43:22.998363 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:43:38 crc kubenswrapper[4756]: I1205 16:43:38.050842 4756 scope.go:117] 
"RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:43:38 crc kubenswrapper[4756]: E1205 16:43:38.051837 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:43:48 crc kubenswrapper[4756]: I1205 16:43:48.314085 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 16:43:48 crc kubenswrapper[4756]: I1205 16:43:48.315360 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 16:43:53 crc kubenswrapper[4756]: I1205 16:43:53.050038 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:43:53 crc kubenswrapper[4756]: E1205 16:43:53.051101 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:43:55 crc kubenswrapper[4756]: I1205 16:43:55.746619 4756 scope.go:117] "RemoveContainer" containerID="2e051bfca4c328a46627298d5d4b9c579dfd75f24de4f9f86c2754e3d192c184" Dec 05 16:43:55 crc kubenswrapper[4756]: I1205 16:43:55.789868 4756 scope.go:117] "RemoveContainer" containerID="871db04e55a28c57dd3290df7567c594c5863df0a7288c54f7126b691f5b411f" Dec 05 16:43:55 crc kubenswrapper[4756]: I1205 16:43:55.823098 4756 scope.go:117] "RemoveContainer" containerID="03919cc24c81be7e09a5b0d4242745d52ccadb053c029fd639abe9d7ba68719a" Dec 05 16:43:55 crc kubenswrapper[4756]: I1205 16:43:55.853212 4756 scope.go:117] "RemoveContainer" containerID="4da7d739364a7736ae954751f02c3f5f707192b39710f85ec3c47bc1b3a59586" Dec 05 16:44:04 crc kubenswrapper[4756]: I1205 16:44:04.050189 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:44:04 crc kubenswrapper[4756]: E1205 16:44:04.050970 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:44:18 crc kubenswrapper[4756]: I1205 16:44:18.050502 4756 scope.go:117] "RemoveContainer" 
containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:44:18 crc kubenswrapper[4756]: E1205 16:44:18.051713 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:44:32 crc kubenswrapper[4756]: I1205 16:44:32.049254 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:44:32 crc kubenswrapper[4756]: E1205 16:44:32.050256 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:44:45 crc kubenswrapper[4756]: I1205 16:44:45.050712 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:44:45 crc kubenswrapper[4756]: E1205 16:44:45.051391 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:44:55 crc kubenswrapper[4756]: I1205 16:44:55.987071 4756 scope.go:117] "RemoveContainer" containerID="ee842d9afcdec61635c6c49c87079fe579d522f6aece80d8d0ea7d52afa84d53" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.015064 4756 scope.go:117] "RemoveContainer" containerID="8e8b38530a288e65246e8e282b7de364102d9ead1db22ed0d2d5b5a3b7cad231" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.042009 4756 scope.go:117] "RemoveContainer" containerID="0613a7ab5a963b46b6c293bb87644a995d68eed3117fa15ec6230d11c8e9c821" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.067764 4756 scope.go:117] "RemoveContainer" containerID="2d0048989462b0409ebe8c9066cf5ebecd7f70bb52e8298125aa44f8f432d3f0" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.123599 4756 scope.go:117] "RemoveContainer" containerID="ce4700bb9c9d51e56f5b62af753192e92110be5ca1e9970ef1f21dbbdd1f826c" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.142739 4756 scope.go:117] "RemoveContainer" containerID="278c6bf1ce5c05a26a66ac4857f514a3ecfce374d4e7e0ce0f0f871d34afdb5e" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.163571 4756 scope.go:117] "RemoveContainer" containerID="25c6d91b70e3146a820e5c52b8f074d3f3bfefc49cb9b80d00c7a70aae08ff55" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.181370 4756 scope.go:117] "RemoveContainer" containerID="6d0787e75ae992dd2b3dc29096ca4f2b71204bd07d887523f0de467c4e4688d4" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.203665 4756 scope.go:117] "RemoveContainer" containerID="d327f9403ce4568b62d72fe9980a0432ff8cd23fd8d7e8702c031cc920a379de" Dec 05 16:44:56 crc 
kubenswrapper[4756]: I1205 16:44:56.227622 4756 scope.go:117] "RemoveContainer" containerID="e9cb97c846b6e2094cc7aac2cfe5bee6e90448725e33aa4f60ab7b3ad4926211" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.249473 4756 scope.go:117] "RemoveContainer" containerID="ff50efe465d0847be81b8c502e77c5ea6e01d546de46512db17a70f058fa77b7" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.270040 4756 scope.go:117] "RemoveContainer" containerID="9eb7c337bc0192f0bbdc6691c02794d365f0af44aebf7d0ed7d4e350deb56d23" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.288385 4756 scope.go:117] "RemoveContainer" containerID="748d893aca593248b6065e871080d3dc2da1e7446554b7f0e8a379e4ea7892be" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.311875 4756 scope.go:117] "RemoveContainer" containerID="a98732f8a9adea1e07b9136ead6f0a5c91c6e5e7c6bd79d6b9a321ff304a94b5" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.331252 4756 scope.go:117] "RemoveContainer" containerID="b1fdd4b0355b13fe70aff4ac0136e9ef9e527331dc56f1dbae9b13dfe8b9c5f1" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.349904 4756 scope.go:117] "RemoveContainer" containerID="b2292cd9b3a9b3b5abd290889833f8a1801c60128dcd06bb289b998128e70716" Dec 05 16:44:56 crc kubenswrapper[4756]: I1205 16:44:56.368328 4756 scope.go:117] "RemoveContainer" containerID="2924cfbec2bcd7342ee1c2869c60f831538193dcd182c8f17f207d9200a1fc4f" Dec 05 16:44:59 crc kubenswrapper[4756]: I1205 16:44:59.049468 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:44:59 crc kubenswrapper[4756]: E1205 16:44:59.049970 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.162990 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k"] Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163289 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163303 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-server" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163317 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9639f2b9-2b61-4ab9-ab94-6c54d373662f" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163323 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9639f2b9-2b61-4ab9-ab94-6c54d373662f" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163335 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-updater" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163341 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-updater" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163351 4756 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163357 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163366 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163371 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163384 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server-init" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163389 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server-init" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163399 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163404 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163433 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163438 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163447 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerName="extract-content" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163453 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerName="extract-content" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163465 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="swift-recon-cron" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163470 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="swift-recon-cron" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163477 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-updater" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163483 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-updater" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163491 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2586474d-e368-428a-a86b-1ec95b8006a7" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163496 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2586474d-e368-428a-a86b-1ec95b8006a7" containerName="mariadb-account-delete" Dec 05 
16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163502 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerName="registry-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163509 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerName="registry-server" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163522 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163528 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163806 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-reaper" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163811 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-reaper" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163825 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="rsync" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163832 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="rsync" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163843 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163851 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163861 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e233df-e9ab-45c9-a335-f1182a3b8c79" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163866 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e233df-e9ab-45c9-a335-f1182a3b8c79" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163878 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4b1e45-87c0-471f-beb9-70bce6933d47" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163883 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4b1e45-87c0-471f-beb9-70bce6933d47" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163920 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163925 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163938 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-expirer" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163944 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" 
containerName="object-expirer" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163957 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163964 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-server" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163972 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerName="extract-utilities" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.163979 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerName="extract-utilities" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.163993 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164000 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.164011 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164017 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: E1205 16:45:00.164034 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164039 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164166 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-expirer" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164175 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-updater" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164183 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164190 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164196 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="swift-recon-cron" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164210 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4b1e45-87c0-471f-beb9-70bce6933d47" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164217 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164227 4756 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-updater" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164236 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovs-vswitchd" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164245 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9639f2b9-2b61-4ab9-ab94-6c54d373662f" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164253 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb9672a-8f10-4836-9ddf-cb4f47674257" containerName="registry-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164260 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee7bff4-f0c9-4b82-a8ea-318f2e16ff4e" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164268 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-auditor" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164276 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164287 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164295 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e233df-e9ab-45c9-a335-f1182a3b8c79" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164305 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2586474d-e368-428a-a86b-1ec95b8006a7" containerName="mariadb-account-delete" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164312 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164321 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ff87cb-4d82-49d4-a4d9-55e104c9becb" containerName="ovsdb-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164329 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="account-reaper" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164335 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="container-server" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164342 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="rsync" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164348 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da6dea0-ecf9-4bec-9c85-2fddcc0fae40" containerName="object-replicator" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.164860 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.169106 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.177050 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.180398 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k"] Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.272001 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8f6798-be3d-499c-98fb-ab59439176ff-config-volume\") pod \"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.272049 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2h9n\" (UniqueName: \"kubernetes.io/projected/1c8f6798-be3d-499c-98fb-ab59439176ff-kube-api-access-j2h9n\") pod \"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.272215 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8f6798-be3d-499c-98fb-ab59439176ff-secret-volume\") pod \"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.373890 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2h9n\" (UniqueName: \"kubernetes.io/projected/1c8f6798-be3d-499c-98fb-ab59439176ff-kube-api-access-j2h9n\") pod \"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.373963 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8f6798-be3d-499c-98fb-ab59439176ff-secret-volume\") pod \"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.374035 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8f6798-be3d-499c-98fb-ab59439176ff-config-volume\") pod \"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.374828 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8f6798-be3d-499c-98fb-ab59439176ff-config-volume\") pod 
\"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.379386 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8f6798-be3d-499c-98fb-ab59439176ff-secret-volume\") pod \"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.400584 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2h9n\" (UniqueName: \"kubernetes.io/projected/1c8f6798-be3d-499c-98fb-ab59439176ff-kube-api-access-j2h9n\") pod \"collect-profiles-29415885-ptx2k\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.493694 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:00 crc kubenswrapper[4756]: I1205 16:45:00.926908 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k"] Dec 05 16:45:01 crc kubenswrapper[4756]: I1205 16:45:01.864638 4756 generic.go:334] "Generic (PLEG): container finished" podID="1c8f6798-be3d-499c-98fb-ab59439176ff" containerID="3cf229a4daba2cbc5b3f8d04c99533c9d834eeac3bcf221d39628223dd960629" exitCode=0 Dec 05 16:45:01 crc kubenswrapper[4756]: I1205 16:45:01.864723 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" event={"ID":"1c8f6798-be3d-499c-98fb-ab59439176ff","Type":"ContainerDied","Data":"3cf229a4daba2cbc5b3f8d04c99533c9d834eeac3bcf221d39628223dd960629"} Dec 05 16:45:01 crc kubenswrapper[4756]: I1205 16:45:01.866119 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" event={"ID":"1c8f6798-be3d-499c-98fb-ab59439176ff","Type":"ContainerStarted","Data":"52cb14f3a2da234c9660c22db547dca55e8e5129a6b8d1a4d7b7ee9d61d6ee87"} Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.199705 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.313779 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8f6798-be3d-499c-98fb-ab59439176ff-config-volume\") pod \"1c8f6798-be3d-499c-98fb-ab59439176ff\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.313897 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2h9n\" (UniqueName: \"kubernetes.io/projected/1c8f6798-be3d-499c-98fb-ab59439176ff-kube-api-access-j2h9n\") pod \"1c8f6798-be3d-499c-98fb-ab59439176ff\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.313937 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8f6798-be3d-499c-98fb-ab59439176ff-secret-volume\") pod \"1c8f6798-be3d-499c-98fb-ab59439176ff\" (UID: \"1c8f6798-be3d-499c-98fb-ab59439176ff\") " Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.314666 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8f6798-be3d-499c-98fb-ab59439176ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c8f6798-be3d-499c-98fb-ab59439176ff" (UID: "1c8f6798-be3d-499c-98fb-ab59439176ff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.320569 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c8f6798-be3d-499c-98fb-ab59439176ff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c8f6798-be3d-499c-98fb-ab59439176ff" (UID: "1c8f6798-be3d-499c-98fb-ab59439176ff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.320576 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8f6798-be3d-499c-98fb-ab59439176ff-kube-api-access-j2h9n" (OuterVolumeSpecName: "kube-api-access-j2h9n") pod "1c8f6798-be3d-499c-98fb-ab59439176ff" (UID: "1c8f6798-be3d-499c-98fb-ab59439176ff"). InnerVolumeSpecName "kube-api-access-j2h9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.416226 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8f6798-be3d-499c-98fb-ab59439176ff-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.416268 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8f6798-be3d-499c-98fb-ab59439176ff-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.416282 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2h9n\" (UniqueName: \"kubernetes.io/projected/1c8f6798-be3d-499c-98fb-ab59439176ff-kube-api-access-j2h9n\") on node \"crc\" DevicePath \"\"" Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.882266 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" event={"ID":"1c8f6798-be3d-499c-98fb-ab59439176ff","Type":"ContainerDied","Data":"52cb14f3a2da234c9660c22db547dca55e8e5129a6b8d1a4d7b7ee9d61d6ee87"} Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.882607 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52cb14f3a2da234c9660c22db547dca55e8e5129a6b8d1a4d7b7ee9d61d6ee87" Dec 05 16:45:03 crc kubenswrapper[4756]: I1205 16:45:03.882346 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k" Dec 05 16:45:10 crc kubenswrapper[4756]: I1205 16:45:10.053740 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:45:10 crc kubenswrapper[4756]: E1205 16:45:10.054582 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:45:22 crc kubenswrapper[4756]: I1205 16:45:22.049192 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:45:22 crc kubenswrapper[4756]: E1205 16:45:22.050012 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:45:33 crc kubenswrapper[4756]: I1205 16:45:33.049876 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:45:33 crc kubenswrapper[4756]: E1205 16:45:33.050578 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:45:44 crc kubenswrapper[4756]: I1205 16:45:44.050291 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:45:44 crc kubenswrapper[4756]: E1205 16:45:44.051164 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:45:56 crc kubenswrapper[4756]: I1205 16:45:56.565163 4756 scope.go:117] "RemoveContainer" containerID="e34299a73608f080fb60c718ee146253bc5828dfc856b17195372ea6a1daeb46" Dec 05 16:45:56 crc kubenswrapper[4756]: I1205 16:45:56.614658 4756 scope.go:117] "RemoveContainer" containerID="14a9c94a34d46580da78a252edc6fc101e1053fd75e54da245fe2728758527ee" Dec 05 16:45:56 crc kubenswrapper[4756]: I1205 16:45:56.645790 4756 scope.go:117] "RemoveContainer" containerID="3314201eb3f0a43ba4e986ea1cf96160fc430f0101df1aa8d597f6eafb4f4d90" Dec 05 16:45:56 crc kubenswrapper[4756]: I1205 16:45:56.665292 4756 scope.go:117] "RemoveContainer" containerID="7fc666336bac7d8a681a050c8fd16bd151411c4f9fcfe10381ada0984c4a3444" Dec 05 16:45:56 crc kubenswrapper[4756]: I1205 16:45:56.692335 4756 scope.go:117] "RemoveContainer" containerID="6ea8bf01cfeb60c15901502f17cd75838f7a07e04bcf6b23ccc5eae4f4053438" Dec 05 16:45:57 crc kubenswrapper[4756]: I1205 16:45:57.049754 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:45:57 crc kubenswrapper[4756]: E1205 16:45:57.050127 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:46:11 crc kubenswrapper[4756]: I1205 16:46:11.049175 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:46:11 crc kubenswrapper[4756]: E1205 16:46:11.050011 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:46:22 crc kubenswrapper[4756]: I1205 16:46:22.050324 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57" Dec 05 16:46:22 crc kubenswrapper[4756]: E1205 16:46:22.051132 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:46:35 crc kubenswrapper[4756]: I1205 16:46:35.050118 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:46:35 crc kubenswrapper[4756]: E1205 16:46:35.052389 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:46:48 crc kubenswrapper[4756]: I1205 16:46:48.050383 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:46:48 crc kubenswrapper[4756]: E1205 16:46:48.051164 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:46:56 crc kubenswrapper[4756]: I1205 16:46:56.808559 4756 scope.go:117] "RemoveContainer" containerID="fe8069b814e870c4e578733a24b86485d43263b291fd3dcc6e53ead33749cc50"
Dec 05 16:46:56 crc kubenswrapper[4756]: I1205 16:46:56.828294 4756 scope.go:117] "RemoveContainer" containerID="8c85479e2fe59de4f67537a2a6d08e3945b963ac89af5c01b49b2eb958a942e5"
Dec 05 16:46:56 crc kubenswrapper[4756]: I1205 16:46:56.842893 4756 scope.go:117] "RemoveContainer" containerID="8f9e222e4bcb63d553ca65dc2988c4b494ff464ecf438d7b80744af7ca09bdb8"
Dec 05 16:46:56 crc kubenswrapper[4756]: I1205 16:46:56.863336 4756 scope.go:117] "RemoveContainer" containerID="81f68066e72188559d31bb8d4837587fd7973789d7a3b9234955e55c86d39ce8"
Dec 05 16:46:56 crc kubenswrapper[4756]: I1205 16:46:56.902111 4756 scope.go:117] "RemoveContainer" containerID="52f8fd30a92226261f0f4bd124f526c8082068adbdd24e6aad126c08d81c0f83"
Dec 05 16:46:56 crc kubenswrapper[4756]: I1205 16:46:56.917737 4756 scope.go:117] "RemoveContainer" containerID="04391e895ac7c022c16d69dfd03bf0f8cab399a37640b4aebf07a28dfbf58d41"
Dec 05 16:47:00 crc kubenswrapper[4756]: I1205 16:47:00.052819 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:47:00 crc kubenswrapper[4756]: E1205 16:47:00.053401 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:47:11 crc kubenswrapper[4756]: I1205 16:47:11.049967 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:47:11 crc kubenswrapper[4756]: E1205 16:47:11.050803 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:47:23 crc kubenswrapper[4756]: I1205 16:47:23.049570 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:47:23 crc kubenswrapper[4756]: E1205 16:47:23.050217 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:47:34 crc kubenswrapper[4756]: I1205 16:47:34.050230 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:47:34 crc kubenswrapper[4756]: E1205 16:47:34.051084 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:47:47 crc kubenswrapper[4756]: I1205 16:47:47.049590 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:47:47 crc kubenswrapper[4756]: E1205 16:47:47.050372 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:47:56 crc kubenswrapper[4756]: I1205 16:47:56.993121 4756 scope.go:117] "RemoveContainer" containerID="7cbefc8071abfa1c8c1d0b2908585ff763fc9701a11ed995f99f21ef8c00a196"
Dec 05 16:47:57 crc kubenswrapper[4756]: I1205 16:47:57.018501 4756 scope.go:117] "RemoveContainer" containerID="b462dd63e42cfa5cb78440495d7faec00b0f9c211eb3ed5ddcb9403c412b08b7"
Dec 05 16:47:59 crc kubenswrapper[4756]: I1205 16:47:59.050774 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:47:59 crc kubenswrapper[4756]: E1205 16:47:59.051293 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:48:14 crc kubenswrapper[4756]: I1205 16:48:14.050014 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:48:14 crc kubenswrapper[4756]: E1205 16:48:14.050773 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 16:48:29 crc kubenswrapper[4756]: I1205 16:48:29.049762 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:48:29 crc kubenswrapper[4756]: I1205 16:48:29.565831 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"b08c122861a85300ba33993513b01f033b9489d5a5114fa7395b74161bc4057c"}
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.552920 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dbvh5"]
Dec 05 16:49:07 crc kubenswrapper[4756]: E1205 16:49:07.555303 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8f6798-be3d-499c-98fb-ab59439176ff" containerName="collect-profiles"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.555382 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8f6798-be3d-499c-98fb-ab59439176ff" containerName="collect-profiles"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.555608 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8f6798-be3d-499c-98fb-ab59439176ff" containerName="collect-profiles"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.556715 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.577573 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbvh5"]
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.657801 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-utilities\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.657875 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmzd2\" (UniqueName: \"kubernetes.io/projected/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-kube-api-access-hmzd2\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.657905 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-catalog-content\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.759268 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmzd2\" (UniqueName: \"kubernetes.io/projected/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-kube-api-access-hmzd2\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.759318 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-catalog-content\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.759433 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-utilities\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.760070 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-utilities\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.760223 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-catalog-content\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.776876 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmzd2\" (UniqueName: \"kubernetes.io/projected/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-kube-api-access-hmzd2\") pod \"redhat-operators-dbvh5\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") " pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:07 crc kubenswrapper[4756]: I1205 16:49:07.881815 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:08 crc kubenswrapper[4756]: I1205 16:49:08.160719 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbvh5"]
Dec 05 16:49:08 crc kubenswrapper[4756]: I1205 16:49:08.888006 4756 generic.go:334] "Generic (PLEG): container finished" podID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerID="b514875dc4ed45e71c351e3ffd309bd639b629c21c9262f74b0a372eef92ca08" exitCode=0
Dec 05 16:49:08 crc kubenswrapper[4756]: I1205 16:49:08.888048 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbvh5" event={"ID":"6150a9f9-b462-4cc3-b0b0-6f1f136680cd","Type":"ContainerDied","Data":"b514875dc4ed45e71c351e3ffd309bd639b629c21c9262f74b0a372eef92ca08"}
Dec 05 16:49:08 crc kubenswrapper[4756]: I1205 16:49:08.888071 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbvh5" event={"ID":"6150a9f9-b462-4cc3-b0b0-6f1f136680cd","Type":"ContainerStarted","Data":"fdf205ce638df7f73bc400f895d1a75923247715abc0d68502faa014a9c69148"}
Dec 05 16:49:08 crc kubenswrapper[4756]: I1205 16:49:08.891116 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 16:49:09 crc kubenswrapper[4756]: I1205 16:49:09.899046 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbvh5" event={"ID":"6150a9f9-b462-4cc3-b0b0-6f1f136680cd","Type":"ContainerStarted","Data":"016fe08d53ef827f9b8d770cfa1e45bdbe4d5afa7de7647062eb044d8775ba26"}
Dec 05 16:49:10 crc kubenswrapper[4756]: I1205 16:49:10.907236 4756 generic.go:334] "Generic (PLEG): container finished" podID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerID="016fe08d53ef827f9b8d770cfa1e45bdbe4d5afa7de7647062eb044d8775ba26" exitCode=0
Dec 05 16:49:10 crc kubenswrapper[4756]: I1205 16:49:10.907332 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbvh5" event={"ID":"6150a9f9-b462-4cc3-b0b0-6f1f136680cd","Type":"ContainerDied","Data":"016fe08d53ef827f9b8d770cfa1e45bdbe4d5afa7de7647062eb044d8775ba26"}
Dec 05 16:49:11 crc kubenswrapper[4756]: I1205 16:49:11.916514 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbvh5" event={"ID":"6150a9f9-b462-4cc3-b0b0-6f1f136680cd","Type":"ContainerStarted","Data":"22e85de31c496fd8e3912e001cd52f1ee35944e90c195043c18ddb781ad918a8"}
Dec 05 16:49:11 crc kubenswrapper[4756]: I1205 16:49:11.936710 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dbvh5" podStartSLOduration=2.515861518 podStartE2EDuration="4.936692231s" podCreationTimestamp="2025-12-05 16:49:07 +0000 UTC" firstStartedPulling="2025-12-05 16:49:08.890806583 +0000 UTC m=+2009.129480095" lastFinishedPulling="2025-12-05 16:49:11.311637276 +0000 UTC m=+2011.550310808" observedRunningTime="2025-12-05 16:49:11.932784726 +0000 UTC m=+2012.171458248" watchObservedRunningTime="2025-12-05 16:49:11.936692231 +0000 UTC m=+2012.175365743"
Dec 05 16:49:17 crc kubenswrapper[4756]: I1205 16:49:17.882130 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:17 crc kubenswrapper[4756]: I1205 16:49:17.882775 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:17 crc kubenswrapper[4756]: I1205 16:49:17.948722 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:18 crc kubenswrapper[4756]: I1205 16:49:18.016245 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:18 crc kubenswrapper[4756]: I1205 16:49:18.190452 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbvh5"]
Dec 05 16:49:19 crc kubenswrapper[4756]: I1205 16:49:19.975816 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dbvh5" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerName="registry-server" containerID="cri-o://22e85de31c496fd8e3912e001cd52f1ee35944e90c195043c18ddb781ad918a8" gracePeriod=2
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.000759 4756 generic.go:334] "Generic (PLEG): container finished" podID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerID="22e85de31c496fd8e3912e001cd52f1ee35944e90c195043c18ddb781ad918a8" exitCode=0
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.000816 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbvh5" event={"ID":"6150a9f9-b462-4cc3-b0b0-6f1f136680cd","Type":"ContainerDied","Data":"22e85de31c496fd8e3912e001cd52f1ee35944e90c195043c18ddb781ad918a8"}
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.092667 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.194714 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-catalog-content\") pod \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") "
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.194784 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-utilities\") pod \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") "
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.194907 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmzd2\" (UniqueName: \"kubernetes.io/projected/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-kube-api-access-hmzd2\") pod \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\" (UID: \"6150a9f9-b462-4cc3-b0b0-6f1f136680cd\") "
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.195969 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-utilities" (OuterVolumeSpecName: "utilities") pod "6150a9f9-b462-4cc3-b0b0-6f1f136680cd" (UID: "6150a9f9-b462-4cc3-b0b0-6f1f136680cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.209770 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-kube-api-access-hmzd2" (OuterVolumeSpecName: "kube-api-access-hmzd2") pod "6150a9f9-b462-4cc3-b0b0-6f1f136680cd" (UID: "6150a9f9-b462-4cc3-b0b0-6f1f136680cd"). InnerVolumeSpecName "kube-api-access-hmzd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.297232 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.297270 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmzd2\" (UniqueName: \"kubernetes.io/projected/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-kube-api-access-hmzd2\") on node \"crc\" DevicePath \"\""
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.315871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6150a9f9-b462-4cc3-b0b0-6f1f136680cd" (UID: "6150a9f9-b462-4cc3-b0b0-6f1f136680cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:49:23 crc kubenswrapper[4756]: I1205 16:49:23.398402 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6150a9f9-b462-4cc3-b0b0-6f1f136680cd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:49:24 crc kubenswrapper[4756]: I1205 16:49:24.011470 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbvh5" event={"ID":"6150a9f9-b462-4cc3-b0b0-6f1f136680cd","Type":"ContainerDied","Data":"fdf205ce638df7f73bc400f895d1a75923247715abc0d68502faa014a9c69148"}
Dec 05 16:49:24 crc kubenswrapper[4756]: I1205 16:49:24.011559 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbvh5"
Dec 05 16:49:24 crc kubenswrapper[4756]: I1205 16:49:24.012503 4756 scope.go:117] "RemoveContainer" containerID="22e85de31c496fd8e3912e001cd52f1ee35944e90c195043c18ddb781ad918a8"
Dec 05 16:49:24 crc kubenswrapper[4756]: I1205 16:49:24.042711 4756 scope.go:117] "RemoveContainer" containerID="016fe08d53ef827f9b8d770cfa1e45bdbe4d5afa7de7647062eb044d8775ba26"
Dec 05 16:49:24 crc kubenswrapper[4756]: I1205 16:49:24.067485 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbvh5"]
Dec 05 16:49:24 crc kubenswrapper[4756]: I1205 16:49:24.067631 4756 scope.go:117] "RemoveContainer" containerID="b514875dc4ed45e71c351e3ffd309bd639b629c21c9262f74b0a372eef92ca08"
Dec 05 16:49:24 crc kubenswrapper[4756]: I1205 16:49:24.091080 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dbvh5"]
Dec 05 16:49:26 crc kubenswrapper[4756]: I1205 16:49:26.062107 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" path="/var/lib/kubelet/pods/6150a9f9-b462-4cc3-b0b0-6f1f136680cd/volumes"
Dec 05 16:50:52 crc kubenswrapper[4756]: I1205 16:50:52.295874 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:50:52 crc kubenswrapper[4756]: I1205 16:50:52.296643 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.424809 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kdspk"]
Dec 05 16:50:54 crc kubenswrapper[4756]: E1205 16:50:54.425169 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerName="extract-content"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.425186 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerName="extract-content"
Dec 05 16:50:54 crc kubenswrapper[4756]: E1205 16:50:54.425203 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerName="extract-utilities"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.425214 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerName="extract-utilities"
Dec 05 16:50:54 crc kubenswrapper[4756]: E1205 16:50:54.425251 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerName="registry-server"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.425261 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerName="registry-server"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.425453 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6150a9f9-b462-4cc3-b0b0-6f1f136680cd" containerName="registry-server"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.426771 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.483354 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdspk"]
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.554665 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqzxj\" (UniqueName: \"kubernetes.io/projected/f60b5696-48f2-4e34-ad74-e89284077c6d-kube-api-access-xqzxj\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.555019 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-catalog-content\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.555171 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-utilities\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.656634 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqzxj\" (UniqueName: \"kubernetes.io/projected/f60b5696-48f2-4e34-ad74-e89284077c6d-kube-api-access-xqzxj\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.656998 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-catalog-content\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.657126 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-utilities\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.657542 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-catalog-content\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.657594 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-utilities\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.680689 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqzxj\" (UniqueName: \"kubernetes.io/projected/f60b5696-48f2-4e34-ad74-e89284077c6d-kube-api-access-xqzxj\") pod \"certified-operators-kdspk\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") " pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:54 crc kubenswrapper[4756]: I1205 16:50:54.748793 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:50:55 crc kubenswrapper[4756]: I1205 16:50:55.280154 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdspk"]
Dec 05 16:50:55 crc kubenswrapper[4756]: I1205 16:50:55.723766 4756 generic.go:334] "Generic (PLEG): container finished" podID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerID="4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591" exitCode=0
Dec 05 16:50:55 crc kubenswrapper[4756]: I1205 16:50:55.723812 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdspk" event={"ID":"f60b5696-48f2-4e34-ad74-e89284077c6d","Type":"ContainerDied","Data":"4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591"}
Dec 05 16:50:55 crc kubenswrapper[4756]: I1205 16:50:55.723838 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdspk" event={"ID":"f60b5696-48f2-4e34-ad74-e89284077c6d","Type":"ContainerStarted","Data":"54222681eac66951dca6e388782fb01a60cf58c717d28fe3fbeb18ef42ec7285"}
Dec 05 16:50:56 crc kubenswrapper[4756]: I1205 16:50:56.733130 4756 generic.go:334] "Generic (PLEG): container finished" podID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerID="3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b" exitCode=0
Dec 05 16:50:56 crc kubenswrapper[4756]: I1205 16:50:56.733342 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdspk" event={"ID":"f60b5696-48f2-4e34-ad74-e89284077c6d","Type":"ContainerDied","Data":"3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b"}
Dec 05 16:50:57 crc kubenswrapper[4756]: I1205 16:50:57.742137 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdspk" event={"ID":"f60b5696-48f2-4e34-ad74-e89284077c6d","Type":"ContainerStarted","Data":"660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225"}
Dec 05 16:50:57 crc kubenswrapper[4756]: I1205 16:50:57.759550 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kdspk" podStartSLOduration=2.389362173 podStartE2EDuration="3.759528418s" podCreationTimestamp="2025-12-05 16:50:54 +0000 UTC" firstStartedPulling="2025-12-05 16:50:55.725354216 +0000 UTC m=+2115.964027718" lastFinishedPulling="2025-12-05 16:50:57.095520411 +0000 UTC m=+2117.334193963" observedRunningTime="2025-12-05 16:50:57.756149157 +0000 UTC m=+2117.994822689" watchObservedRunningTime="2025-12-05 16:50:57.759528418 +0000 UTC m=+2117.998201930"
Dec 05 16:51:04 crc kubenswrapper[4756]: I1205 16:51:04.749603 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:51:04 crc kubenswrapper[4756]: I1205 16:51:04.750192 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:51:04 crc kubenswrapper[4756]: I1205 16:51:04.797370 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:51:04 crc kubenswrapper[4756]: I1205 16:51:04.874329 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:51:05 crc kubenswrapper[4756]: I1205 16:51:05.030737 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdspk"]
Dec 05 16:51:06 crc kubenswrapper[4756]: I1205 16:51:06.850606 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kdspk" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerName="registry-server" containerID="cri-o://660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225" gracePeriod=2
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.751693 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.855679 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-catalog-content\") pod \"f60b5696-48f2-4e34-ad74-e89284077c6d\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") "
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.855729 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqzxj\" (UniqueName: \"kubernetes.io/projected/f60b5696-48f2-4e34-ad74-e89284077c6d-kube-api-access-xqzxj\") pod \"f60b5696-48f2-4e34-ad74-e89284077c6d\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") "
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.855827 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-utilities\") pod \"f60b5696-48f2-4e34-ad74-e89284077c6d\" (UID: \"f60b5696-48f2-4e34-ad74-e89284077c6d\") "
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.856849 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-utilities" (OuterVolumeSpecName: "utilities") pod "f60b5696-48f2-4e34-ad74-e89284077c6d" (UID: "f60b5696-48f2-4e34-ad74-e89284077c6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.859405 4756 generic.go:334] "Generic (PLEG): container finished" podID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerID="660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225" exitCode=0
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.859495 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdspk"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.859514 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdspk" event={"ID":"f60b5696-48f2-4e34-ad74-e89284077c6d","Type":"ContainerDied","Data":"660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225"}
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.859869 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdspk" event={"ID":"f60b5696-48f2-4e34-ad74-e89284077c6d","Type":"ContainerDied","Data":"54222681eac66951dca6e388782fb01a60cf58c717d28fe3fbeb18ef42ec7285"}
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.859891 4756 scope.go:117] "RemoveContainer" containerID="660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.860131 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f60b5696-48f2-4e34-ad74-e89284077c6d-kube-api-access-xqzxj" (OuterVolumeSpecName: "kube-api-access-xqzxj") pod "f60b5696-48f2-4e34-ad74-e89284077c6d" (UID: "f60b5696-48f2-4e34-ad74-e89284077c6d"). InnerVolumeSpecName "kube-api-access-xqzxj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.894092 4756 scope.go:117] "RemoveContainer" containerID="3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.912769 4756 scope.go:117] "RemoveContainer" containerID="4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.917217 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f60b5696-48f2-4e34-ad74-e89284077c6d" (UID: "f60b5696-48f2-4e34-ad74-e89284077c6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.941410 4756 scope.go:117] "RemoveContainer" containerID="660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225"
Dec 05 16:51:07 crc kubenswrapper[4756]: E1205 16:51:07.941933 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225\": container with ID starting with 660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225 not found: ID does not exist" containerID="660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.941964 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225"} err="failed to get container status \"660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225\": rpc error: code = NotFound desc = could not find container \"660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225\": container with ID starting with 660d65eef16d71dd6fe25a511b611fc98f0d6627db11e1147c7bfac1b5094225 not found: ID does not exist"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.941984 4756 scope.go:117] "RemoveContainer" containerID="3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b"
Dec 05 16:51:07 crc kubenswrapper[4756]: E1205 16:51:07.942377 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b\": container with ID starting with 3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b not found: ID does not exist" containerID="3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.942398 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b"} err="failed to get container status \"3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b\": rpc error: code = NotFound desc = could not find container \"3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b\": container with ID starting with 3df4328c9bc9b476e95e5a7d55b5e334bd0de6703507151e7a6dea55f234215b not found: ID does not exist"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.942425 4756 scope.go:117] "RemoveContainer" containerID="4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591"
Dec 05 16:51:07 crc kubenswrapper[4756]: E1205 16:51:07.942680 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591\": container with ID starting with 4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591 not found: ID does not exist" containerID="4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.942732 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591"} err="failed to get container status \"4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591\": rpc error: code = NotFound desc = could not find container \"4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591\": container with ID starting with 4789b90d2acb34905a65f87552e2c24d8b7392732e82eb2a82d76a893a760591 not found: ID does not exist"
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.957603 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.957627 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f60b5696-48f2-4e34-ad74-e89284077c6d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:51:07 crc kubenswrapper[4756]: I1205 16:51:07.957640 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqzxj\" (UniqueName: \"kubernetes.io/projected/f60b5696-48f2-4e34-ad74-e89284077c6d-kube-api-access-xqzxj\") on node \"crc\" DevicePath \"\""
Dec 05 16:51:08 crc kubenswrapper[4756]: I1205 16:51:08.181660 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdspk"]
Dec 05 16:51:08 crc kubenswrapper[4756]: I1205 16:51:08.187003 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kdspk"]
Dec 05 16:51:10 crc kubenswrapper[4756]: I1205 16:51:10.063491 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" path="/var/lib/kubelet/pods/f60b5696-48f2-4e34-ad74-e89284077c6d/volumes"
Dec 05 16:51:22 crc kubenswrapper[4756]: I1205 16:51:22.295983 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:51:22 crc kubenswrapper[4756]: I1205 16:51:22.296565 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:51:46 crc kubenswrapper[4756]: I1205 16:51:46.957344 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4s57k"]
Dec 05 16:51:46 crc kubenswrapper[4756]: E1205 16:51:46.958225 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerName="extract-utilities"
Dec 05 16:51:46 crc kubenswrapper[4756]: I1205 16:51:46.958241 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerName="extract-utilities"
Dec 05 16:51:46 crc kubenswrapper[4756]: E1205 16:51:46.958271 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerName="extract-content"
Dec 05 16:51:46 crc kubenswrapper[4756]: I1205 16:51:46.958279 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerName="extract-content"
Dec 05 16:51:46 crc kubenswrapper[4756]: E1205 16:51:46.958290 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerName="registry-server"
Dec 05 16:51:46 crc kubenswrapper[4756]: I1205 16:51:46.958299 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerName="registry-server"
Dec 05 16:51:46 crc kubenswrapper[4756]: I1205 16:51:46.958483 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f60b5696-48f2-4e34-ad74-e89284077c6d" containerName="registry-server"
Dec 05 16:51:46 crc kubenswrapper[4756]: I1205 16:51:46.959643 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.326805 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4s57k"]
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.392462 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-utilities\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.393696 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r8w8\" (UniqueName: \"kubernetes.io/projected/6c907408-b752-4e6c-9a11-99e8082be884-kube-api-access-6r8w8\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.394013 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-catalog-content\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.496922 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-catalog-content\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.497007 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-utilities\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.497059 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r8w8\" (UniqueName: \"kubernetes.io/projected/6c907408-b752-4e6c-9a11-99e8082be884-kube-api-access-6r8w8\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.497807 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-catalog-content\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.497917 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-utilities\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.518980 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r8w8\" (UniqueName: \"kubernetes.io/projected/6c907408-b752-4e6c-9a11-99e8082be884-kube-api-access-6r8w8\") pod \"community-operators-4s57k\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") " pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:47 crc kubenswrapper[4756]: I1205 16:51:47.581331 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:48 crc kubenswrapper[4756]: I1205 16:51:48.078956 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4s57k"]
Dec 05 16:51:48 crc kubenswrapper[4756]: I1205 16:51:48.346560 4756 generic.go:334] "Generic (PLEG): container finished" podID="6c907408-b752-4e6c-9a11-99e8082be884" containerID="004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045" exitCode=0
Dec 05 16:51:48 crc kubenswrapper[4756]: I1205 16:51:48.346670 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s57k" event={"ID":"6c907408-b752-4e6c-9a11-99e8082be884","Type":"ContainerDied","Data":"004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045"}
Dec 05 16:51:48 crc kubenswrapper[4756]: I1205 16:51:48.346886 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s57k" event={"ID":"6c907408-b752-4e6c-9a11-99e8082be884","Type":"ContainerStarted","Data":"78176b2b39999be441939db3e1cd2d7388c4e0c9d5829ba23477ef5fbb39143b"}
Dec 05 16:51:49 crc kubenswrapper[4756]: I1205 16:51:49.354940 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s57k" event={"ID":"6c907408-b752-4e6c-9a11-99e8082be884","Type":"ContainerStarted","Data":"49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119"}
Dec 05 16:51:49 crc kubenswrapper[4756]: I1205 16:51:49.953674 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6dmlf"]
Dec 05 16:51:49 crc kubenswrapper[4756]: I1205 16:51:49.956030 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:49 crc kubenswrapper[4756]: I1205 16:51:49.965453 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-catalog-content\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:49 crc kubenswrapper[4756]: I1205 16:51:49.965579 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-utilities\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:49 crc kubenswrapper[4756]: I1205 16:51:49.965603 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bphz\" (UniqueName: \"kubernetes.io/projected/f790cd3a-7a14-4d66-af22-c2474936e462-kube-api-access-7bphz\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:49 crc kubenswrapper[4756]: I1205 16:51:49.975528 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dmlf"]
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.067318 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-catalog-content\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.067463 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-utilities\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.067494 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bphz\" (UniqueName: \"kubernetes.io/projected/f790cd3a-7a14-4d66-af22-c2474936e462-kube-api-access-7bphz\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.068066 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-utilities\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.068362 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-catalog-content\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.086534 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bphz\" (UniqueName: \"kubernetes.io/projected/f790cd3a-7a14-4d66-af22-c2474936e462-kube-api-access-7bphz\") pod \"redhat-marketplace-6dmlf\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.276486 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.420223 4756 generic.go:334] "Generic (PLEG): container finished" podID="6c907408-b752-4e6c-9a11-99e8082be884" containerID="49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119" exitCode=0
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.420306 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s57k" event={"ID":"6c907408-b752-4e6c-9a11-99e8082be884","Type":"ContainerDied","Data":"49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119"}
Dec 05 16:51:50 crc kubenswrapper[4756]: I1205 16:51:50.750198 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dmlf"]
Dec 05 16:51:51 crc kubenswrapper[4756]: I1205 16:51:51.430595 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s57k" event={"ID":"6c907408-b752-4e6c-9a11-99e8082be884","Type":"ContainerStarted","Data":"02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e"}
Dec 05 16:51:51 crc kubenswrapper[4756]: I1205 16:51:51.434497 4756 generic.go:334] "Generic (PLEG): container finished" podID="f790cd3a-7a14-4d66-af22-c2474936e462" containerID="93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159" exitCode=0
Dec 05 16:51:51 crc kubenswrapper[4756]: I1205 16:51:51.434548 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dmlf" event={"ID":"f790cd3a-7a14-4d66-af22-c2474936e462","Type":"ContainerDied","Data":"93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159"}
Dec 05 16:51:51 crc kubenswrapper[4756]: I1205 16:51:51.434575 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dmlf" event={"ID":"f790cd3a-7a14-4d66-af22-c2474936e462","Type":"ContainerStarted","Data":"9157cb9e1f70f0cb3f9c9820acdd35441371d41ed03493a3fb5b777516250dbc"}
Dec 05 16:51:51 crc kubenswrapper[4756]: I1205 16:51:51.457859 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4s57k" podStartSLOduration=2.95746551 podStartE2EDuration="5.457837733s" podCreationTimestamp="2025-12-05 16:51:46 +0000 UTC" firstStartedPulling="2025-12-05 16:51:48.348922072 +0000 UTC m=+2168.587595584" lastFinishedPulling="2025-12-05 16:51:50.849294295 +0000 UTC m=+2171.087967807" observedRunningTime="2025-12-05 16:51:51.454749409 +0000 UTC m=+2171.693422931" watchObservedRunningTime="2025-12-05 16:51:51.457837733 +0000 UTC m=+2171.696511255"
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.295615 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.295711 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.295787 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.296762 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b08c122861a85300ba33993513b01f033b9489d5a5114fa7395b74161bc4057c"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.296834 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://b08c122861a85300ba33993513b01f033b9489d5a5114fa7395b74161bc4057c" gracePeriod=600
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.443459 4756 generic.go:334] "Generic (PLEG): container finished" podID="f790cd3a-7a14-4d66-af22-c2474936e462" containerID="62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2" exitCode=0
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.443829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dmlf" event={"ID":"f790cd3a-7a14-4d66-af22-c2474936e462","Type":"ContainerDied","Data":"62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2"}
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.448912 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="b08c122861a85300ba33993513b01f033b9489d5a5114fa7395b74161bc4057c" exitCode=0
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.449261 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"b08c122861a85300ba33993513b01f033b9489d5a5114fa7395b74161bc4057c"}
Dec 05 16:51:52 crc kubenswrapper[4756]: I1205 16:51:52.449354 4756 scope.go:117] "RemoveContainer" containerID="40046a9deaf4ed5c50f041e92e3e76c434ddfe745e09fffeedde7ecf38a28b57"
Dec 05 16:51:53 crc kubenswrapper[4756]: I1205 16:51:53.457531 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb"}
Dec 05 16:51:53 crc kubenswrapper[4756]: I1205 16:51:53.460217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dmlf" event={"ID":"f790cd3a-7a14-4d66-af22-c2474936e462","Type":"ContainerStarted","Data":"299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025"}
Dec 05 16:51:53 crc kubenswrapper[4756]: I1205 16:51:53.496700 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6dmlf" podStartSLOduration=3.103437613 podStartE2EDuration="4.496680332s" podCreationTimestamp="2025-12-05 16:51:49 +0000 UTC" firstStartedPulling="2025-12-05 16:51:51.439202699 +0000 UTC m=+2171.677876211" lastFinishedPulling="2025-12-05 16:51:52.832445418 +0000 UTC m=+2173.071118930" observedRunningTime="2025-12-05 16:51:53.493592359 +0000 UTC m=+2173.732265891" watchObservedRunningTime="2025-12-05 16:51:53.496680332 +0000 UTC m=+2173.735353844"
Dec 05 16:51:57 crc kubenswrapper[4756]: I1205 16:51:57.581527 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:57 crc kubenswrapper[4756]: I1205 16:51:57.582322 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:57 crc kubenswrapper[4756]: I1205 16:51:57.620889 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:58 crc kubenswrapper[4756]: I1205 16:51:58.546574 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:51:58 crc kubenswrapper[4756]: I1205 16:51:58.592928 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4s57k"]
Dec 05 16:52:00 crc kubenswrapper[4756]: I1205 16:52:00.277257 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:52:00 crc kubenswrapper[4756]: I1205 16:52:00.277577 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:52:00 crc kubenswrapper[4756]: I1205 16:52:00.341017 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:52:00 crc kubenswrapper[4756]: I1205 16:52:00.516893 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4s57k" podUID="6c907408-b752-4e6c-9a11-99e8082be884" containerName="registry-server" containerID="cri-o://02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e" gracePeriod=2
Dec 05 16:52:00 crc kubenswrapper[4756]: I1205 16:52:00.559394 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6dmlf"
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.391882 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.479263 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-catalog-content\") pod \"6c907408-b752-4e6c-9a11-99e8082be884\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") "
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.479386 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r8w8\" (UniqueName: \"kubernetes.io/projected/6c907408-b752-4e6c-9a11-99e8082be884-kube-api-access-6r8w8\") pod \"6c907408-b752-4e6c-9a11-99e8082be884\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") "
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.479532 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-utilities\") pod \"6c907408-b752-4e6c-9a11-99e8082be884\" (UID: \"6c907408-b752-4e6c-9a11-99e8082be884\") "
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.480627 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-utilities" (OuterVolumeSpecName: "utilities") pod "6c907408-b752-4e6c-9a11-99e8082be884" (UID: "6c907408-b752-4e6c-9a11-99e8082be884"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.486793 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c907408-b752-4e6c-9a11-99e8082be884-kube-api-access-6r8w8" (OuterVolumeSpecName: "kube-api-access-6r8w8") pod "6c907408-b752-4e6c-9a11-99e8082be884" (UID: "6c907408-b752-4e6c-9a11-99e8082be884"). InnerVolumeSpecName "kube-api-access-6r8w8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.527649 4756 generic.go:334] "Generic (PLEG): container finished" podID="6c907408-b752-4e6c-9a11-99e8082be884" containerID="02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e" exitCode=0
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.528267 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4s57k"
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.528549 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s57k" event={"ID":"6c907408-b752-4e6c-9a11-99e8082be884","Type":"ContainerDied","Data":"02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e"}
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.529954 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4s57k" event={"ID":"6c907408-b752-4e6c-9a11-99e8082be884","Type":"ContainerDied","Data":"78176b2b39999be441939db3e1cd2d7388c4e0c9d5829ba23477ef5fbb39143b"}
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.530007 4756 scope.go:117] "RemoveContainer" containerID="02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e"
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.541409 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dmlf"]
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.555094 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c907408-b752-4e6c-9a11-99e8082be884" (UID: "6c907408-b752-4e6c-9a11-99e8082be884"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.564031 4756 scope.go:117] "RemoveContainer" containerID="49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119"
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.581394 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.581444 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c907408-b752-4e6c-9a11-99e8082be884-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.581458 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r8w8\" (UniqueName: \"kubernetes.io/projected/6c907408-b752-4e6c-9a11-99e8082be884-kube-api-access-6r8w8\") on node \"crc\" DevicePath \"\""
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.584752 4756 scope.go:117] "RemoveContainer" containerID="004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045"
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.604715 4756 scope.go:117] "RemoveContainer" containerID="02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e"
Dec 05 16:52:01 crc kubenswrapper[4756]: E1205 16:52:01.605196 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e\": container with ID starting with 02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e not found: ID does not exist" containerID="02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e"
Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.605234 4756 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e"} err="failed to get container status \"02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e\": rpc error: code = NotFound desc = could not find container \"02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e\": container with ID starting with 02242872cc51c5126c0ce4825e3b1539b2f2d978e1e4afa8855b87bbc6b44d8e not found: ID does not exist" Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.605260 4756 scope.go:117] "RemoveContainer" containerID="49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119" Dec 05 16:52:01 crc kubenswrapper[4756]: E1205 16:52:01.605658 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119\": container with ID starting with 49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119 not found: ID does not exist" containerID="49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119" Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.605693 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119"} err="failed to get container status \"49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119\": rpc error: code = NotFound desc = could not find container \"49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119\": container with ID starting with 49ab20f5d410041140b1cd2a055b7431484186dff611a32d574941e2f590d119 not found: ID does not exist" Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.605709 4756 scope.go:117] "RemoveContainer" containerID="004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045" Dec 05 16:52:01 crc kubenswrapper[4756]: E1205 16:52:01.605999 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045\": container with ID starting with 004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045 not found: ID does not exist" containerID="004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045" Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.606025 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045"} err="failed to get container status \"004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045\": rpc error: code = NotFound desc = could not find container \"004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045\": container with ID starting with 004701d5ad0285a2a31641f134ca3c40c578e01cc5a799a12fe26760b2e09045 not found: ID does not exist" Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.863133 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4s57k"] Dec 05 16:52:01 crc kubenswrapper[4756]: I1205 16:52:01.868284 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4s57k"] Dec 05 16:52:02 crc kubenswrapper[4756]: I1205 16:52:02.059361 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c907408-b752-4e6c-9a11-99e8082be884" path="/var/lib/kubelet/pods/6c907408-b752-4e6c-9a11-99e8082be884/volumes" Dec 05 16:52:02 crc 
kubenswrapper[4756]: I1205 16:52:02.536842 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6dmlf" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" containerName="registry-server" containerID="cri-o://299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025" gracePeriod=2 Dec 05 16:52:02 crc kubenswrapper[4756]: I1205 16:52:02.962757 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dmlf" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.105883 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bphz\" (UniqueName: \"kubernetes.io/projected/f790cd3a-7a14-4d66-af22-c2474936e462-kube-api-access-7bphz\") pod \"f790cd3a-7a14-4d66-af22-c2474936e462\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.106008 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-utilities\") pod \"f790cd3a-7a14-4d66-af22-c2474936e462\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.106090 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-catalog-content\") pod \"f790cd3a-7a14-4d66-af22-c2474936e462\" (UID: \"f790cd3a-7a14-4d66-af22-c2474936e462\") " Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.107350 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-utilities" (OuterVolumeSpecName: "utilities") pod "f790cd3a-7a14-4d66-af22-c2474936e462" (UID: "f790cd3a-7a14-4d66-af22-c2474936e462"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.110705 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f790cd3a-7a14-4d66-af22-c2474936e462-kube-api-access-7bphz" (OuterVolumeSpecName: "kube-api-access-7bphz") pod "f790cd3a-7a14-4d66-af22-c2474936e462" (UID: "f790cd3a-7a14-4d66-af22-c2474936e462"). InnerVolumeSpecName "kube-api-access-7bphz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.129891 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f790cd3a-7a14-4d66-af22-c2474936e462" (UID: "f790cd3a-7a14-4d66-af22-c2474936e462"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.208226 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bphz\" (UniqueName: \"kubernetes.io/projected/f790cd3a-7a14-4d66-af22-c2474936e462-kube-api-access-7bphz\") on node \"crc\" DevicePath \"\"" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.208284 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.208297 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f790cd3a-7a14-4d66-af22-c2474936e462-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.545545 4756 generic.go:334] "Generic (PLEG): container finished" podID="f790cd3a-7a14-4d66-af22-c2474936e462" containerID="299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025" exitCode=0 Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.545595 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dmlf" event={"ID":"f790cd3a-7a14-4d66-af22-c2474936e462","Type":"ContainerDied","Data":"299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025"} Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.545609 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dmlf" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.545627 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dmlf" event={"ID":"f790cd3a-7a14-4d66-af22-c2474936e462","Type":"ContainerDied","Data":"9157cb9e1f70f0cb3f9c9820acdd35441371d41ed03493a3fb5b777516250dbc"} Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.545639 4756 scope.go:117] "RemoveContainer" containerID="299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.566548 4756 scope.go:117] "RemoveContainer" containerID="62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.582083 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dmlf"] Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.587350 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dmlf"] Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.606547 4756 scope.go:117] "RemoveContainer" containerID="93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.622563 4756 scope.go:117] "RemoveContainer" containerID="299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025" Dec 05 16:52:03 crc kubenswrapper[4756]: E1205 16:52:03.623189 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025\": container with ID starting with 299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025 not found: ID does not exist" containerID="299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.623234 4756 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025"} err="failed to get container status \"299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025\": rpc error: code = NotFound desc = could not find container \"299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025\": container with ID starting with 299ee2e6d4a54fdd63f6ffb7854f03b0baebd9d9ba2a913693a14a6b29f99025 not found: ID does not exist" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.623259 4756 scope.go:117] "RemoveContainer" containerID="62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2" Dec 05 16:52:03 crc kubenswrapper[4756]: E1205 16:52:03.623623 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2\": container with ID starting with 62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2 not found: ID does not exist" containerID="62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.623664 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2"} err="failed to get container status \"62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2\": rpc error: code = NotFound desc = could not find container \"62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2\": container with ID starting with 62a46db6dbba88fa2ecd8664f3c85fcd006d39ebc73d2f2d577dd6ffb23b02f2 not found: ID does not exist" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.623691 4756 scope.go:117] "RemoveContainer" containerID="93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159" Dec 05 16:52:03 crc kubenswrapper[4756]: E1205 16:52:03.623881 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159\": container with ID starting with 93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159 not found: ID does not exist" containerID="93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159" Dec 05 16:52:03 crc kubenswrapper[4756]: I1205 16:52:03.623903 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159"} err="failed to get container status \"93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159\": rpc error: code = NotFound desc = could not find container \"93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159\": container with ID starting with 93b1c7e346c4b740ae441802463cc89bf2c0aab590fa2ecd7efd527ac9dce159 not found: ID does not exist" Dec 05 16:52:04 crc kubenswrapper[4756]: I1205 16:52:04.062130 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" path="/var/lib/kubelet/pods/f790cd3a-7a14-4d66-af22-c2474936e462/volumes" Dec 05 16:53:52 crc kubenswrapper[4756]: I1205 16:53:52.295127 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:53:52 crc kubenswrapper[4756]: I1205 16:53:52.295809 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:54:22 crc kubenswrapper[4756]: I1205 16:54:22.295362 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:54:22 crc kubenswrapper[4756]: I1205 16:54:22.296089 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.295469 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.296052 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.296101 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.296721 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.296772 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" gracePeriod=600 Dec 05 16:54:52 crc kubenswrapper[4756]: E1205 16:54:52.422238 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.898093 4756 
generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" exitCode=0 Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.898156 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb"} Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.898316 4756 scope.go:117] "RemoveContainer" containerID="b08c122861a85300ba33993513b01f033b9489d5a5114fa7395b74161bc4057c" Dec 05 16:54:52 crc kubenswrapper[4756]: I1205 16:54:52.899111 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:54:52 crc kubenswrapper[4756]: E1205 16:54:52.899615 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:55:04 crc kubenswrapper[4756]: I1205 16:55:04.050275 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:55:04 crc kubenswrapper[4756]: E1205 16:55:04.051103 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:55:16 crc kubenswrapper[4756]: I1205 16:55:16.050325 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:55:16 crc kubenswrapper[4756]: E1205 16:55:16.051127 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:55:31 crc kubenswrapper[4756]: I1205 16:55:31.049674 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:55:31 crc kubenswrapper[4756]: E1205 16:55:31.050595 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:55:44 crc kubenswrapper[4756]: I1205 16:55:44.049306 4756 scope.go:117] "RemoveContainer" 
containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:55:44 crc kubenswrapper[4756]: E1205 16:55:44.049933 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:55:57 crc kubenswrapper[4756]: I1205 16:55:57.049874 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:55:57 crc kubenswrapper[4756]: E1205 16:55:57.050520 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:56:09 crc kubenswrapper[4756]: I1205 16:56:09.049374 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:56:09 crc kubenswrapper[4756]: E1205 16:56:09.050221 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:56:20 crc kubenswrapper[4756]: I1205 16:56:20.055995 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:56:20 crc kubenswrapper[4756]: E1205 16:56:20.056966 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:56:33 crc kubenswrapper[4756]: I1205 16:56:33.049411 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:56:33 crc kubenswrapper[4756]: E1205 16:56:33.050506 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:56:48 crc kubenswrapper[4756]: I1205 16:56:48.049765 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:56:48 crc kubenswrapper[4756]: E1205 16:56:48.052279 4756 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:56:59 crc kubenswrapper[4756]: I1205 16:56:59.049959 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:56:59 crc kubenswrapper[4756]: E1205 16:56:59.050885 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:57:12 crc kubenswrapper[4756]: I1205 16:57:12.050716 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:57:12 crc kubenswrapper[4756]: E1205 16:57:12.051662 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:57:25 crc kubenswrapper[4756]: I1205 16:57:25.050042 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:57:25 crc kubenswrapper[4756]: E1205 16:57:25.052057 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:57:39 crc kubenswrapper[4756]: I1205 16:57:39.049801 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:57:39 crc kubenswrapper[4756]: E1205 16:57:39.050632 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:57:51 crc kubenswrapper[4756]: I1205 16:57:51.050294 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:57:51 crc kubenswrapper[4756]: E1205 16:57:51.051899 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:58:03 crc kubenswrapper[4756]: I1205 16:58:03.050525 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:58:03 crc kubenswrapper[4756]: E1205 16:58:03.051526 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:58:14 crc kubenswrapper[4756]: I1205 16:58:14.049534 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:58:14 crc kubenswrapper[4756]: E1205 16:58:14.050374 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:58:27 crc kubenswrapper[4756]: I1205 16:58:27.049174 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:58:27 crc kubenswrapper[4756]: E1205 16:58:27.050068 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:58:41 crc kubenswrapper[4756]: I1205 16:58:41.050523 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:58:41 crc kubenswrapper[4756]: E1205 16:58:41.051531 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:58:55 crc kubenswrapper[4756]: I1205 16:58:55.050146 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:58:55 crc kubenswrapper[4756]: E1205 16:58:55.051131 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:59:10 crc kubenswrapper[4756]: I1205 16:59:10.055621 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:59:10 crc kubenswrapper[4756]: E1205 16:59:10.056536 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:59:21 crc kubenswrapper[4756]: I1205 16:59:21.049713 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:59:21 crc kubenswrapper[4756]: E1205 16:59:21.050286 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:59:33 crc kubenswrapper[4756]: I1205 16:59:33.050428 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:59:33 crc kubenswrapper[4756]: E1205 16:59:33.051223 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 16:59:48 crc kubenswrapper[4756]: I1205 16:59:48.050349 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 16:59:48 crc kubenswrapper[4756]: E1205 16:59:48.052944 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.154994 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns"] Dec 05 17:00:00 crc kubenswrapper[4756]: E1205 17:00:00.157363 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c907408-b752-4e6c-9a11-99e8082be884" containerName="extract-content" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.157499 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c907408-b752-4e6c-9a11-99e8082be884" containerName="extract-content" Dec 05 17:00:00 crc kubenswrapper[4756]: E1205 17:00:00.157621 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c907408-b752-4e6c-9a11-99e8082be884" containerName="extract-utilities" Dec 05 17:00:00 crc 
kubenswrapper[4756]: I1205 17:00:00.157697 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c907408-b752-4e6c-9a11-99e8082be884" containerName="extract-utilities" Dec 05 17:00:00 crc kubenswrapper[4756]: E1205 17:00:00.157788 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.157876 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4756]: E1205 17:00:00.157961 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" containerName="extract-content" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.158043 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" containerName="extract-content" Dec 05 17:00:00 crc kubenswrapper[4756]: E1205 17:00:00.158145 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c907408-b752-4e6c-9a11-99e8082be884" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.158231 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c907408-b752-4e6c-9a11-99e8082be884" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4756]: E1205 17:00:00.158334 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" containerName="extract-utilities" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.158438 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" containerName="extract-utilities" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.158693 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f790cd3a-7a14-4d66-af22-c2474936e462" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.158781 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c907408-b752-4e6c-9a11-99e8082be884" containerName="registry-server" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.159436 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.164254 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns"] Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.166355 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.166982 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.256840 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bb829d-b636-4040-a51c-121ba1163192-config-volume\") pod \"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.257128 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhtxm\" (UniqueName: \"kubernetes.io/projected/14bb829d-b636-4040-a51c-121ba1163192-kube-api-access-jhtxm\") pod \"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.257260 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bb829d-b636-4040-a51c-121ba1163192-secret-volume\") pod \"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.358532 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bb829d-b636-4040-a51c-121ba1163192-secret-volume\") pod \"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.360020 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bb829d-b636-4040-a51c-121ba1163192-config-volume\") pod \"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.360646 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhtxm\" (UniqueName: \"kubernetes.io/projected/14bb829d-b636-4040-a51c-121ba1163192-kube-api-access-jhtxm\") pod \"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.361998 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bb829d-b636-4040-a51c-121ba1163192-config-volume\") pod 
\"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.368251 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bb829d-b636-4040-a51c-121ba1163192-secret-volume\") pod \"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.391392 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhtxm\" (UniqueName: \"kubernetes.io/projected/14bb829d-b636-4040-a51c-121ba1163192-kube-api-access-jhtxm\") pod \"collect-profiles-29415900-g8cns\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.501149 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:00 crc kubenswrapper[4756]: I1205 17:00:00.954074 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns"] Dec 05 17:00:01 crc kubenswrapper[4756]: I1205 17:00:01.049061 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb" Dec 05 17:00:01 crc kubenswrapper[4756]: I1205 17:00:01.648673 4756 generic.go:334] "Generic (PLEG): container finished" podID="14bb829d-b636-4040-a51c-121ba1163192" containerID="685747f7f6066d7b8cf513d37c6955b3490ea1c3a2e22a5cecaa20cf2500928d" exitCode=0 Dec 05 17:00:01 crc kubenswrapper[4756]: I1205 17:00:01.648737 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" event={"ID":"14bb829d-b636-4040-a51c-121ba1163192","Type":"ContainerDied","Data":"685747f7f6066d7b8cf513d37c6955b3490ea1c3a2e22a5cecaa20cf2500928d"} Dec 05 17:00:01 crc kubenswrapper[4756]: I1205 17:00:01.649042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" event={"ID":"14bb829d-b636-4040-a51c-121ba1163192","Type":"ContainerStarted","Data":"c1a67e46eeef7aab197fac6e602022885706d973fac5d1c4884207fd525496db"} Dec 05 17:00:01 crc kubenswrapper[4756]: I1205 17:00:01.651729 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"18353ebf1f1010a20a0b374d54ad18c9786c2c302f29971c6fc3155359bf7ff3"} Dec 05 17:00:02 crc kubenswrapper[4756]: I1205 17:00:02.999156 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.103951 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhtxm\" (UniqueName: \"kubernetes.io/projected/14bb829d-b636-4040-a51c-121ba1163192-kube-api-access-jhtxm\") pod \"14bb829d-b636-4040-a51c-121ba1163192\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.104010 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bb829d-b636-4040-a51c-121ba1163192-config-volume\") pod \"14bb829d-b636-4040-a51c-121ba1163192\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.104099 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bb829d-b636-4040-a51c-121ba1163192-secret-volume\") pod \"14bb829d-b636-4040-a51c-121ba1163192\" (UID: \"14bb829d-b636-4040-a51c-121ba1163192\") " Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.104758 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14bb829d-b636-4040-a51c-121ba1163192-config-volume" (OuterVolumeSpecName: "config-volume") pod "14bb829d-b636-4040-a51c-121ba1163192" (UID: "14bb829d-b636-4040-a51c-121ba1163192"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.110629 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bb829d-b636-4040-a51c-121ba1163192-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "14bb829d-b636-4040-a51c-121ba1163192" (UID: "14bb829d-b636-4040-a51c-121ba1163192"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.114871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14bb829d-b636-4040-a51c-121ba1163192-kube-api-access-jhtxm" (OuterVolumeSpecName: "kube-api-access-jhtxm") pod "14bb829d-b636-4040-a51c-121ba1163192" (UID: "14bb829d-b636-4040-a51c-121ba1163192"). InnerVolumeSpecName "kube-api-access-jhtxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.205922 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bb829d-b636-4040-a51c-121ba1163192-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.205980 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bb829d-b636-4040-a51c-121ba1163192-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.205998 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhtxm\" (UniqueName: \"kubernetes.io/projected/14bb829d-b636-4040-a51c-121ba1163192-kube-api-access-jhtxm\") on node \"crc\" DevicePath \"\"" Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.668399 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" event={"ID":"14bb829d-b636-4040-a51c-121ba1163192","Type":"ContainerDied","Data":"c1a67e46eeef7aab197fac6e602022885706d973fac5d1c4884207fd525496db"} Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.668747 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1a67e46eeef7aab197fac6e602022885706d973fac5d1c4884207fd525496db" Dec 05 17:00:03 crc kubenswrapper[4756]: I1205 17:00:03.668473 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns" Dec 05 17:00:04 crc kubenswrapper[4756]: I1205 17:00:04.101798 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd"] Dec 05 17:00:04 crc kubenswrapper[4756]: I1205 17:00:04.108254 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415855-9cgsd"] Dec 05 17:00:06 crc kubenswrapper[4756]: I1205 17:00:06.061664 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2e132f7-f4ea-426c-b094-b5eea4577d5f" path="/var/lib/kubelet/pods/f2e132f7-f4ea-426c-b094-b5eea4577d5f/volumes" Dec 05 17:00:57 crc kubenswrapper[4756]: I1205 17:00:57.358683 4756 scope.go:117] "RemoveContainer" containerID="5bf7e4ce4c2ad1fa74046b0bbb6176cfd3dfff53529b073ffe77e626a33292ba" Dec 05 17:02:22 crc kubenswrapper[4756]: I1205 17:02:22.295842 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:02:22 crc kubenswrapper[4756]: I1205 17:02:22.297611 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:02:52 crc kubenswrapper[4756]: I1205 17:02:52.295296 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 05 17:02:52 crc kubenswrapper[4756]: I1205 17:02:52.295858 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.376477 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7qck2"] Dec 05 17:03:13 crc kubenswrapper[4756]: E1205 17:03:13.377332 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14bb829d-b636-4040-a51c-121ba1163192" containerName="collect-profiles" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.377346 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bb829d-b636-4040-a51c-121ba1163192" containerName="collect-profiles" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.377527 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="14bb829d-b636-4040-a51c-121ba1163192" containerName="collect-profiles" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.378856 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.398434 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qck2"] Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.433825 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-catalog-content\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.433909 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-utilities\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.434034 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvxtj\" (UniqueName: \"kubernetes.io/projected/b0a90af7-a15a-4249-9a48-ef4be190d61b-kube-api-access-jvxtj\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.535015 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-catalog-content\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.535075 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-utilities\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " 
pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.535144 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvxtj\" (UniqueName: \"kubernetes.io/projected/b0a90af7-a15a-4249-9a48-ef4be190d61b-kube-api-access-jvxtj\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.535897 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-catalog-content\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.536107 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-utilities\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.554220 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvxtj\" (UniqueName: \"kubernetes.io/projected/b0a90af7-a15a-4249-9a48-ef4be190d61b-kube-api-access-jvxtj\") pod \"redhat-marketplace-7qck2\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") " pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:13 crc kubenswrapper[4756]: I1205 17:03:13.697068 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qck2" Dec 05 17:03:14 crc kubenswrapper[4756]: I1205 17:03:14.142664 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qck2"] Dec 05 17:03:14 crc kubenswrapper[4756]: I1205 17:03:14.350637 4756 generic.go:334] "Generic (PLEG): container finished" podID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerID="675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2" exitCode=0 Dec 05 17:03:14 crc kubenswrapper[4756]: I1205 17:03:14.350680 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qck2" event={"ID":"b0a90af7-a15a-4249-9a48-ef4be190d61b","Type":"ContainerDied","Data":"675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2"} Dec 05 17:03:14 crc kubenswrapper[4756]: I1205 17:03:14.350712 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qck2" event={"ID":"b0a90af7-a15a-4249-9a48-ef4be190d61b","Type":"ContainerStarted","Data":"b09d2121af8b2c1b9040df785dcf7f8181feed02c073362cbb284c8a260a8548"} Dec 05 17:03:14 crc kubenswrapper[4756]: I1205 17:03:14.353659 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 17:03:15 crc kubenswrapper[4756]: I1205 17:03:15.361680 4756 generic.go:334] "Generic (PLEG): container finished" podID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerID="257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c" exitCode=0 Dec 05 17:03:15 crc kubenswrapper[4756]: I1205 17:03:15.361735 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qck2" 
event={"ID":"b0a90af7-a15a-4249-9a48-ef4be190d61b","Type":"ContainerDied","Data":"257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c"}
Dec 05 17:03:16 crc kubenswrapper[4756]: I1205 17:03:16.375043 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qck2" event={"ID":"b0a90af7-a15a-4249-9a48-ef4be190d61b","Type":"ContainerStarted","Data":"97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7"}
Dec 05 17:03:16 crc kubenswrapper[4756]: I1205 17:03:16.398771 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7qck2" podStartSLOduration=1.9699719999999998 podStartE2EDuration="3.398751391s" podCreationTimestamp="2025-12-05 17:03:13 +0000 UTC" firstStartedPulling="2025-12-05 17:03:14.353312609 +0000 UTC m=+2854.591986131" lastFinishedPulling="2025-12-05 17:03:15.78209201 +0000 UTC m=+2856.020765522" observedRunningTime="2025-12-05 17:03:16.39383866 +0000 UTC m=+2856.632512172" watchObservedRunningTime="2025-12-05 17:03:16.398751391 +0000 UTC m=+2856.637424903"
Dec 05 17:03:22 crc kubenswrapper[4756]: I1205 17:03:22.295065 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 17:03:22 crc kubenswrapper[4756]: I1205 17:03:22.295669 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 17:03:22 crc kubenswrapper[4756]: I1205 17:03:22.295717 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 17:03:22 crc kubenswrapper[4756]: I1205 17:03:22.296323 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18353ebf1f1010a20a0b374d54ad18c9786c2c302f29971c6fc3155359bf7ff3"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 17:03:22 crc kubenswrapper[4756]: I1205 17:03:22.296375 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://18353ebf1f1010a20a0b374d54ad18c9786c2c302f29971c6fc3155359bf7ff3" gracePeriod=600
Dec 05 17:03:23 crc kubenswrapper[4756]: I1205 17:03:23.437051 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="18353ebf1f1010a20a0b374d54ad18c9786c2c302f29971c6fc3155359bf7ff3" exitCode=0
Dec 05 17:03:23 crc kubenswrapper[4756]: I1205 17:03:23.437116 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"18353ebf1f1010a20a0b374d54ad18c9786c2c302f29971c6fc3155359bf7ff3"}
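The prober entries above record the kubelet's HTTP liveness check against the machine-config-daemon failing with a flat connection refusal (nothing is listening on 127.0.0.1:8798), after which the kubelet marks the container unhealthy and kills it with the pod's 600-second grace period so it can be restarted. A minimal sketch of what such an HTTP probe boils down to, assuming plain GET semantics, with the port and path taken from the log output (a transport error or a status outside 200-399 counts as a failed probe):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// Roughly what an HTTP liveness probe does: a single GET with a short
// timeout. Port and path come from the log output above.
func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println("Liveness probe status=failure output:", err)
	}
}
```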
Dec 05 17:03:23 crc kubenswrapper[4756]: I1205 17:03:23.437860 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90"}
Dec 05 17:03:23 crc kubenswrapper[4756]: I1205 17:03:23.437895 4756 scope.go:117] "RemoveContainer" containerID="b299b3b2d578c472088703a3ffcf91c7d101e25876ef362a2b0822aef0bcb7fb"
Dec 05 17:03:23 crc kubenswrapper[4756]: I1205 17:03:23.697304 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7qck2"
Dec 05 17:03:23 crc kubenswrapper[4756]: I1205 17:03:23.697354 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7qck2"
Dec 05 17:03:23 crc kubenswrapper[4756]: I1205 17:03:23.738780 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7qck2"
Dec 05 17:03:24 crc kubenswrapper[4756]: I1205 17:03:24.498119 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7qck2"
Dec 05 17:03:24 crc kubenswrapper[4756]: I1205 17:03:24.543141 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qck2"]
Dec 05 17:03:26 crc kubenswrapper[4756]: I1205 17:03:26.462704 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7qck2" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerName="registry-server" containerID="cri-o://97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7" gracePeriod=2
Dec 05 17:03:26 crc kubenswrapper[4756]: I1205 17:03:26.957646 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qck2"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.040962 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-utilities\") pod \"b0a90af7-a15a-4249-9a48-ef4be190d61b\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") "
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.041118 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-catalog-content\") pod \"b0a90af7-a15a-4249-9a48-ef4be190d61b\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") "
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.041143 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvxtj\" (UniqueName: \"kubernetes.io/projected/b0a90af7-a15a-4249-9a48-ef4be190d61b-kube-api-access-jvxtj\") pod \"b0a90af7-a15a-4249-9a48-ef4be190d61b\" (UID: \"b0a90af7-a15a-4249-9a48-ef4be190d61b\") "
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.042689 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-utilities" (OuterVolumeSpecName: "utilities") pod "b0a90af7-a15a-4249-9a48-ef4be190d61b" (UID: "b0a90af7-a15a-4249-9a48-ef4be190d61b"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.051794 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a90af7-a15a-4249-9a48-ef4be190d61b-kube-api-access-jvxtj" (OuterVolumeSpecName: "kube-api-access-jvxtj") pod "b0a90af7-a15a-4249-9a48-ef4be190d61b" (UID: "b0a90af7-a15a-4249-9a48-ef4be190d61b"). InnerVolumeSpecName "kube-api-access-jvxtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.063181 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0a90af7-a15a-4249-9a48-ef4be190d61b" (UID: "b0a90af7-a15a-4249-9a48-ef4be190d61b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.142765 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.142826 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvxtj\" (UniqueName: \"kubernetes.io/projected/b0a90af7-a15a-4249-9a48-ef4be190d61b-kube-api-access-jvxtj\") on node \"crc\" DevicePath \"\"" Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.142847 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a90af7-a15a-4249-9a48-ef4be190d61b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.475164 4756 generic.go:334] "Generic (PLEG): container finished" podID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerID="97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7" exitCode=0 Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.475210 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qck2" event={"ID":"b0a90af7-a15a-4249-9a48-ef4be190d61b","Type":"ContainerDied","Data":"97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7"} Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.475243 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qck2"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.475277 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qck2" event={"ID":"b0a90af7-a15a-4249-9a48-ef4be190d61b","Type":"ContainerDied","Data":"b09d2121af8b2c1b9040df785dcf7f8181feed02c073362cbb284c8a260a8548"}
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.475307 4756 scope.go:117] "RemoveContainer" containerID="97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.500768 4756 scope.go:117] "RemoveContainer" containerID="257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.529587 4756 scope.go:117] "RemoveContainer" containerID="675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.532547 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qck2"]
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.538058 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qck2"]
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.567355 4756 scope.go:117] "RemoveContainer" containerID="97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7"
Dec 05 17:03:27 crc kubenswrapper[4756]: E1205 17:03:27.567904 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7\": container with ID starting with 97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7 not found: ID does not exist" containerID="97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.567934 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7"} err="failed to get container status \"97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7\": rpc error: code = NotFound desc = could not find container \"97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7\": container with ID starting with 97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7 not found: ID does not exist"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.567958 4756 scope.go:117] "RemoveContainer" containerID="257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c"
Dec 05 17:03:27 crc kubenswrapper[4756]: E1205 17:03:27.568622 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c\": container with ID starting with 257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c not found: ID does not exist" containerID="257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.568647 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c"} err="failed to get container status \"257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c\": rpc error: code = NotFound desc = could not find container \"257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c\": container with ID starting with 257e035b9376b6cf3eda0a4c77318913454ad08896dbfaa63571425df4560d0c not found: ID does not exist"
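The NotFound errors above are a benign race: the catalog containers were already removed along with the pod, so the follow-up "RemoveContainer" lookups against CRI-O come back with gRPC code NotFound, which the kubelet logs and then treats the deletion as already complete. A sketch of that idempotent-delete pattern, assuming the standard gRPC status package; containerStatus here is a hypothetical stand-in that fabricates the same error seen in the log:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// Hypothetical stand-in for the runtime's status lookup: it returns
// the same gRPC NotFound error that CRI-O produced in the log above.
func containerStatus(id string) error {
	return status.Errorf(codes.NotFound,
		"could not find container %q: container with ID starting with %s not found: ID does not exist",
		id, id)
}

// Idempotent delete: NotFound means the container is already gone, so
// cleanup is treated as done rather than surfaced as a failure.
func removeContainer(id string) error {
	if err := containerStatus(id); err != nil {
		if status.Code(err) == codes.NotFound {
			fmt.Printf("container %.12s already gone; nothing to remove\n", id)
			return nil
		}
		return err
	}
	// ... a real implementation would call the runtime's remove here ...
	return nil
}

func main() {
	_ = removeContainer("97643d0f8f265b9eaf57c5a0c070c993054852642bf2e4da9c536c8619ca59a7")
}
```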
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.568661 4756 scope.go:117] "RemoveContainer" containerID="675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2"
Dec 05 17:03:27 crc kubenswrapper[4756]: E1205 17:03:27.569322 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2\": container with ID starting with 675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2 not found: ID does not exist" containerID="675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2"
Dec 05 17:03:27 crc kubenswrapper[4756]: I1205 17:03:27.569344 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2"} err="failed to get container status \"675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2\": rpc error: code = NotFound desc = could not find container \"675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2\": container with ID starting with 675f200f296dd3b798f481bfbd5461ce0d7f03b95258be39b59b3db15ff691d2 not found: ID does not exist"
Dec 05 17:03:28 crc kubenswrapper[4756]: I1205 17:03:28.067311 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" path="/var/lib/kubelet/pods/b0a90af7-a15a-4249-9a48-ef4be190d61b/volumes"
Dec 05 17:05:52 crc kubenswrapper[4756]: I1205 17:05:52.295571 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 17:05:52 crc kubenswrapper[4756]: I1205 17:05:52.296510 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 17:06:22 crc kubenswrapper[4756]: I1205 17:06:22.295153 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 17:06:22 crc kubenswrapper[4756]: I1205 17:06:22.295772 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 17:06:52 crc kubenswrapper[4756]: I1205 17:06:52.296006 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 17:06:52 crc kubenswrapper[4756]: I1205 17:06:52.297128 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 17:06:52 crc kubenswrapper[4756]: I1205 17:06:52.297286 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 17:06:52 crc kubenswrapper[4756]: I1205 17:06:52.298621 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 17:06:52 crc kubenswrapper[4756]: I1205 17:06:52.298735 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" gracePeriod=600
Dec 05 17:06:52 crc kubenswrapper[4756]: E1205 17:06:52.422480 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:06:53 crc kubenswrapper[4756]: I1205 17:06:53.296570 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" exitCode=0
Dec 05 17:06:53 crc kubenswrapper[4756]: I1205 17:06:53.296649 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90"}
Dec 05 17:06:53 crc kubenswrapper[4756]: I1205 17:06:53.296725 4756 scope.go:117] "RemoveContainer" containerID="18353ebf1f1010a20a0b374d54ad18c9786c2c302f29971c6fc3155359bf7ff3"
Dec 05 17:06:53 crc kubenswrapper[4756]: I1205 17:06:53.297508 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90"
Dec 05 17:06:53 crc kubenswrapper[4756]: E1205 17:06:53.297730 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:07:04 crc kubenswrapper[4756]: I1205 17:07:04.050523 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90"
Dec 05 17:07:04 crc kubenswrapper[4756]: E1205 17:07:04.051323 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
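The "back-off 5m0s" above is the kubelet's restart backoff at its ceiling: the delay before restarting a crash-looping container roughly doubles after each failure until it pins at five minutes, which is why the machine-config-daemon is now only retried every few minutes. A sketch of that progression, assuming the commonly documented defaults (10s initial delay, doubling per crash, 5m cap):

```go
package main

import (
	"fmt"
	"time"
)

// Restart backoff progression, assuming the commonly documented kubelet
// defaults. From the sixth failure on, the delay is pinned at the cap,
// matching the "back-off 5m0s" in the pod_workers.go entries above.
func main() {
	const (
		initial  = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	delay := initial
	for crash := 1; crash <= 7; crash++ {
		fmt.Printf("crash %d -> wait %v before next restart\n", crash, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```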
Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.047051 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d2dx8"]
Dec 05 17:07:06 crc kubenswrapper[4756]: E1205 17:07:06.047747 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerName="extract-content"
Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.047766 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerName="extract-content"
Dec 05 17:07:06 crc kubenswrapper[4756]: E1205 17:07:06.047777 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerName="registry-server"
Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.047784 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerName="registry-server"
Dec 05 17:07:06 crc kubenswrapper[4756]: E1205 17:07:06.047806 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerName="extract-utilities"
Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.047814 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerName="extract-utilities"
Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.047975 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a90af7-a15a-4249-9a48-ef4be190d61b" containerName="registry-server"
Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.049380 4756 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.095724 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2dx8"] Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.103204 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dttx\" (UniqueName: \"kubernetes.io/projected/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-kube-api-access-4dttx\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.103322 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-utilities\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.103382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-catalog-content\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.204988 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dttx\" (UniqueName: \"kubernetes.io/projected/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-kube-api-access-4dttx\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.205485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-utilities\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.205899 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-catalog-content\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.206065 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-utilities\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.207788 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-catalog-content\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.234836 4756 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/community-operators-fr8w8"] Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.239220 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.242723 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dttx\" (UniqueName: \"kubernetes.io/projected/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-kube-api-access-4dttx\") pod \"certified-operators-d2dx8\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.249769 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fr8w8"] Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.308253 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5x9d\" (UniqueName: \"kubernetes.io/projected/65a7e955-c704-4aee-b990-4df46b91b2ca-kube-api-access-g5x9d\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.308304 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-catalog-content\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.308355 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-utilities\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.395317 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.409245 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5x9d\" (UniqueName: \"kubernetes.io/projected/65a7e955-c704-4aee-b990-4df46b91b2ca-kube-api-access-g5x9d\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.409298 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-catalog-content\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.409349 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-utilities\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.409955 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-utilities\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.410466 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-catalog-content\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.433347 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5x9d\" (UniqueName: \"kubernetes.io/projected/65a7e955-c704-4aee-b990-4df46b91b2ca-kube-api-access-g5x9d\") pod \"community-operators-fr8w8\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") " pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.586195 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:06 crc kubenswrapper[4756]: I1205 17:07:06.854219 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2dx8"] Dec 05 17:07:07 crc kubenswrapper[4756]: I1205 17:07:07.021900 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fr8w8"] Dec 05 17:07:07 crc kubenswrapper[4756]: I1205 17:07:07.412808 4756 generic.go:334] "Generic (PLEG): container finished" podID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerID="449cbd66bd10045bb70cce68f426606eff0682b23c7ffee73b680a0c89a32fa9" exitCode=0 Dec 05 17:07:07 crc kubenswrapper[4756]: I1205 17:07:07.412960 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr8w8" event={"ID":"65a7e955-c704-4aee-b990-4df46b91b2ca","Type":"ContainerDied","Data":"449cbd66bd10045bb70cce68f426606eff0682b23c7ffee73b680a0c89a32fa9"} Dec 05 17:07:07 crc kubenswrapper[4756]: I1205 17:07:07.413294 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr8w8" event={"ID":"65a7e955-c704-4aee-b990-4df46b91b2ca","Type":"ContainerStarted","Data":"030f3c163ab899847a9f2b4167e8f69ee4d8e61ae169fb186e4db81519450136"} Dec 05 17:07:07 crc kubenswrapper[4756]: I1205 17:07:07.417286 4756 generic.go:334] "Generic (PLEG): container finished" podID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerID="dbc71d1e874f9fafeb471276ab3c3be7ce7dbfb76b442b2dc03a0150bad9be28" exitCode=0 Dec 05 17:07:07 crc kubenswrapper[4756]: I1205 17:07:07.417331 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2dx8" event={"ID":"fc74fb2f-749c-40c0-9df6-42b06d55ba4f","Type":"ContainerDied","Data":"dbc71d1e874f9fafeb471276ab3c3be7ce7dbfb76b442b2dc03a0150bad9be28"} Dec 05 17:07:07 crc kubenswrapper[4756]: I1205 17:07:07.417360 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2dx8" event={"ID":"fc74fb2f-749c-40c0-9df6-42b06d55ba4f","Type":"ContainerStarted","Data":"df0b976e4e81b3068c20647e08705cf3310eaa5e0a94e9d53f6beb01f6f05ff7"} Dec 05 17:07:08 crc kubenswrapper[4756]: I1205 17:07:08.429170 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr8w8" event={"ID":"65a7e955-c704-4aee-b990-4df46b91b2ca","Type":"ContainerStarted","Data":"00c239c2dc2018b7f2aebaba0cfc1a7bf2f22534708627af4947d0efd66180ef"} Dec 05 17:07:08 crc kubenswrapper[4756]: I1205 17:07:08.443120 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2dx8" event={"ID":"fc74fb2f-749c-40c0-9df6-42b06d55ba4f","Type":"ContainerStarted","Data":"752532e6a1fbca8a88a0ba993c5eb92d83d70361232381025aca2d7bb619127e"} Dec 05 17:07:09 crc kubenswrapper[4756]: I1205 17:07:09.458219 4756 generic.go:334] "Generic (PLEG): container finished" podID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerID="00c239c2dc2018b7f2aebaba0cfc1a7bf2f22534708627af4947d0efd66180ef" exitCode=0 Dec 05 17:07:09 crc kubenswrapper[4756]: I1205 17:07:09.458361 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr8w8" event={"ID":"65a7e955-c704-4aee-b990-4df46b91b2ca","Type":"ContainerDied","Data":"00c239c2dc2018b7f2aebaba0cfc1a7bf2f22534708627af4947d0efd66180ef"} Dec 05 17:07:09 crc kubenswrapper[4756]: I1205 17:07:09.461788 4756 generic.go:334] "Generic (PLEG): 
container finished" podID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerID="752532e6a1fbca8a88a0ba993c5eb92d83d70361232381025aca2d7bb619127e" exitCode=0 Dec 05 17:07:09 crc kubenswrapper[4756]: I1205 17:07:09.461844 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2dx8" event={"ID":"fc74fb2f-749c-40c0-9df6-42b06d55ba4f","Type":"ContainerDied","Data":"752532e6a1fbca8a88a0ba993c5eb92d83d70361232381025aca2d7bb619127e"} Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.042103 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vqjrl"] Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.044097 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.078282 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-catalog-content\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.079555 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-utilities\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.079753 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrdl\" (UniqueName: \"kubernetes.io/projected/61f63f61-e1b5-45d5-b491-ef5ff04bca49-kube-api-access-vkrdl\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.087341 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vqjrl"] Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.181089 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-utilities\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.181714 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrdl\" (UniqueName: \"kubernetes.io/projected/61f63f61-e1b5-45d5-b491-ef5ff04bca49-kube-api-access-vkrdl\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.181872 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-catalog-content\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.181630 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-utilities\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.182355 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-catalog-content\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.200513 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrdl\" (UniqueName: \"kubernetes.io/projected/61f63f61-e1b5-45d5-b491-ef5ff04bca49-kube-api-access-vkrdl\") pod \"redhat-operators-vqjrl\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.367301 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.476103 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr8w8" event={"ID":"65a7e955-c704-4aee-b990-4df46b91b2ca","Type":"ContainerStarted","Data":"e02b4854b195c0b2da55f15cc5141ce4f4a8cb3d47000c972eb0e341a1c0a45d"} Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.479674 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2dx8" event={"ID":"fc74fb2f-749c-40c0-9df6-42b06d55ba4f","Type":"ContainerStarted","Data":"7174f0207ad678c5108d67f3bc2e73bf305cd91582572b505ee1111d5e25ecd5"} Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.503940 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fr8w8" podStartSLOduration=2.058567369 podStartE2EDuration="4.50391519s" podCreationTimestamp="2025-12-05 17:07:06 +0000 UTC" firstStartedPulling="2025-12-05 17:07:07.414567197 +0000 UTC m=+3087.653240709" lastFinishedPulling="2025-12-05 17:07:09.859915018 +0000 UTC m=+3090.098588530" observedRunningTime="2025-12-05 17:07:10.497049225 +0000 UTC m=+3090.735722747" watchObservedRunningTime="2025-12-05 17:07:10.50391519 +0000 UTC m=+3090.742588702" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.528658 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d2dx8" podStartSLOduration=2.077425718 podStartE2EDuration="4.528639568s" podCreationTimestamp="2025-12-05 17:07:06 +0000 UTC" firstStartedPulling="2025-12-05 17:07:07.419692455 +0000 UTC m=+3087.658365967" lastFinishedPulling="2025-12-05 17:07:09.870906305 +0000 UTC m=+3090.109579817" observedRunningTime="2025-12-05 17:07:10.524004353 +0000 UTC m=+3090.762677865" watchObservedRunningTime="2025-12-05 17:07:10.528639568 +0000 UTC m=+3090.767313080" Dec 05 17:07:10 crc kubenswrapper[4756]: I1205 17:07:10.637772 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vqjrl"] Dec 05 17:07:10 crc kubenswrapper[4756]: W1205 17:07:10.651192 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61f63f61_e1b5_45d5_b491_ef5ff04bca49.slice/crio-59684ceb5cc3cf05855dffd82b942ad03aa70f56112d88f9df10e94004bd2963 WatchSource:0}: Error finding container 59684ceb5cc3cf05855dffd82b942ad03aa70f56112d88f9df10e94004bd2963: Status 404 returned error can't find the container with id 59684ceb5cc3cf05855dffd82b942ad03aa70f56112d88f9df10e94004bd2963
Dec 05 17:07:11 crc kubenswrapper[4756]: I1205 17:07:11.490984 4756 generic.go:334] "Generic (PLEG): container finished" podID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerID="98117712cfa5cb58292bd20e00907ef248c0d4d2928aff36df78af9489c8d3bc" exitCode=0
Dec 05 17:07:11 crc kubenswrapper[4756]: I1205 17:07:11.491057 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqjrl" event={"ID":"61f63f61-e1b5-45d5-b491-ef5ff04bca49","Type":"ContainerDied","Data":"98117712cfa5cb58292bd20e00907ef248c0d4d2928aff36df78af9489c8d3bc"}
Dec 05 17:07:11 crc kubenswrapper[4756]: I1205 17:07:11.491555 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqjrl" event={"ID":"61f63f61-e1b5-45d5-b491-ef5ff04bca49","Type":"ContainerStarted","Data":"59684ceb5cc3cf05855dffd82b942ad03aa70f56112d88f9df10e94004bd2963"}
Dec 05 17:07:12 crc kubenswrapper[4756]: I1205 17:07:12.506200 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqjrl" event={"ID":"61f63f61-e1b5-45d5-b491-ef5ff04bca49","Type":"ContainerStarted","Data":"426eb046db78fab56c89aff889362004fa098cd5414dcb913c7b308ca9d27b73"}
Dec 05 17:07:13 crc kubenswrapper[4756]: I1205 17:07:13.516116 4756 generic.go:334] "Generic (PLEG): container finished" podID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerID="426eb046db78fab56c89aff889362004fa098cd5414dcb913c7b308ca9d27b73" exitCode=0
Dec 05 17:07:13 crc kubenswrapper[4756]: I1205 17:07:13.516203 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqjrl" event={"ID":"61f63f61-e1b5-45d5-b491-ef5ff04bca49","Type":"ContainerDied","Data":"426eb046db78fab56c89aff889362004fa098cd5414dcb913c7b308ca9d27b73"}
Dec 05 17:07:15 crc kubenswrapper[4756]: I1205 17:07:15.543703 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqjrl" event={"ID":"61f63f61-e1b5-45d5-b491-ef5ff04bca49","Type":"ContainerStarted","Data":"b52d832c234453426e56a6dfb35c488359a9040a0e2f78508933ce4fbf4d594b"}
Dec 05 17:07:15 crc kubenswrapper[4756]: I1205 17:07:15.574744 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vqjrl" podStartSLOduration=2.235645939 podStartE2EDuration="5.574717872s" podCreationTimestamp="2025-12-05 17:07:10 +0000 UTC" firstStartedPulling="2025-12-05 17:07:11.493841778 +0000 UTC m=+3091.732515290" lastFinishedPulling="2025-12-05 17:07:14.832913711 +0000 UTC m=+3095.071587223" observedRunningTime="2025-12-05 17:07:15.566872451 +0000 UTC m=+3095.805546013" watchObservedRunningTime="2025-12-05 17:07:15.574717872 +0000 UTC m=+3095.813391414"
Dec 05 17:07:16 crc kubenswrapper[4756]: I1205 17:07:16.395877 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d2dx8"
Dec 05 17:07:16 crc kubenswrapper[4756]: I1205 17:07:16.396205 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d2dx8"
Dec 05 17:07:16 crc kubenswrapper[4756]: I1205 17:07:16.434957 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d2dx8"
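The "Observed pod startup duration" entries above carry enough timestamps to reproduce the tracker's arithmetic: the E2E duration is the watch-observed running time minus pod creation, and the SLO duration subtracts the image-pull window so that pull time does not count against startup. A worked check against the redhat-operators-vqjrl entry, which recovers both logged values (the trailing digits in values like 1.9699719999999998 earlier are just float formatting artifacts):

```go
package main

import (
	"fmt"
	"time"
)

// Recomputes the redhat-operators-vqjrl startup durations from the
// timestamps logged above: E2E = watch-observed running - creation;
// SLO = E2E minus the image-pull window.
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-05 17:07:10 +0000 UTC")
	firstPull := parse("2025-12-05 17:07:11.493841778 +0000 UTC")
	lastPull := parse("2025-12-05 17:07:14.832913711 +0000 UTC")
	running := parse("2025-12-05 17:07:15.574717872 +0000 UTC")

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 5.574717872s
	fmt.Println("podStartSLOduration:", slo) // 2.235645939s
}
```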
Dec 05 17:07:16 crc kubenswrapper[4756]: I1205 17:07:16.589714 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fr8w8"
Dec 05 17:07:16 crc kubenswrapper[4756]: I1205 17:07:16.589787 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fr8w8"
Dec 05 17:07:16 crc kubenswrapper[4756]: I1205 17:07:16.600122 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d2dx8"
Dec 05 17:07:16 crc kubenswrapper[4756]: I1205 17:07:16.634092 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fr8w8"
Dec 05 17:07:17 crc kubenswrapper[4756]: I1205 17:07:17.050735 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90"
Dec 05 17:07:17 crc kubenswrapper[4756]: E1205 17:07:17.051180 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:07:17 crc kubenswrapper[4756]: I1205 17:07:17.599810 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fr8w8"
Dec 05 17:07:18 crc kubenswrapper[4756]: I1205 17:07:18.838310 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2dx8"]
Dec 05 17:07:18 crc kubenswrapper[4756]: I1205 17:07:18.839487 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d2dx8" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerName="registry-server" containerID="cri-o://7174f0207ad678c5108d67f3bc2e73bf305cd91582572b505ee1111d5e25ecd5" gracePeriod=2
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.230878 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fr8w8"]
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.231150 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fr8w8" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerName="registry-server" containerID="cri-o://e02b4854b195c0b2da55f15cc5141ce4f4a8cb3d47000c972eb0e341a1c0a45d" gracePeriod=2
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.368143 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vqjrl"
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.368525 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vqjrl"
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.457550 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vqjrl"
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.588289 4756 generic.go:334] "Generic (PLEG): container finished" podID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerID="7174f0207ad678c5108d67f3bc2e73bf305cd91582572b505ee1111d5e25ecd5" exitCode=0
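Note the contrast in grace periods above: the marketplace registry-server containers are killed with gracePeriod=2, while the machine-config-daemon earlier got gracePeriod=600. The mechanism is the same either way: the runtime delivers SIGTERM, waits out the grace period, and only then force-kills. A rough sketch under plain POSIX process semantics, with sleep standing in for a container process:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// SIGTERM first; SIGKILL only if the process outlives the grace
// period. grace=2s mirrors the registry-server kills above.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	_ = cmd.Process.Signal(syscall.SIGTERM)
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill() // SIGKILL, no more waiting
		<-done
		fmt.Println("grace period expired; force-killed")
	}
}

func main() {
	cmd := exec.Command("sleep", "30") // stand-in for a container process
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGrace(cmd, 2*time.Second)
}
```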
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.588465 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2dx8" event={"ID":"fc74fb2f-749c-40c0-9df6-42b06d55ba4f","Type":"ContainerDied","Data":"7174f0207ad678c5108d67f3bc2e73bf305cd91582572b505ee1111d5e25ecd5"}
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.591292 4756 generic.go:334] "Generic (PLEG): container finished" podID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerID="e02b4854b195c0b2da55f15cc5141ce4f4a8cb3d47000c972eb0e341a1c0a45d" exitCode=0
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.591397 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr8w8" event={"ID":"65a7e955-c704-4aee-b990-4df46b91b2ca","Type":"ContainerDied","Data":"e02b4854b195c0b2da55f15cc5141ce4f4a8cb3d47000c972eb0e341a1c0a45d"}
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.639498 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vqjrl"
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.669946 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fr8w8"
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.856720 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5x9d\" (UniqueName: \"kubernetes.io/projected/65a7e955-c704-4aee-b990-4df46b91b2ca-kube-api-access-g5x9d\") pod \"65a7e955-c704-4aee-b990-4df46b91b2ca\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") "
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.856768 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-utilities\") pod \"65a7e955-c704-4aee-b990-4df46b91b2ca\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") "
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.856794 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-catalog-content\") pod \"65a7e955-c704-4aee-b990-4df46b91b2ca\" (UID: \"65a7e955-c704-4aee-b990-4df46b91b2ca\") "
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.857731 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-utilities" (OuterVolumeSpecName: "utilities") pod "65a7e955-c704-4aee-b990-4df46b91b2ca" (UID: "65a7e955-c704-4aee-b990-4df46b91b2ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.864292 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a7e955-c704-4aee-b990-4df46b91b2ca-kube-api-access-g5x9d" (OuterVolumeSpecName: "kube-api-access-g5x9d") pod "65a7e955-c704-4aee-b990-4df46b91b2ca" (UID: "65a7e955-c704-4aee-b990-4df46b91b2ca"). InnerVolumeSpecName "kube-api-access-g5x9d".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.911452 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65a7e955-c704-4aee-b990-4df46b91b2ca" (UID: "65a7e955-c704-4aee-b990-4df46b91b2ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.958514 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5x9d\" (UniqueName: \"kubernetes.io/projected/65a7e955-c704-4aee-b990-4df46b91b2ca-kube-api-access-g5x9d\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.958560 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:20 crc kubenswrapper[4756]: I1205 17:07:20.958576 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e955-c704-4aee-b990-4df46b91b2ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.137141 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.263302 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-utilities\") pod \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.263389 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-catalog-content\") pod \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.263508 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dttx\" (UniqueName: \"kubernetes.io/projected/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-kube-api-access-4dttx\") pod \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\" (UID: \"fc74fb2f-749c-40c0-9df6-42b06d55ba4f\") " Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.264388 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-utilities" (OuterVolumeSpecName: "utilities") pod "fc74fb2f-749c-40c0-9df6-42b06d55ba4f" (UID: "fc74fb2f-749c-40c0-9df6-42b06d55ba4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.268487 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-kube-api-access-4dttx" (OuterVolumeSpecName: "kube-api-access-4dttx") pod "fc74fb2f-749c-40c0-9df6-42b06d55ba4f" (UID: "fc74fb2f-749c-40c0-9df6-42b06d55ba4f"). InnerVolumeSpecName "kube-api-access-4dttx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.310931 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc74fb2f-749c-40c0-9df6-42b06d55ba4f" (UID: "fc74fb2f-749c-40c0-9df6-42b06d55ba4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.365376 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dttx\" (UniqueName: \"kubernetes.io/projected/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-kube-api-access-4dttx\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.365487 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.365501 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc74fb2f-749c-40c0-9df6-42b06d55ba4f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.601562 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr8w8" event={"ID":"65a7e955-c704-4aee-b990-4df46b91b2ca","Type":"ContainerDied","Data":"030f3c163ab899847a9f2b4167e8f69ee4d8e61ae169fb186e4db81519450136"} Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.601598 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fr8w8" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.601624 4756 scope.go:117] "RemoveContainer" containerID="e02b4854b195c0b2da55f15cc5141ce4f4a8cb3d47000c972eb0e341a1c0a45d" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.622129 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2dx8" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.624690 4756 scope.go:117] "RemoveContainer" containerID="00c239c2dc2018b7f2aebaba0cfc1a7bf2f22534708627af4947d0efd66180ef" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.617043 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2dx8" event={"ID":"fc74fb2f-749c-40c0-9df6-42b06d55ba4f","Type":"ContainerDied","Data":"df0b976e4e81b3068c20647e08705cf3310eaa5e0a94e9d53f6beb01f6f05ff7"} Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.678623 4756 scope.go:117] "RemoveContainer" containerID="449cbd66bd10045bb70cce68f426606eff0682b23c7ffee73b680a0c89a32fa9" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.702071 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fr8w8"] Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.730244 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fr8w8"] Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.733000 4756 scope.go:117] "RemoveContainer" containerID="7174f0207ad678c5108d67f3bc2e73bf305cd91582572b505ee1111d5e25ecd5" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.736185 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2dx8"] Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.743046 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d2dx8"] Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.748326 4756 scope.go:117] "RemoveContainer" containerID="752532e6a1fbca8a88a0ba993c5eb92d83d70361232381025aca2d7bb619127e" Dec 05 17:07:21 crc kubenswrapper[4756]: I1205 17:07:21.767589 4756 scope.go:117] "RemoveContainer" containerID="dbc71d1e874f9fafeb471276ab3c3be7ce7dbfb76b442b2dc03a0150bad9be28" Dec 05 17:07:22 crc kubenswrapper[4756]: I1205 17:07:22.064408 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" path="/var/lib/kubelet/pods/65a7e955-c704-4aee-b990-4df46b91b2ca/volumes" Dec 05 17:07:22 crc kubenswrapper[4756]: I1205 17:07:22.065928 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" path="/var/lib/kubelet/pods/fc74fb2f-749c-40c0-9df6-42b06d55ba4f/volumes" Dec 05 17:07:23 crc kubenswrapper[4756]: I1205 17:07:23.632569 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vqjrl"] Dec 05 17:07:23 crc kubenswrapper[4756]: I1205 17:07:23.633204 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vqjrl" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerName="registry-server" containerID="cri-o://b52d832c234453426e56a6dfb35c488359a9040a0e2f78508933ce4fbf4d594b" gracePeriod=2 Dec 05 17:07:25 crc kubenswrapper[4756]: I1205 17:07:25.657037 4756 generic.go:334] "Generic (PLEG): container finished" podID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerID="b52d832c234453426e56a6dfb35c488359a9040a0e2f78508933ce4fbf4d594b" exitCode=0 Dec 05 17:07:25 crc kubenswrapper[4756]: I1205 17:07:25.657373 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqjrl" 
event={"ID":"61f63f61-e1b5-45d5-b491-ef5ff04bca49","Type":"ContainerDied","Data":"b52d832c234453426e56a6dfb35c488359a9040a0e2f78508933ce4fbf4d594b"} Dec 05 17:07:25 crc kubenswrapper[4756]: I1205 17:07:25.837686 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:25 crc kubenswrapper[4756]: I1205 17:07:25.943717 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-catalog-content\") pod \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " Dec 05 17:07:25 crc kubenswrapper[4756]: I1205 17:07:25.943851 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-utilities\") pod \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " Dec 05 17:07:25 crc kubenswrapper[4756]: I1205 17:07:25.943923 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkrdl\" (UniqueName: \"kubernetes.io/projected/61f63f61-e1b5-45d5-b491-ef5ff04bca49-kube-api-access-vkrdl\") pod \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\" (UID: \"61f63f61-e1b5-45d5-b491-ef5ff04bca49\") " Dec 05 17:07:25 crc kubenswrapper[4756]: I1205 17:07:25.945295 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-utilities" (OuterVolumeSpecName: "utilities") pod "61f63f61-e1b5-45d5-b491-ef5ff04bca49" (UID: "61f63f61-e1b5-45d5-b491-ef5ff04bca49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:07:25 crc kubenswrapper[4756]: I1205 17:07:25.951118 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f63f61-e1b5-45d5-b491-ef5ff04bca49-kube-api-access-vkrdl" (OuterVolumeSpecName: "kube-api-access-vkrdl") pod "61f63f61-e1b5-45d5-b491-ef5ff04bca49" (UID: "61f63f61-e1b5-45d5-b491-ef5ff04bca49"). InnerVolumeSpecName "kube-api-access-vkrdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.045723 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.045772 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkrdl\" (UniqueName: \"kubernetes.io/projected/61f63f61-e1b5-45d5-b491-ef5ff04bca49-kube-api-access-vkrdl\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.070294 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61f63f61-e1b5-45d5-b491-ef5ff04bca49" (UID: "61f63f61-e1b5-45d5-b491-ef5ff04bca49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.147205 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f63f61-e1b5-45d5-b491-ef5ff04bca49-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.666516 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqjrl" event={"ID":"61f63f61-e1b5-45d5-b491-ef5ff04bca49","Type":"ContainerDied","Data":"59684ceb5cc3cf05855dffd82b942ad03aa70f56112d88f9df10e94004bd2963"} Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.666586 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqjrl" Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.667270 4756 scope.go:117] "RemoveContainer" containerID="b52d832c234453426e56a6dfb35c488359a9040a0e2f78508933ce4fbf4d594b" Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.699782 4756 scope.go:117] "RemoveContainer" containerID="426eb046db78fab56c89aff889362004fa098cd5414dcb913c7b308ca9d27b73" Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.705288 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vqjrl"] Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.710445 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vqjrl"] Dec 05 17:07:26 crc kubenswrapper[4756]: I1205 17:07:26.752004 4756 scope.go:117] "RemoveContainer" containerID="98117712cfa5cb58292bd20e00907ef248c0d4d2928aff36df78af9489c8d3bc" Dec 05 17:07:28 crc kubenswrapper[4756]: I1205 17:07:28.060316 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" path="/var/lib/kubelet/pods/61f63f61-e1b5-45d5-b491-ef5ff04bca49/volumes" Dec 05 17:07:32 crc kubenswrapper[4756]: I1205 17:07:32.050366 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:07:32 crc kubenswrapper[4756]: E1205 17:07:32.051190 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:07:45 crc kubenswrapper[4756]: I1205 17:07:45.050224 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:07:45 crc kubenswrapper[4756]: E1205 17:07:45.051396 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:07:56 crc kubenswrapper[4756]: I1205 17:07:56.049361 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:07:56 crc kubenswrapper[4756]: E1205 17:07:56.050147 
4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:08:11 crc kubenswrapper[4756]: I1205 17:08:11.050737 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:08:11 crc kubenswrapper[4756]: E1205 17:08:11.052091 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:08:24 crc kubenswrapper[4756]: I1205 17:08:24.049602 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:08:24 crc kubenswrapper[4756]: E1205 17:08:24.050919 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:08:35 crc kubenswrapper[4756]: I1205 17:08:35.049938 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:08:35 crc kubenswrapper[4756]: E1205 17:08:35.051264 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:08:50 crc kubenswrapper[4756]: I1205 17:08:50.056954 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:08:50 crc kubenswrapper[4756]: E1205 17:08:50.057900 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:09:01 crc kubenswrapper[4756]: I1205 17:09:01.049451 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:09:01 crc kubenswrapper[4756]: E1205 17:09:01.050292 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:09:14 crc kubenswrapper[4756]: I1205 17:09:14.049890 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:09:14 crc kubenswrapper[4756]: E1205 17:09:14.051156 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:09:26 crc kubenswrapper[4756]: I1205 17:09:26.049609 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:09:26 crc kubenswrapper[4756]: E1205 17:09:26.050407 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:09:41 crc kubenswrapper[4756]: I1205 17:09:41.049935 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:09:41 crc kubenswrapper[4756]: E1205 17:09:41.050717 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:09:56 crc kubenswrapper[4756]: I1205 17:09:56.050574 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:09:56 crc kubenswrapper[4756]: E1205 17:09:56.051828 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:10:07 crc kubenswrapper[4756]: I1205 17:10:07.049884 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:10:07 crc kubenswrapper[4756]: E1205 17:10:07.050757 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:10:19 crc kubenswrapper[4756]: I1205 17:10:19.049055 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:10:19 crc kubenswrapper[4756]: E1205 17:10:19.050055 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:10:30 crc kubenswrapper[4756]: I1205 17:10:30.054515 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:10:30 crc kubenswrapper[4756]: E1205 17:10:30.055215 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:10:41 crc kubenswrapper[4756]: I1205 17:10:41.049525 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:10:41 crc kubenswrapper[4756]: E1205 17:10:41.050491 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:10:54 crc kubenswrapper[4756]: I1205 17:10:54.050859 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:10:54 crc kubenswrapper[4756]: E1205 17:10:54.053563 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:11:05 crc kubenswrapper[4756]: I1205 17:11:05.049470 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:11:05 crc kubenswrapper[4756]: E1205 17:11:05.050204 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:11:20 crc kubenswrapper[4756]: I1205 17:11:20.050196 4756 
scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:11:20 crc kubenswrapper[4756]: E1205 17:11:20.051563 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:11:34 crc kubenswrapper[4756]: I1205 17:11:34.049614 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:11:34 crc kubenswrapper[4756]: E1205 17:11:34.050475 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:11:47 crc kubenswrapper[4756]: I1205 17:11:47.050919 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:11:47 crc kubenswrapper[4756]: E1205 17:11:47.052618 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:11:58 crc kubenswrapper[4756]: I1205 17:11:58.050102 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:11:58 crc kubenswrapper[4756]: I1205 17:11:58.938828 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"fb829834883dcdc384034349ec5b152a5e50be349cace05fa772c48c6be30b43"} Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.618310 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4lkvh"] Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.619771 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerName="extract-utilities" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.619806 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerName="extract-utilities" Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.619838 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerName="extract-utilities" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.619856 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerName="extract-utilities" Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.619890 4756 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.619907 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.619941 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerName="extract-content" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.619957 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerName="extract-content" Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.619997 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.620014 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.620048 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerName="extract-utilities" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.620065 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerName="extract-utilities" Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.620095 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerName="extract-content" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.620112 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerName="extract-content" Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.620131 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.620148 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: E1205 17:14:05.620179 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerName="extract-content" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.620196 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerName="extract-content" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.620606 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a7e955-c704-4aee-b990-4df46b91b2ca" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.620656 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f63f61-e1b5-45d5-b491-ef5ff04bca49" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.620703 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc74fb2f-749c-40c0-9df6-42b06d55ba4f" containerName="registry-server" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.623393 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.629176 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lkvh"] Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.756489 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-utilities\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.756563 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcm8s\" (UniqueName: \"kubernetes.io/projected/29a95529-088f-4d96-becc-5efff3ef0671-kube-api-access-rcm8s\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.756798 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-catalog-content\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.858410 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-catalog-content\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.858503 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-utilities\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.858521 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcm8s\" (UniqueName: \"kubernetes.io/projected/29a95529-088f-4d96-becc-5efff3ef0671-kube-api-access-rcm8s\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.859070 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-catalog-content\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.859196 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-utilities\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.879040 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rcm8s\" (UniqueName: \"kubernetes.io/projected/29a95529-088f-4d96-becc-5efff3ef0671-kube-api-access-rcm8s\") pod \"redhat-marketplace-4lkvh\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:05 crc kubenswrapper[4756]: I1205 17:14:05.957771 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:06 crc kubenswrapper[4756]: I1205 17:14:06.525709 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lkvh"] Dec 05 17:14:06 crc kubenswrapper[4756]: W1205 17:14:06.532394 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29a95529_088f_4d96_becc_5efff3ef0671.slice/crio-10a5e7f45e84c126321a4df8343572f6c46071ac5e198e3eff450a410a53e8ff WatchSource:0}: Error finding container 10a5e7f45e84c126321a4df8343572f6c46071ac5e198e3eff450a410a53e8ff: Status 404 returned error can't find the container with id 10a5e7f45e84c126321a4df8343572f6c46071ac5e198e3eff450a410a53e8ff Dec 05 17:14:07 crc kubenswrapper[4756]: I1205 17:14:07.099178 4756 generic.go:334] "Generic (PLEG): container finished" podID="29a95529-088f-4d96-becc-5efff3ef0671" containerID="7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4" exitCode=0 Dec 05 17:14:07 crc kubenswrapper[4756]: I1205 17:14:07.099307 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lkvh" event={"ID":"29a95529-088f-4d96-becc-5efff3ef0671","Type":"ContainerDied","Data":"7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4"} Dec 05 17:14:07 crc kubenswrapper[4756]: I1205 17:14:07.099554 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lkvh" event={"ID":"29a95529-088f-4d96-becc-5efff3ef0671","Type":"ContainerStarted","Data":"10a5e7f45e84c126321a4df8343572f6c46071ac5e198e3eff450a410a53e8ff"} Dec 05 17:14:07 crc kubenswrapper[4756]: I1205 17:14:07.101999 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 17:14:09 crc kubenswrapper[4756]: I1205 17:14:09.116002 4756 generic.go:334] "Generic (PLEG): container finished" podID="29a95529-088f-4d96-becc-5efff3ef0671" containerID="ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da" exitCode=0 Dec 05 17:14:09 crc kubenswrapper[4756]: I1205 17:14:09.116092 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lkvh" event={"ID":"29a95529-088f-4d96-becc-5efff3ef0671","Type":"ContainerDied","Data":"ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da"} Dec 05 17:14:10 crc kubenswrapper[4756]: I1205 17:14:10.128008 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lkvh" event={"ID":"29a95529-088f-4d96-becc-5efff3ef0671","Type":"ContainerStarted","Data":"2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a"} Dec 05 17:14:10 crc kubenswrapper[4756]: I1205 17:14:10.154295 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4lkvh" podStartSLOduration=2.698521731 podStartE2EDuration="5.154269783s" podCreationTimestamp="2025-12-05 17:14:05 +0000 UTC" firstStartedPulling="2025-12-05 17:14:07.101709183 +0000 UTC m=+3507.340382695" 
lastFinishedPulling="2025-12-05 17:14:09.557457225 +0000 UTC m=+3509.796130747" observedRunningTime="2025-12-05 17:14:10.151035715 +0000 UTC m=+3510.389709247" watchObservedRunningTime="2025-12-05 17:14:10.154269783 +0000 UTC m=+3510.392943305" Dec 05 17:14:15 crc kubenswrapper[4756]: I1205 17:14:15.958564 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:15 crc kubenswrapper[4756]: I1205 17:14:15.959277 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:16 crc kubenswrapper[4756]: I1205 17:14:16.035586 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:16 crc kubenswrapper[4756]: I1205 17:14:16.225585 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:16 crc kubenswrapper[4756]: I1205 17:14:16.279416 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lkvh"] Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.191676 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4lkvh" podUID="29a95529-088f-4d96-becc-5efff3ef0671" containerName="registry-server" containerID="cri-o://2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a" gracePeriod=2 Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.652246 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.745947 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-catalog-content\") pod \"29a95529-088f-4d96-becc-5efff3ef0671\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.746377 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcm8s\" (UniqueName: \"kubernetes.io/projected/29a95529-088f-4d96-becc-5efff3ef0671-kube-api-access-rcm8s\") pod \"29a95529-088f-4d96-becc-5efff3ef0671\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.746461 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-utilities\") pod \"29a95529-088f-4d96-becc-5efff3ef0671\" (UID: \"29a95529-088f-4d96-becc-5efff3ef0671\") " Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.747640 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-utilities" (OuterVolumeSpecName: "utilities") pod "29a95529-088f-4d96-becc-5efff3ef0671" (UID: "29a95529-088f-4d96-becc-5efff3ef0671"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.753904 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a95529-088f-4d96-becc-5efff3ef0671-kube-api-access-rcm8s" (OuterVolumeSpecName: "kube-api-access-rcm8s") pod "29a95529-088f-4d96-becc-5efff3ef0671" (UID: "29a95529-088f-4d96-becc-5efff3ef0671"). InnerVolumeSpecName "kube-api-access-rcm8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.772826 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29a95529-088f-4d96-becc-5efff3ef0671" (UID: "29a95529-088f-4d96-becc-5efff3ef0671"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.848976 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.849013 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a95529-088f-4d96-becc-5efff3ef0671-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:14:18 crc kubenswrapper[4756]: I1205 17:14:18.849025 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcm8s\" (UniqueName: \"kubernetes.io/projected/29a95529-088f-4d96-becc-5efff3ef0671-kube-api-access-rcm8s\") on node \"crc\" DevicePath \"\"" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.206853 4756 generic.go:334] "Generic (PLEG): container finished" podID="29a95529-088f-4d96-becc-5efff3ef0671" containerID="2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a" exitCode=0 Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.206912 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lkvh" event={"ID":"29a95529-088f-4d96-becc-5efff3ef0671","Type":"ContainerDied","Data":"2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a"} Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.206950 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lkvh" event={"ID":"29a95529-088f-4d96-becc-5efff3ef0671","Type":"ContainerDied","Data":"10a5e7f45e84c126321a4df8343572f6c46071ac5e198e3eff450a410a53e8ff"} Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.206970 4756 scope.go:117] "RemoveContainer" containerID="2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.207439 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lkvh" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.226783 4756 scope.go:117] "RemoveContainer" containerID="ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.247178 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lkvh"] Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.255206 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lkvh"] Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.264999 4756 scope.go:117] "RemoveContainer" containerID="7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.279816 4756 scope.go:117] "RemoveContainer" containerID="2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a" Dec 05 17:14:19 crc kubenswrapper[4756]: E1205 17:14:19.280190 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a\": container with ID starting with 2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a not found: ID does not exist" containerID="2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.280221 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a"} err="failed to get container status \"2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a\": rpc error: code = NotFound desc = could not find container \"2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a\": container with ID starting with 2b9c42b58e7f3b8727ac09af4c0c92518dc4b4b2a7233974efa9bda44870db2a not found: ID does not exist" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.280248 4756 scope.go:117] "RemoveContainer" containerID="ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da" Dec 05 17:14:19 crc kubenswrapper[4756]: E1205 17:14:19.280636 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da\": container with ID starting with ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da not found: ID does not exist" containerID="ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.280684 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da"} err="failed to get container status \"ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da\": rpc error: code = NotFound desc = could not find container \"ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da\": container with ID starting with ad99a724b1ca21c2e4994d650fe727ba0c3390021d92dbeffb618c845fcbf4da not found: ID does not exist" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.280710 4756 scope.go:117] "RemoveContainer" containerID="7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4" Dec 05 17:14:19 crc kubenswrapper[4756]: E1205 17:14:19.281184 4756 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4\": container with ID starting with 7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4 not found: ID does not exist" containerID="7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4" Dec 05 17:14:19 crc kubenswrapper[4756]: I1205 17:14:19.281207 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4"} err="failed to get container status \"7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4\": rpc error: code = NotFound desc = could not find container \"7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4\": container with ID starting with 7b060c1ad6470a3b221c110b6e069cb257b846d57d7ffed64c9223c3b29d2ae4 not found: ID does not exist" Dec 05 17:14:20 crc kubenswrapper[4756]: I1205 17:14:20.073612 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a95529-088f-4d96-becc-5efff3ef0671" path="/var/lib/kubelet/pods/29a95529-088f-4d96-becc-5efff3ef0671/volumes" Dec 05 17:14:22 crc kubenswrapper[4756]: I1205 17:14:22.295668 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:14:22 crc kubenswrapper[4756]: I1205 17:14:22.295789 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:14:52 crc kubenswrapper[4756]: I1205 17:14:52.295877 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:14:52 crc kubenswrapper[4756]: I1205 17:14:52.298142 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.164098 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p"] Dec 05 17:15:00 crc kubenswrapper[4756]: E1205 17:15:00.165032 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a95529-088f-4d96-becc-5efff3ef0671" containerName="registry-server" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.165052 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a95529-088f-4d96-becc-5efff3ef0671" containerName="registry-server" Dec 05 17:15:00 crc kubenswrapper[4756]: E1205 17:15:00.165072 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a95529-088f-4d96-becc-5efff3ef0671" containerName="extract-utilities" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.165080 4756 
state_mem.go:107] "Deleted CPUSet assignment" podUID="29a95529-088f-4d96-becc-5efff3ef0671" containerName="extract-utilities" Dec 05 17:15:00 crc kubenswrapper[4756]: E1205 17:15:00.165095 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a95529-088f-4d96-becc-5efff3ef0671" containerName="extract-content" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.165101 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a95529-088f-4d96-becc-5efff3ef0671" containerName="extract-content" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.165299 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a95529-088f-4d96-becc-5efff3ef0671" containerName="registry-server" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.165880 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.170626 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.170776 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.178514 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p"] Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.180454 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgj97\" (UniqueName: \"kubernetes.io/projected/38f37263-1348-4de5-924d-13e6b4d4c921-kube-api-access-vgj97\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.181108 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f37263-1348-4de5-924d-13e6b4d4c921-secret-volume\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.181449 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f37263-1348-4de5-924d-13e6b4d4c921-config-volume\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.283526 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f37263-1348-4de5-924d-13e6b4d4c921-config-volume\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.283931 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgj97\" (UniqueName: 
\"kubernetes.io/projected/38f37263-1348-4de5-924d-13e6b4d4c921-kube-api-access-vgj97\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.284054 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f37263-1348-4de5-924d-13e6b4d4c921-secret-volume\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.284437 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f37263-1348-4de5-924d-13e6b4d4c921-config-volume\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.289761 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f37263-1348-4de5-924d-13e6b4d4c921-secret-volume\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.299861 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgj97\" (UniqueName: \"kubernetes.io/projected/38f37263-1348-4de5-924d-13e6b4d4c921-kube-api-access-vgj97\") pod \"collect-profiles-29415915-6kx7p\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.504607 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:00 crc kubenswrapper[4756]: I1205 17:15:00.988880 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p"] Dec 05 17:15:01 crc kubenswrapper[4756]: I1205 17:15:01.573132 4756 generic.go:334] "Generic (PLEG): container finished" podID="38f37263-1348-4de5-924d-13e6b4d4c921" containerID="f75fde1eb5ed3887d9da5236335adfcaec29bb57e53fc9f01cbc3cdb3751942a" exitCode=0 Dec 05 17:15:01 crc kubenswrapper[4756]: I1205 17:15:01.573174 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" event={"ID":"38f37263-1348-4de5-924d-13e6b4d4c921","Type":"ContainerDied","Data":"f75fde1eb5ed3887d9da5236335adfcaec29bb57e53fc9f01cbc3cdb3751942a"} Dec 05 17:15:01 crc kubenswrapper[4756]: I1205 17:15:01.573528 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" event={"ID":"38f37263-1348-4de5-924d-13e6b4d4c921","Type":"ContainerStarted","Data":"ae9fb3e08f6737af1a7e9728b04df4695d83937eaf2632125634bd9c8265bb4f"} Dec 05 17:15:02 crc kubenswrapper[4756]: I1205 17:15:02.906587 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.026580 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgj97\" (UniqueName: \"kubernetes.io/projected/38f37263-1348-4de5-924d-13e6b4d4c921-kube-api-access-vgj97\") pod \"38f37263-1348-4de5-924d-13e6b4d4c921\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.026650 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f37263-1348-4de5-924d-13e6b4d4c921-config-volume\") pod \"38f37263-1348-4de5-924d-13e6b4d4c921\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.026778 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f37263-1348-4de5-924d-13e6b4d4c921-secret-volume\") pod \"38f37263-1348-4de5-924d-13e6b4d4c921\" (UID: \"38f37263-1348-4de5-924d-13e6b4d4c921\") " Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.028094 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f37263-1348-4de5-924d-13e6b4d4c921-config-volume" (OuterVolumeSpecName: "config-volume") pod "38f37263-1348-4de5-924d-13e6b4d4c921" (UID: "38f37263-1348-4de5-924d-13e6b4d4c921"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.039869 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f37263-1348-4de5-924d-13e6b4d4c921-kube-api-access-vgj97" (OuterVolumeSpecName: "kube-api-access-vgj97") pod "38f37263-1348-4de5-924d-13e6b4d4c921" (UID: "38f37263-1348-4de5-924d-13e6b4d4c921"). InnerVolumeSpecName "kube-api-access-vgj97". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.041764 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f37263-1348-4de5-924d-13e6b4d4c921-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "38f37263-1348-4de5-924d-13e6b4d4c921" (UID: "38f37263-1348-4de5-924d-13e6b4d4c921"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.129175 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f37263-1348-4de5-924d-13e6b4d4c921-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.129256 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgj97\" (UniqueName: \"kubernetes.io/projected/38f37263-1348-4de5-924d-13e6b4d4c921-kube-api-access-vgj97\") on node \"crc\" DevicePath \"\"" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.129292 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f37263-1348-4de5-924d-13e6b4d4c921-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.599125 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" event={"ID":"38f37263-1348-4de5-924d-13e6b4d4c921","Type":"ContainerDied","Data":"ae9fb3e08f6737af1a7e9728b04df4695d83937eaf2632125634bd9c8265bb4f"} Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.599162 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae9fb3e08f6737af1a7e9728b04df4695d83937eaf2632125634bd9c8265bb4f" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.599743 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p" Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.982930 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j"] Dec 05 17:15:03 crc kubenswrapper[4756]: I1205 17:15:03.990050 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415870-t5g2j"] Dec 05 17:15:04 crc kubenswrapper[4756]: I1205 17:15:04.064316 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575bbbf2-c59a-42c7-8404-4e907139734d" path="/var/lib/kubelet/pods/575bbbf2-c59a-42c7-8404-4e907139734d/volumes" Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.295774 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.296566 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.296636 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.297590 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb829834883dcdc384034349ec5b152a5e50be349cace05fa772c48c6be30b43"} 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.297689 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://fb829834883dcdc384034349ec5b152a5e50be349cace05fa772c48c6be30b43" gracePeriod=600 Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.780948 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="fb829834883dcdc384034349ec5b152a5e50be349cace05fa772c48c6be30b43" exitCode=0 Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.781013 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"fb829834883dcdc384034349ec5b152a5e50be349cace05fa772c48c6be30b43"} Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.781530 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc"} Dec 05 17:15:22 crc kubenswrapper[4756]: I1205 17:15:22.781562 4756 scope.go:117] "RemoveContainer" containerID="49506a9bdc53ac30d67217ca5d82bfa3a9f85392c5b27d05e0c45ef828366f90" Dec 05 17:15:57 crc kubenswrapper[4756]: I1205 17:15:57.745463 4756 scope.go:117] "RemoveContainer" containerID="4aaee59e97e1eb51ba43c6a0b78c62b9c980ef955404c0c0b81ac0064035dff5" Dec 05 17:17:22 crc kubenswrapper[4756]: I1205 17:17:22.295618 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:17:22 crc kubenswrapper[4756]: I1205 17:17:22.296835 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.476572 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8hr2d"] Dec 05 17:17:30 crc kubenswrapper[4756]: E1205 17:17:30.479485 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f37263-1348-4de5-924d-13e6b4d4c921" containerName="collect-profiles" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.479676 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f37263-1348-4de5-924d-13e6b4d4c921" containerName="collect-profiles" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.480074 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f37263-1348-4de5-924d-13e6b4d4c921" containerName="collect-profiles" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.482508 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hr2d"] Dec 05 17:17:30 crc 
kubenswrapper[4756]: I1205 17:17:30.482679 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.599167 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nps24\" (UniqueName: \"kubernetes.io/projected/11248b14-87bb-4cd3-9877-543252b70d96-kube-api-access-nps24\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.599230 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-catalog-content\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.599254 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-utilities\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.700662 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nps24\" (UniqueName: \"kubernetes.io/projected/11248b14-87bb-4cd3-9877-543252b70d96-kube-api-access-nps24\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.700709 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-catalog-content\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.700745 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-utilities\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.701366 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-utilities\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.701793 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-catalog-content\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.724610 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nps24\" (UniqueName: 
\"kubernetes.io/projected/11248b14-87bb-4cd3-9877-543252b70d96-kube-api-access-nps24\") pod \"community-operators-8hr2d\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:30 crc kubenswrapper[4756]: I1205 17:17:30.810266 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:31 crc kubenswrapper[4756]: I1205 17:17:31.350444 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hr2d"] Dec 05 17:17:31 crc kubenswrapper[4756]: I1205 17:17:31.947713 4756 generic.go:334] "Generic (PLEG): container finished" podID="11248b14-87bb-4cd3-9877-543252b70d96" containerID="22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f" exitCode=0 Dec 05 17:17:31 crc kubenswrapper[4756]: I1205 17:17:31.947825 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hr2d" event={"ID":"11248b14-87bb-4cd3-9877-543252b70d96","Type":"ContainerDied","Data":"22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f"} Dec 05 17:17:31 crc kubenswrapper[4756]: I1205 17:17:31.948555 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hr2d" event={"ID":"11248b14-87bb-4cd3-9877-543252b70d96","Type":"ContainerStarted","Data":"e09ca6579ac96c5a9f6f39f9b54789255a349c89fe630d047bf081f6d449af91"} Dec 05 17:17:32 crc kubenswrapper[4756]: I1205 17:17:32.959674 4756 generic.go:334] "Generic (PLEG): container finished" podID="11248b14-87bb-4cd3-9877-543252b70d96" containerID="6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93" exitCode=0 Dec 05 17:17:32 crc kubenswrapper[4756]: I1205 17:17:32.959921 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hr2d" event={"ID":"11248b14-87bb-4cd3-9877-543252b70d96","Type":"ContainerDied","Data":"6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93"} Dec 05 17:17:33 crc kubenswrapper[4756]: I1205 17:17:33.970046 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hr2d" event={"ID":"11248b14-87bb-4cd3-9877-543252b70d96","Type":"ContainerStarted","Data":"2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3"} Dec 05 17:17:34 crc kubenswrapper[4756]: I1205 17:17:34.008347 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8hr2d" podStartSLOduration=2.554832847 podStartE2EDuration="4.008316047s" podCreationTimestamp="2025-12-05 17:17:30 +0000 UTC" firstStartedPulling="2025-12-05 17:17:31.949252353 +0000 UTC m=+3712.187925875" lastFinishedPulling="2025-12-05 17:17:33.402735563 +0000 UTC m=+3713.641409075" observedRunningTime="2025-12-05 17:17:33.998499784 +0000 UTC m=+3714.237173346" watchObservedRunningTime="2025-12-05 17:17:34.008316047 +0000 UTC m=+3714.246989599" Dec 05 17:17:40 crc kubenswrapper[4756]: I1205 17:17:40.810966 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:40 crc kubenswrapper[4756]: I1205 17:17:40.811632 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:40 crc kubenswrapper[4756]: I1205 17:17:40.868578 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:41 crc kubenswrapper[4756]: I1205 17:17:41.082038 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:41 crc kubenswrapper[4756]: I1205 17:17:41.135976 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hr2d"] Dec 05 17:17:43 crc kubenswrapper[4756]: I1205 17:17:43.038361 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8hr2d" podUID="11248b14-87bb-4cd3-9877-543252b70d96" containerName="registry-server" containerID="cri-o://2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3" gracePeriod=2 Dec 05 17:17:43 crc kubenswrapper[4756]: I1205 17:17:43.961509 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.052370 4756 generic.go:334] "Generic (PLEG): container finished" podID="11248b14-87bb-4cd3-9877-543252b70d96" containerID="2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3" exitCode=0 Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.052572 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hr2d" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.062278 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hr2d" event={"ID":"11248b14-87bb-4cd3-9877-543252b70d96","Type":"ContainerDied","Data":"2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3"} Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.062361 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hr2d" event={"ID":"11248b14-87bb-4cd3-9877-543252b70d96","Type":"ContainerDied","Data":"e09ca6579ac96c5a9f6f39f9b54789255a349c89fe630d047bf081f6d449af91"} Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.062398 4756 scope.go:117] "RemoveContainer" containerID="2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.083465 4756 scope.go:117] "RemoveContainer" containerID="6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.099632 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-catalog-content\") pod \"11248b14-87bb-4cd3-9877-543252b70d96\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.099797 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nps24\" (UniqueName: \"kubernetes.io/projected/11248b14-87bb-4cd3-9877-543252b70d96-kube-api-access-nps24\") pod \"11248b14-87bb-4cd3-9877-543252b70d96\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.099807 4756 scope.go:117] "RemoveContainer" containerID="22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.099857 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-utilities\") pod \"11248b14-87bb-4cd3-9877-543252b70d96\" (UID: \"11248b14-87bb-4cd3-9877-543252b70d96\") " Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.101216 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-utilities" (OuterVolumeSpecName: "utilities") pod "11248b14-87bb-4cd3-9877-543252b70d96" (UID: "11248b14-87bb-4cd3-9877-543252b70d96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.107904 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11248b14-87bb-4cd3-9877-543252b70d96-kube-api-access-nps24" (OuterVolumeSpecName: "kube-api-access-nps24") pod "11248b14-87bb-4cd3-9877-543252b70d96" (UID: "11248b14-87bb-4cd3-9877-543252b70d96"). InnerVolumeSpecName "kube-api-access-nps24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.143812 4756 scope.go:117] "RemoveContainer" containerID="2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3" Dec 05 17:17:44 crc kubenswrapper[4756]: E1205 17:17:44.144376 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3\": container with ID starting with 2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3 not found: ID does not exist" containerID="2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.144486 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3"} err="failed to get container status \"2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3\": rpc error: code = NotFound desc = could not find container \"2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3\": container with ID starting with 2c994a9bbd864452bd9ac5f05567692ce29349b94fdb31873cd972161887aec3 not found: ID does not exist" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.144522 4756 scope.go:117] "RemoveContainer" containerID="6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93" Dec 05 17:17:44 crc kubenswrapper[4756]: E1205 17:17:44.144959 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93\": container with ID starting with 6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93 not found: ID does not exist" containerID="6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.145010 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93"} err="failed to get container status \"6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93\": rpc error: code = NotFound desc = could not find container \"6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93\": container with ID starting with 6bcb8a4924e69ce8b8ddafd2c4cb5d0c13ed637055901dd582dc70d6d5da0e93 not found: ID does not exist" Dec 05 17:17:44 crc 
kubenswrapper[4756]: I1205 17:17:44.145027 4756 scope.go:117] "RemoveContainer" containerID="22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f" Dec 05 17:17:44 crc kubenswrapper[4756]: E1205 17:17:44.145465 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f\": container with ID starting with 22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f not found: ID does not exist" containerID="22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.145502 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f"} err="failed to get container status \"22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f\": rpc error: code = NotFound desc = could not find container \"22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f\": container with ID starting with 22246ae53056e26ce1da5d0d276578956c9e23e148b3a3411c48e91b98a05c7f not found: ID does not exist" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.174575 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11248b14-87bb-4cd3-9877-543252b70d96" (UID: "11248b14-87bb-4cd3-9877-543252b70d96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.201592 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nps24\" (UniqueName: \"kubernetes.io/projected/11248b14-87bb-4cd3-9877-543252b70d96-kube-api-access-nps24\") on node \"crc\" DevicePath \"\"" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.201650 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.201668 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11248b14-87bb-4cd3-9877-543252b70d96-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.390147 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hr2d"] Dec 05 17:17:44 crc kubenswrapper[4756]: I1205 17:17:44.397613 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8hr2d"] Dec 05 17:17:46 crc kubenswrapper[4756]: I1205 17:17:46.069593 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11248b14-87bb-4cd3-9877-543252b70d96" path="/var/lib/kubelet/pods/11248b14-87bb-4cd3-9877-543252b70d96/volumes" Dec 05 17:17:52 crc kubenswrapper[4756]: I1205 17:17:52.295302 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:17:52 crc kubenswrapper[4756]: I1205 17:17:52.296822 4756 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.027286 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s9lq9"] Dec 05 17:18:17 crc kubenswrapper[4756]: E1205 17:18:17.028455 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11248b14-87bb-4cd3-9877-543252b70d96" containerName="extract-content" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.028478 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="11248b14-87bb-4cd3-9877-543252b70d96" containerName="extract-content" Dec 05 17:18:17 crc kubenswrapper[4756]: E1205 17:18:17.028501 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11248b14-87bb-4cd3-9877-543252b70d96" containerName="registry-server" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.028511 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="11248b14-87bb-4cd3-9877-543252b70d96" containerName="registry-server" Dec 05 17:18:17 crc kubenswrapper[4756]: E1205 17:18:17.028533 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11248b14-87bb-4cd3-9877-543252b70d96" containerName="extract-utilities" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.028544 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="11248b14-87bb-4cd3-9877-543252b70d96" containerName="extract-utilities" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.028854 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="11248b14-87bb-4cd3-9877-543252b70d96" containerName="registry-server" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.030708 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.037254 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9lq9"] Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.173069 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-catalog-content\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.173215 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpk6z\" (UniqueName: \"kubernetes.io/projected/d5debc48-2b02-4d74-9873-1c689e65ce9c-kube-api-access-dpk6z\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.173247 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-utilities\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.274070 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-catalog-content\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.274194 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpk6z\" (UniqueName: \"kubernetes.io/projected/d5debc48-2b02-4d74-9873-1c689e65ce9c-kube-api-access-dpk6z\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.274229 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-utilities\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.274810 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-catalog-content\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.274825 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-utilities\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.294579 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dpk6z\" (UniqueName: \"kubernetes.io/projected/d5debc48-2b02-4d74-9873-1c689e65ce9c-kube-api-access-dpk6z\") pod \"certified-operators-s9lq9\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.350673 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:17 crc kubenswrapper[4756]: I1205 17:18:17.823842 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9lq9"] Dec 05 17:18:18 crc kubenswrapper[4756]: I1205 17:18:18.317513 4756 generic.go:334] "Generic (PLEG): container finished" podID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerID="9bc435ed6a98bc82b23248e64ef54d93f296f2526fa6575bf8062899c475989b" exitCode=0 Dec 05 17:18:18 crc kubenswrapper[4756]: I1205 17:18:18.317637 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9lq9" event={"ID":"d5debc48-2b02-4d74-9873-1c689e65ce9c","Type":"ContainerDied","Data":"9bc435ed6a98bc82b23248e64ef54d93f296f2526fa6575bf8062899c475989b"} Dec 05 17:18:18 crc kubenswrapper[4756]: I1205 17:18:18.317813 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9lq9" event={"ID":"d5debc48-2b02-4d74-9873-1c689e65ce9c","Type":"ContainerStarted","Data":"94382816b670cb1163a6b4f4a557b61c1c35a14e70a4549d85cdb8554b4b1487"} Dec 05 17:18:19 crc kubenswrapper[4756]: I1205 17:18:19.328502 4756 generic.go:334] "Generic (PLEG): container finished" podID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerID="ee685a3f1f06e265f0d1677a1a53a2cbf2613c31e4d7d144ef4f4072af66f2fb" exitCode=0 Dec 05 17:18:19 crc kubenswrapper[4756]: I1205 17:18:19.328595 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9lq9" event={"ID":"d5debc48-2b02-4d74-9873-1c689e65ce9c","Type":"ContainerDied","Data":"ee685a3f1f06e265f0d1677a1a53a2cbf2613c31e4d7d144ef4f4072af66f2fb"} Dec 05 17:18:20 crc kubenswrapper[4756]: I1205 17:18:20.340525 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9lq9" event={"ID":"d5debc48-2b02-4d74-9873-1c689e65ce9c","Type":"ContainerStarted","Data":"5b5ce65cd064f5f838a73a91d43403af2b2b58638ca5c96f4bdedfb9aceeec21"} Dec 05 17:18:20 crc kubenswrapper[4756]: I1205 17:18:20.362438 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s9lq9" podStartSLOduration=2.934092399 podStartE2EDuration="4.362399881s" podCreationTimestamp="2025-12-05 17:18:16 +0000 UTC" firstStartedPulling="2025-12-05 17:18:18.319427771 +0000 UTC m=+3758.558101283" lastFinishedPulling="2025-12-05 17:18:19.747735253 +0000 UTC m=+3759.986408765" observedRunningTime="2025-12-05 17:18:20.356383556 +0000 UTC m=+3760.595057068" watchObservedRunningTime="2025-12-05 17:18:20.362399881 +0000 UTC m=+3760.601073403" Dec 05 17:18:22 crc kubenswrapper[4756]: I1205 17:18:22.295527 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:18:22 crc kubenswrapper[4756]: I1205 17:18:22.295586 4756 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:18:22 crc kubenswrapper[4756]: I1205 17:18:22.295630 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 17:18:22 crc kubenswrapper[4756]: I1205 17:18:22.296257 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 17:18:22 crc kubenswrapper[4756]: I1205 17:18:22.296309 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" gracePeriod=600 Dec 05 17:18:22 crc kubenswrapper[4756]: E1205 17:18:22.445190 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.002906 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxp6w"] Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.004315 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.029403 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxp6w"] Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.154833 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlcgx\" (UniqueName: \"kubernetes.io/projected/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-kube-api-access-vlcgx\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.155162 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-utilities\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.155265 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-catalog-content\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.256044 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-utilities\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.256100 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-catalog-content\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.256134 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlcgx\" (UniqueName: \"kubernetes.io/projected/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-kube-api-access-vlcgx\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.257054 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-utilities\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.257091 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-catalog-content\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.276449 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vlcgx\" (UniqueName: \"kubernetes.io/projected/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-kube-api-access-vlcgx\") pod \"redhat-operators-cxp6w\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.332159 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.370455 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" exitCode=0 Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.370519 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc"} Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.370885 4756 scope.go:117] "RemoveContainer" containerID="fb829834883dcdc384034349ec5b152a5e50be349cace05fa772c48c6be30b43" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.371461 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:18:23 crc kubenswrapper[4756]: E1205 17:18:23.371797 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:18:23 crc kubenswrapper[4756]: I1205 17:18:23.798045 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxp6w"] Dec 05 17:18:23 crc kubenswrapper[4756]: W1205 17:18:23.802893 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d0d08d4_ad8e_4588_84c0_6aa46d5f56f1.slice/crio-5f4d21e33dc4cfa955cdda05c5519ce6311c05ccef3e3bd2e0f3b4d70e69f497 WatchSource:0}: Error finding container 5f4d21e33dc4cfa955cdda05c5519ce6311c05ccef3e3bd2e0f3b4d70e69f497: Status 404 returned error can't find the container with id 5f4d21e33dc4cfa955cdda05c5519ce6311c05ccef3e3bd2e0f3b4d70e69f497 Dec 05 17:18:24 crc kubenswrapper[4756]: I1205 17:18:24.380625 4756 generic.go:334] "Generic (PLEG): container finished" podID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerID="a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14" exitCode=0 Dec 05 17:18:24 crc kubenswrapper[4756]: I1205 17:18:24.380691 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxp6w" event={"ID":"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1","Type":"ContainerDied","Data":"a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14"} Dec 05 17:18:24 crc kubenswrapper[4756]: I1205 17:18:24.381106 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxp6w" event={"ID":"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1","Type":"ContainerStarted","Data":"5f4d21e33dc4cfa955cdda05c5519ce6311c05ccef3e3bd2e0f3b4d70e69f497"} Dec 05 17:18:25 crc kubenswrapper[4756]: I1205 
17:18:25.411554 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxp6w" event={"ID":"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1","Type":"ContainerStarted","Data":"1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d"} Dec 05 17:18:26 crc kubenswrapper[4756]: I1205 17:18:26.420409 4756 generic.go:334] "Generic (PLEG): container finished" podID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerID="1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d" exitCode=0 Dec 05 17:18:26 crc kubenswrapper[4756]: I1205 17:18:26.420480 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxp6w" event={"ID":"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1","Type":"ContainerDied","Data":"1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d"} Dec 05 17:18:27 crc kubenswrapper[4756]: I1205 17:18:27.351555 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:27 crc kubenswrapper[4756]: I1205 17:18:27.351885 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:27 crc kubenswrapper[4756]: I1205 17:18:27.393041 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:27 crc kubenswrapper[4756]: I1205 17:18:27.434080 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxp6w" event={"ID":"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1","Type":"ContainerStarted","Data":"59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43"} Dec 05 17:18:27 crc kubenswrapper[4756]: I1205 17:18:27.474320 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxp6w" podStartSLOduration=3.017087356 podStartE2EDuration="5.474296414s" podCreationTimestamp="2025-12-05 17:18:22 +0000 UTC" firstStartedPulling="2025-12-05 17:18:24.387696489 +0000 UTC m=+3764.626370031" lastFinishedPulling="2025-12-05 17:18:26.844905537 +0000 UTC m=+3767.083579089" observedRunningTime="2025-12-05 17:18:27.464580144 +0000 UTC m=+3767.703253696" watchObservedRunningTime="2025-12-05 17:18:27.474296414 +0000 UTC m=+3767.712969936" Dec 05 17:18:27 crc kubenswrapper[4756]: I1205 17:18:27.482736 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:29 crc kubenswrapper[4756]: I1205 17:18:29.815068 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9lq9"] Dec 05 17:18:29 crc kubenswrapper[4756]: I1205 17:18:29.815713 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s9lq9" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerName="registry-server" containerID="cri-o://5b5ce65cd064f5f838a73a91d43403af2b2b58638ca5c96f4bdedfb9aceeec21" gracePeriod=2 Dec 05 17:18:29 crc kubenswrapper[4756]: E1205 17:18:29.999773 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5debc48_2b02_4d74_9873_1c689e65ce9c.slice/crio-5b5ce65cd064f5f838a73a91d43403af2b2b58638ca5c96f4bdedfb9aceeec21.scope\": RecentStats: unable to find data in memory cache]" Dec 05 17:18:30 crc 
kubenswrapper[4756]: I1205 17:18:30.463703 4756 generic.go:334] "Generic (PLEG): container finished" podID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerID="5b5ce65cd064f5f838a73a91d43403af2b2b58638ca5c96f4bdedfb9aceeec21" exitCode=0 Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.463768 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9lq9" event={"ID":"d5debc48-2b02-4d74-9873-1c689e65ce9c","Type":"ContainerDied","Data":"5b5ce65cd064f5f838a73a91d43403af2b2b58638ca5c96f4bdedfb9aceeec21"} Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.716099 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.777443 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpk6z\" (UniqueName: \"kubernetes.io/projected/d5debc48-2b02-4d74-9873-1c689e65ce9c-kube-api-access-dpk6z\") pod \"d5debc48-2b02-4d74-9873-1c689e65ce9c\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.777512 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-utilities\") pod \"d5debc48-2b02-4d74-9873-1c689e65ce9c\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.777640 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-catalog-content\") pod \"d5debc48-2b02-4d74-9873-1c689e65ce9c\" (UID: \"d5debc48-2b02-4d74-9873-1c689e65ce9c\") " Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.778463 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-utilities" (OuterVolumeSpecName: "utilities") pod "d5debc48-2b02-4d74-9873-1c689e65ce9c" (UID: "d5debc48-2b02-4d74-9873-1c689e65ce9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.786278 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5debc48-2b02-4d74-9873-1c689e65ce9c-kube-api-access-dpk6z" (OuterVolumeSpecName: "kube-api-access-dpk6z") pod "d5debc48-2b02-4d74-9873-1c689e65ce9c" (UID: "d5debc48-2b02-4d74-9873-1c689e65ce9c"). InnerVolumeSpecName "kube-api-access-dpk6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.839516 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5debc48-2b02-4d74-9873-1c689e65ce9c" (UID: "d5debc48-2b02-4d74-9873-1c689e65ce9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.880130 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.880184 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5debc48-2b02-4d74-9873-1c689e65ce9c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:18:30 crc kubenswrapper[4756]: I1205 17:18:30.880197 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpk6z\" (UniqueName: \"kubernetes.io/projected/d5debc48-2b02-4d74-9873-1c689e65ce9c-kube-api-access-dpk6z\") on node \"crc\" DevicePath \"\"" Dec 05 17:18:31 crc kubenswrapper[4756]: I1205 17:18:31.475201 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9lq9" event={"ID":"d5debc48-2b02-4d74-9873-1c689e65ce9c","Type":"ContainerDied","Data":"94382816b670cb1163a6b4f4a557b61c1c35a14e70a4549d85cdb8554b4b1487"} Dec 05 17:18:31 crc kubenswrapper[4756]: I1205 17:18:31.475281 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9lq9" Dec 05 17:18:31 crc kubenswrapper[4756]: I1205 17:18:31.475541 4756 scope.go:117] "RemoveContainer" containerID="5b5ce65cd064f5f838a73a91d43403af2b2b58638ca5c96f4bdedfb9aceeec21" Dec 05 17:18:31 crc kubenswrapper[4756]: I1205 17:18:31.513494 4756 scope.go:117] "RemoveContainer" containerID="ee685a3f1f06e265f0d1677a1a53a2cbf2613c31e4d7d144ef4f4072af66f2fb" Dec 05 17:18:31 crc kubenswrapper[4756]: I1205 17:18:31.516749 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9lq9"] Dec 05 17:18:31 crc kubenswrapper[4756]: I1205 17:18:31.523925 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s9lq9"] Dec 05 17:18:31 crc kubenswrapper[4756]: I1205 17:18:31.552613 4756 scope.go:117] "RemoveContainer" containerID="9bc435ed6a98bc82b23248e64ef54d93f296f2526fa6575bf8062899c475989b" Dec 05 17:18:32 crc kubenswrapper[4756]: I1205 17:18:32.059858 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" path="/var/lib/kubelet/pods/d5debc48-2b02-4d74-9873-1c689e65ce9c/volumes" Dec 05 17:18:33 crc kubenswrapper[4756]: I1205 17:18:33.332913 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:33 crc kubenswrapper[4756]: I1205 17:18:33.334275 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:33 crc kubenswrapper[4756]: I1205 17:18:33.399554 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:33 crc kubenswrapper[4756]: I1205 17:18:33.545099 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:33 crc kubenswrapper[4756]: I1205 17:18:33.791882 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxp6w"] Dec 05 17:18:35 crc kubenswrapper[4756]: I1205 17:18:35.508471 4756 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-operators-cxp6w" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerName="registry-server" containerID="cri-o://59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43" gracePeriod=2 Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.418092 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.527234 4756 generic.go:334] "Generic (PLEG): container finished" podID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerID="59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43" exitCode=0 Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.527278 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxp6w" event={"ID":"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1","Type":"ContainerDied","Data":"59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43"} Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.527304 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxp6w" event={"ID":"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1","Type":"ContainerDied","Data":"5f4d21e33dc4cfa955cdda05c5519ce6311c05ccef3e3bd2e0f3b4d70e69f497"} Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.527321 4756 scope.go:117] "RemoveContainer" containerID="59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.527488 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxp6w" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.562643 4756 scope.go:117] "RemoveContainer" containerID="1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.580857 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlcgx\" (UniqueName: \"kubernetes.io/projected/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-kube-api-access-vlcgx\") pod \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.580944 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-utilities\") pod \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.580963 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-catalog-content\") pod \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\" (UID: \"3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1\") " Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.585667 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-utilities" (OuterVolumeSpecName: "utilities") pod "3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" (UID: "3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.606211 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-kube-api-access-vlcgx" (OuterVolumeSpecName: "kube-api-access-vlcgx") pod "3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" (UID: "3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1"). InnerVolumeSpecName "kube-api-access-vlcgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.611007 4756 scope.go:117] "RemoveContainer" containerID="a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.651632 4756 scope.go:117] "RemoveContainer" containerID="59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43" Dec 05 17:18:36 crc kubenswrapper[4756]: E1205 17:18:36.653835 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43\": container with ID starting with 59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43 not found: ID does not exist" containerID="59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.653898 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43"} err="failed to get container status \"59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43\": rpc error: code = NotFound desc = could not find container \"59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43\": container with ID starting with 59972a4ace51ddb8ec76f09f66e24d868ada4d8b34b3dc8fda77647c094d8e43 not found: ID does not exist" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.653927 4756 scope.go:117] "RemoveContainer" containerID="1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d" Dec 05 17:18:36 crc kubenswrapper[4756]: E1205 17:18:36.654430 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d\": container with ID starting with 1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d not found: ID does not exist" containerID="1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.654459 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d"} err="failed to get container status \"1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d\": rpc error: code = NotFound desc = could not find container \"1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d\": container with ID starting with 1eff3aa24b3cc2f16d626bea12e2f39623d058acf43d75c5a7180662fc00f30d not found: ID does not exist" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.654472 4756 scope.go:117] "RemoveContainer" containerID="a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14" Dec 05 17:18:36 crc kubenswrapper[4756]: E1205 17:18:36.654739 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14\": container with ID starting with a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14 not found: ID does not exist" containerID="a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.654758 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14"} err="failed to get container status \"a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14\": rpc error: code = NotFound desc = could not find container \"a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14\": container with ID starting with a931708d53cb8e5090091fce0d93e125b5df564fe56db02c069902128c24ce14 not found: ID does not exist" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.682784 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlcgx\" (UniqueName: \"kubernetes.io/projected/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-kube-api-access-vlcgx\") on node \"crc\" DevicePath \"\"" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.682834 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.715682 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" (UID: "3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.784107 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.857893 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxp6w"] Dec 05 17:18:36 crc kubenswrapper[4756]: I1205 17:18:36.864725 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cxp6w"] Dec 05 17:18:38 crc kubenswrapper[4756]: I1205 17:18:38.064228 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" path="/var/lib/kubelet/pods/3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1/volumes" Dec 05 17:18:39 crc kubenswrapper[4756]: I1205 17:18:39.049977 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:18:39 crc kubenswrapper[4756]: E1205 17:18:39.050447 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:18:54 crc kubenswrapper[4756]: I1205 17:18:54.049316 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:18:54 crc kubenswrapper[4756]: E1205 17:18:54.050110 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:19:09 crc kubenswrapper[4756]: I1205 17:19:09.050014 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:19:09 crc kubenswrapper[4756]: E1205 17:19:09.052373 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:19:23 crc kubenswrapper[4756]: I1205 17:19:23.050336 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:19:23 crc kubenswrapper[4756]: E1205 17:19:23.051110 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:19:37 crc kubenswrapper[4756]: I1205 17:19:37.049644 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:19:37 crc kubenswrapper[4756]: E1205 17:19:37.050363 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:19:50 crc kubenswrapper[4756]: I1205 17:19:50.060026 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:19:50 crc kubenswrapper[4756]: E1205 17:19:50.060801 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:20:04 crc kubenswrapper[4756]: I1205 17:20:04.049904 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:20:04 crc kubenswrapper[4756]: E1205 17:20:04.050668 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:20:16 crc kubenswrapper[4756]: I1205 17:20:16.050576 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:20:16 crc kubenswrapper[4756]: E1205 17:20:16.051406 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:20:28 crc kubenswrapper[4756]: I1205 17:20:28.049527 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:20:28 crc kubenswrapper[4756]: E1205 17:20:28.050268 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:20:39 crc kubenswrapper[4756]: I1205 17:20:39.050311 4756 
scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:20:39 crc kubenswrapper[4756]: E1205 17:20:39.051051 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:20:51 crc kubenswrapper[4756]: I1205 17:20:51.049937 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:20:51 crc kubenswrapper[4756]: E1205 17:20:51.050868 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:21:04 crc kubenswrapper[4756]: I1205 17:21:04.049525 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:21:04 crc kubenswrapper[4756]: E1205 17:21:04.050264 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:21:17 crc kubenswrapper[4756]: I1205 17:21:17.050505 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:21:17 crc kubenswrapper[4756]: E1205 17:21:17.051577 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:21:29 crc kubenswrapper[4756]: I1205 17:21:29.049707 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:21:29 crc kubenswrapper[4756]: E1205 17:21:29.050410 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:21:41 crc kubenswrapper[4756]: I1205 17:21:41.049931 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:21:41 crc kubenswrapper[4756]: E1205 17:21:41.052239 4756 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:21:52 crc kubenswrapper[4756]: I1205 17:21:52.050381 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:21:52 crc kubenswrapper[4756]: E1205 17:21:52.051723 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:22:06 crc kubenswrapper[4756]: I1205 17:22:06.049626 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:22:06 crc kubenswrapper[4756]: E1205 17:22:06.051567 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:22:20 crc kubenswrapper[4756]: I1205 17:22:20.053706 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:22:20 crc kubenswrapper[4756]: E1205 17:22:20.054542 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:22:31 crc kubenswrapper[4756]: I1205 17:22:31.049902 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:22:31 crc kubenswrapper[4756]: E1205 17:22:31.051998 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:22:43 crc kubenswrapper[4756]: I1205 17:22:43.049341 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:22:43 crc kubenswrapper[4756]: E1205 17:22:43.050034 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:22:56 crc kubenswrapper[4756]: I1205 17:22:56.050067 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:22:56 crc kubenswrapper[4756]: E1205 17:22:56.051176 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:23:11 crc kubenswrapper[4756]: I1205 17:23:11.049970 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:23:11 crc kubenswrapper[4756]: E1205 17:23:11.050856 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:23:25 crc kubenswrapper[4756]: I1205 17:23:25.050084 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:23:25 crc kubenswrapper[4756]: I1205 17:23:25.965168 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"559d99bb2b2903a0e0d08c03f3e2cdf229933fe33baa698fb9bf73438ee152f5"} Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.692732 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r6txr"] Dec 05 17:24:13 crc kubenswrapper[4756]: E1205 17:24:13.702756 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerName="extract-utilities" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.703069 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerName="extract-utilities" Dec 05 17:24:13 crc kubenswrapper[4756]: E1205 17:24:13.703166 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerName="extract-utilities" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.703239 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerName="extract-utilities" Dec 05 17:24:13 crc kubenswrapper[4756]: E1205 17:24:13.703321 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerName="extract-content" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.703403 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerName="extract-content" Dec 05 17:24:13 crc kubenswrapper[4756]: E1205 17:24:13.703519 4756 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerName="extract-content" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.703613 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerName="extract-content" Dec 05 17:24:13 crc kubenswrapper[4756]: E1205 17:24:13.703701 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerName="registry-server" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.703780 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerName="registry-server" Dec 05 17:24:13 crc kubenswrapper[4756]: E1205 17:24:13.703870 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerName="registry-server" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.703961 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerName="registry-server" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.704184 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d0d08d4-ad8e-4588-84c0-6aa46d5f56f1" containerName="registry-server" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.704334 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5debc48-2b02-4d74-9873-1c689e65ce9c" containerName="registry-server" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.705383 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6txr"] Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.705668 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.812558 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfqk\" (UniqueName: \"kubernetes.io/projected/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-kube-api-access-hlfqk\") pod \"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.813057 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-catalog-content\") pod \"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.813127 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-utilities\") pod \"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.914014 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfqk\" (UniqueName: \"kubernetes.io/projected/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-kube-api-access-hlfqk\") pod \"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.914085 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-catalog-content\") pod \"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.914108 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-utilities\") pod \"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.914594 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-utilities\") pod \"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.914868 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-catalog-content\") pod \"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:13 crc kubenswrapper[4756]: I1205 17:24:13.934596 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfqk\" (UniqueName: \"kubernetes.io/projected/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-kube-api-access-hlfqk\") pod 
\"redhat-marketplace-r6txr\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:14 crc kubenswrapper[4756]: I1205 17:24:14.069973 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:14 crc kubenswrapper[4756]: I1205 17:24:14.308894 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6txr"] Dec 05 17:24:14 crc kubenswrapper[4756]: I1205 17:24:14.362502 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6txr" event={"ID":"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e","Type":"ContainerStarted","Data":"bdddd9c48446fe856dd3d812c69bdc06ffa00bb43d28ff1ef78d5045fb9970a0"} Dec 05 17:24:15 crc kubenswrapper[4756]: I1205 17:24:15.372502 4756 generic.go:334] "Generic (PLEG): container finished" podID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerID="fd7fdcf38c7cc89579fcf8c9620d9a7c5b1a8a5f887b3a8e2b8e7a5108ee7d37" exitCode=0 Dec 05 17:24:15 crc kubenswrapper[4756]: I1205 17:24:15.372593 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6txr" event={"ID":"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e","Type":"ContainerDied","Data":"fd7fdcf38c7cc89579fcf8c9620d9a7c5b1a8a5f887b3a8e2b8e7a5108ee7d37"} Dec 05 17:24:15 crc kubenswrapper[4756]: I1205 17:24:15.375369 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 17:24:16 crc kubenswrapper[4756]: I1205 17:24:16.391964 4756 generic.go:334] "Generic (PLEG): container finished" podID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerID="a1b5ffc2d1b768a6b969881599ad8541cd80155de7c14330e1eb3e9ce527e565" exitCode=0 Dec 05 17:24:16 crc kubenswrapper[4756]: I1205 17:24:16.392074 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6txr" event={"ID":"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e","Type":"ContainerDied","Data":"a1b5ffc2d1b768a6b969881599ad8541cd80155de7c14330e1eb3e9ce527e565"} Dec 05 17:24:17 crc kubenswrapper[4756]: I1205 17:24:17.403090 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6txr" event={"ID":"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e","Type":"ContainerStarted","Data":"1c5b833cb84649bb773df30a888e9b1001a03c2bb605400273692728973e0c82"} Dec 05 17:24:17 crc kubenswrapper[4756]: I1205 17:24:17.428852 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r6txr" podStartSLOduration=3.023654537 podStartE2EDuration="4.428836264s" podCreationTimestamp="2025-12-05 17:24:13 +0000 UTC" firstStartedPulling="2025-12-05 17:24:15.374795773 +0000 UTC m=+4115.613469285" lastFinishedPulling="2025-12-05 17:24:16.7799775 +0000 UTC m=+4117.018651012" observedRunningTime="2025-12-05 17:24:17.423190231 +0000 UTC m=+4117.661863743" watchObservedRunningTime="2025-12-05 17:24:17.428836264 +0000 UTC m=+4117.667509776" Dec 05 17:24:24 crc kubenswrapper[4756]: I1205 17:24:24.070653 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:24 crc kubenswrapper[4756]: I1205 17:24:24.071647 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:24 crc kubenswrapper[4756]: I1205 17:24:24.199498 4756 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:24 crc kubenswrapper[4756]: I1205 17:24:24.522309 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:24 crc kubenswrapper[4756]: I1205 17:24:24.591182 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6txr"] Dec 05 17:24:26 crc kubenswrapper[4756]: I1205 17:24:26.495527 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r6txr" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerName="registry-server" containerID="cri-o://1c5b833cb84649bb773df30a888e9b1001a03c2bb605400273692728973e0c82" gracePeriod=2 Dec 05 17:24:27 crc kubenswrapper[4756]: I1205 17:24:27.512936 4756 generic.go:334] "Generic (PLEG): container finished" podID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerID="1c5b833cb84649bb773df30a888e9b1001a03c2bb605400273692728973e0c82" exitCode=0 Dec 05 17:24:27 crc kubenswrapper[4756]: I1205 17:24:27.513030 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6txr" event={"ID":"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e","Type":"ContainerDied","Data":"1c5b833cb84649bb773df30a888e9b1001a03c2bb605400273692728973e0c82"} Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.083144 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.194207 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-utilities\") pod \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.194297 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-catalog-content\") pod \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.194324 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlfqk\" (UniqueName: \"kubernetes.io/projected/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-kube-api-access-hlfqk\") pod \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\" (UID: \"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e\") " Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.195533 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-utilities" (OuterVolumeSpecName: "utilities") pod "98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" (UID: "98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.201391 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-kube-api-access-hlfqk" (OuterVolumeSpecName: "kube-api-access-hlfqk") pod "98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" (UID: "98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e"). InnerVolumeSpecName "kube-api-access-hlfqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.214330 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" (UID: "98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.295813 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.295847 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlfqk\" (UniqueName: \"kubernetes.io/projected/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-kube-api-access-hlfqk\") on node \"crc\" DevicePath \"\"" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.295857 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.526151 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r6txr" event={"ID":"98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e","Type":"ContainerDied","Data":"bdddd9c48446fe856dd3d812c69bdc06ffa00bb43d28ff1ef78d5045fb9970a0"} Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.526217 4756 scope.go:117] "RemoveContainer" containerID="1c5b833cb84649bb773df30a888e9b1001a03c2bb605400273692728973e0c82" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.526223 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r6txr" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.549159 4756 scope.go:117] "RemoveContainer" containerID="a1b5ffc2d1b768a6b969881599ad8541cd80155de7c14330e1eb3e9ce527e565" Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.557982 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6txr"] Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.567304 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r6txr"] Dec 05 17:24:28 crc kubenswrapper[4756]: I1205 17:24:28.576739 4756 scope.go:117] "RemoveContainer" containerID="fd7fdcf38c7cc89579fcf8c9620d9a7c5b1a8a5f887b3a8e2b8e7a5108ee7d37" Dec 05 17:24:30 crc kubenswrapper[4756]: I1205 17:24:30.061920 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" path="/var/lib/kubelet/pods/98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e/volumes" Dec 05 17:25:52 crc kubenswrapper[4756]: I1205 17:25:52.295315 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:25:52 crc kubenswrapper[4756]: I1205 17:25:52.295882 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:26:22 crc kubenswrapper[4756]: I1205 17:26:22.295402 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:26:22 crc kubenswrapper[4756]: I1205 17:26:22.296014 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.295400 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.295943 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.295986 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.296435 4756 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"559d99bb2b2903a0e0d08c03f3e2cdf229933fe33baa698fb9bf73438ee152f5"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.296489 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://559d99bb2b2903a0e0d08c03f3e2cdf229933fe33baa698fb9bf73438ee152f5" gracePeriod=600 Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.777004 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="559d99bb2b2903a0e0d08c03f3e2cdf229933fe33baa698fb9bf73438ee152f5" exitCode=0 Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.777109 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"559d99bb2b2903a0e0d08c03f3e2cdf229933fe33baa698fb9bf73438ee152f5"} Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.777588 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e"} Dec 05 17:26:52 crc kubenswrapper[4756]: I1205 17:26:52.777612 4756 scope.go:117] "RemoveContainer" containerID="ac8114e29e03f74b87c3bbc2bd35e0ede9b016d14878a52f18289c6c255820fc" Dec 05 17:28:52 crc kubenswrapper[4756]: I1205 17:28:52.295241 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:28:52 crc kubenswrapper[4756]: I1205 17:28:52.295811 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.823136 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qs4t5"] Dec 05 17:29:06 crc kubenswrapper[4756]: E1205 17:29:06.824027 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerName="extract-content" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.824041 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerName="extract-content" Dec 05 17:29:06 crc kubenswrapper[4756]: E1205 17:29:06.824061 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerName="extract-utilities" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.824067 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" 
containerName="extract-utilities" Dec 05 17:29:06 crc kubenswrapper[4756]: E1205 17:29:06.824080 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerName="registry-server" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.824085 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerName="registry-server" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.824241 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d9b9d7-5e5d-4ef7-9ba8-780ae845f12e" containerName="registry-server" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.825395 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.848579 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qs4t5"] Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.864990 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-utilities\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.865066 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-catalog-content\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.865124 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccxbz\" (UniqueName: \"kubernetes.io/projected/03b03cf4-0745-404e-81d2-33a579791c19-kube-api-access-ccxbz\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.966642 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-utilities\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.966734 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-catalog-content\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.966809 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccxbz\" (UniqueName: \"kubernetes.io/projected/03b03cf4-0745-404e-81d2-33a579791c19-kube-api-access-ccxbz\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.967321 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-catalog-content\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.967810 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-utilities\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:06 crc kubenswrapper[4756]: I1205 17:29:06.985200 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccxbz\" (UniqueName: \"kubernetes.io/projected/03b03cf4-0745-404e-81d2-33a579791c19-kube-api-access-ccxbz\") pod \"certified-operators-qs4t5\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:07 crc kubenswrapper[4756]: I1205 17:29:07.154113 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:07 crc kubenswrapper[4756]: I1205 17:29:07.652651 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qs4t5"] Dec 05 17:29:08 crc kubenswrapper[4756]: I1205 17:29:08.530171 4756 generic.go:334] "Generic (PLEG): container finished" podID="03b03cf4-0745-404e-81d2-33a579791c19" containerID="a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1" exitCode=0 Dec 05 17:29:08 crc kubenswrapper[4756]: I1205 17:29:08.530264 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qs4t5" event={"ID":"03b03cf4-0745-404e-81d2-33a579791c19","Type":"ContainerDied","Data":"a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1"} Dec 05 17:29:08 crc kubenswrapper[4756]: I1205 17:29:08.530480 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qs4t5" event={"ID":"03b03cf4-0745-404e-81d2-33a579791c19","Type":"ContainerStarted","Data":"c6c14ca1915022491dfb8cc172a7a6ea87fd1df39e366e1b35087ed515e6eba4"} Dec 05 17:29:09 crc kubenswrapper[4756]: I1205 17:29:09.542246 4756 generic.go:334] "Generic (PLEG): container finished" podID="03b03cf4-0745-404e-81d2-33a579791c19" containerID="d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092" exitCode=0 Dec 05 17:29:09 crc kubenswrapper[4756]: I1205 17:29:09.542477 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qs4t5" event={"ID":"03b03cf4-0745-404e-81d2-33a579791c19","Type":"ContainerDied","Data":"d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092"} Dec 05 17:29:10 crc kubenswrapper[4756]: I1205 17:29:10.552012 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qs4t5" event={"ID":"03b03cf4-0745-404e-81d2-33a579791c19","Type":"ContainerStarted","Data":"af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52"} Dec 05 17:29:10 crc kubenswrapper[4756]: I1205 17:29:10.580353 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qs4t5" podStartSLOduration=3.128371503 podStartE2EDuration="4.580324777s" podCreationTimestamp="2025-12-05 17:29:06 +0000 UTC" 
firstStartedPulling="2025-12-05 17:29:08.531873898 +0000 UTC m=+4408.770547410" lastFinishedPulling="2025-12-05 17:29:09.983827172 +0000 UTC m=+4410.222500684" observedRunningTime="2025-12-05 17:29:10.57138525 +0000 UTC m=+4410.810058782" watchObservedRunningTime="2025-12-05 17:29:10.580324777 +0000 UTC m=+4410.818998319" Dec 05 17:29:17 crc kubenswrapper[4756]: I1205 17:29:17.154973 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:17 crc kubenswrapper[4756]: I1205 17:29:17.157187 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:17 crc kubenswrapper[4756]: I1205 17:29:17.219294 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:17 crc kubenswrapper[4756]: I1205 17:29:17.662004 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:17 crc kubenswrapper[4756]: I1205 17:29:17.713789 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qs4t5"] Dec 05 17:29:19 crc kubenswrapper[4756]: I1205 17:29:19.635005 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qs4t5" podUID="03b03cf4-0745-404e-81d2-33a579791c19" containerName="registry-server" containerID="cri-o://af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52" gracePeriod=2 Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.636818 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.641875 4756 generic.go:334] "Generic (PLEG): container finished" podID="03b03cf4-0745-404e-81d2-33a579791c19" containerID="af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52" exitCode=0 Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.641915 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qs4t5" event={"ID":"03b03cf4-0745-404e-81d2-33a579791c19","Type":"ContainerDied","Data":"af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52"} Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.641939 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qs4t5" event={"ID":"03b03cf4-0745-404e-81d2-33a579791c19","Type":"ContainerDied","Data":"c6c14ca1915022491dfb8cc172a7a6ea87fd1df39e366e1b35087ed515e6eba4"} Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.641947 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qs4t5" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.641955 4756 scope.go:117] "RemoveContainer" containerID="af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.685542 4756 scope.go:117] "RemoveContainer" containerID="d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.686755 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccxbz\" (UniqueName: \"kubernetes.io/projected/03b03cf4-0745-404e-81d2-33a579791c19-kube-api-access-ccxbz\") pod \"03b03cf4-0745-404e-81d2-33a579791c19\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.686882 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-utilities\") pod \"03b03cf4-0745-404e-81d2-33a579791c19\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.687901 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-utilities" (OuterVolumeSpecName: "utilities") pod "03b03cf4-0745-404e-81d2-33a579791c19" (UID: "03b03cf4-0745-404e-81d2-33a579791c19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.697712 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b03cf4-0745-404e-81d2-33a579791c19-kube-api-access-ccxbz" (OuterVolumeSpecName: "kube-api-access-ccxbz") pod "03b03cf4-0745-404e-81d2-33a579791c19" (UID: "03b03cf4-0745-404e-81d2-33a579791c19"). InnerVolumeSpecName "kube-api-access-ccxbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.709688 4756 scope.go:117] "RemoveContainer" containerID="a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.772212 4756 scope.go:117] "RemoveContainer" containerID="af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52" Dec 05 17:29:20 crc kubenswrapper[4756]: E1205 17:29:20.772623 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52\": container with ID starting with af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52 not found: ID does not exist" containerID="af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.772682 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52"} err="failed to get container status \"af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52\": rpc error: code = NotFound desc = could not find container \"af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52\": container with ID starting with af8d215ebbac8fd407afeb16538aeb9d13e0f7b6041861cd3cffa12c699d4d52 not found: ID does not exist" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.772717 4756 scope.go:117] "RemoveContainer" containerID="d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092" Dec 05 17:29:20 crc kubenswrapper[4756]: E1205 17:29:20.773008 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092\": container with ID starting with d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092 not found: ID does not exist" containerID="d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.773050 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092"} err="failed to get container status \"d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092\": rpc error: code = NotFound desc = could not find container \"d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092\": container with ID starting with d449a01bae6abfaf85b00b324ea9684266505237b11caf4e3a97f42830c90092 not found: ID does not exist" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.773064 4756 scope.go:117] "RemoveContainer" containerID="a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1" Dec 05 17:29:20 crc kubenswrapper[4756]: E1205 17:29:20.773450 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1\": container with ID starting with a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1 not found: ID does not exist" containerID="a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.773475 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1"} err="failed to get container status \"a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1\": rpc error: code = NotFound desc = could not find container \"a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1\": container with ID starting with a28769eb35281ef6aa48a85e0e5dac3dc45d481f558107a0e8ff3c6f5662cca1 not found: ID does not exist" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.788114 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-catalog-content\") pod \"03b03cf4-0745-404e-81d2-33a579791c19\" (UID: \"03b03cf4-0745-404e-81d2-33a579791c19\") " Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.788999 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccxbz\" (UniqueName: \"kubernetes.io/projected/03b03cf4-0745-404e-81d2-33a579791c19-kube-api-access-ccxbz\") on node \"crc\" DevicePath \"\"" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.789042 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.837539 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03b03cf4-0745-404e-81d2-33a579791c19" (UID: "03b03cf4-0745-404e-81d2-33a579791c19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.890107 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03b03cf4-0745-404e-81d2-33a579791c19-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.977995 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qs4t5"] Dec 05 17:29:20 crc kubenswrapper[4756]: I1205 17:29:20.983673 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qs4t5"] Dec 05 17:29:22 crc kubenswrapper[4756]: I1205 17:29:22.057407 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b03cf4-0745-404e-81d2-33a579791c19" path="/var/lib/kubelet/pods/03b03cf4-0745-404e-81d2-33a579791c19/volumes" Dec 05 17:29:22 crc kubenswrapper[4756]: I1205 17:29:22.295584 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:29:22 crc kubenswrapper[4756]: I1205 17:29:22.295674 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.766835 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-gjzd8"] Dec 05 17:29:27 crc kubenswrapper[4756]: E1205 17:29:27.768245 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b03cf4-0745-404e-81d2-33a579791c19" containerName="extract-utilities" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.768277 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b03cf4-0745-404e-81d2-33a579791c19" containerName="extract-utilities" Dec 05 17:29:27 crc kubenswrapper[4756]: E1205 17:29:27.768352 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b03cf4-0745-404e-81d2-33a579791c19" containerName="registry-server" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.768366 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b03cf4-0745-404e-81d2-33a579791c19" containerName="registry-server" Dec 05 17:29:27 crc kubenswrapper[4756]: E1205 17:29:27.768395 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b03cf4-0745-404e-81d2-33a579791c19" containerName="extract-content" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.768407 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b03cf4-0745-404e-81d2-33a579791c19" containerName="extract-content" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.768731 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b03cf4-0745-404e-81d2-33a579791c19" containerName="registry-server" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.771063 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.782690 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjzd8"] Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.880839 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-utilities\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.881028 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-catalog-content\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.881514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9j5s\" (UniqueName: \"kubernetes.io/projected/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-kube-api-access-h9j5s\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.982501 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-utilities\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.982564 4756 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-catalog-content\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.982694 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9j5s\" (UniqueName: \"kubernetes.io/projected/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-kube-api-access-h9j5s\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.983047 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-utilities\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:27 crc kubenswrapper[4756]: I1205 17:29:27.983520 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-catalog-content\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:28 crc kubenswrapper[4756]: I1205 17:29:28.001518 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9j5s\" (UniqueName: \"kubernetes.io/projected/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-kube-api-access-h9j5s\") pod \"redhat-operators-gjzd8\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:28 crc kubenswrapper[4756]: I1205 17:29:28.104452 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:28 crc kubenswrapper[4756]: I1205 17:29:28.519294 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjzd8"] Dec 05 17:29:28 crc kubenswrapper[4756]: I1205 17:29:28.709371 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzd8" event={"ID":"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91","Type":"ContainerStarted","Data":"4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b"} Dec 05 17:29:28 crc kubenswrapper[4756]: I1205 17:29:28.710120 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzd8" event={"ID":"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91","Type":"ContainerStarted","Data":"e743943f6d5c861300ff9db91fc53da8883ba663a9fbbcc49afd8b4c36a919b5"} Dec 05 17:29:29 crc kubenswrapper[4756]: I1205 17:29:29.718744 4756 generic.go:334] "Generic (PLEG): container finished" podID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerID="4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b" exitCode=0 Dec 05 17:29:29 crc kubenswrapper[4756]: I1205 17:29:29.718807 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzd8" event={"ID":"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91","Type":"ContainerDied","Data":"4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b"} Dec 05 17:29:29 crc kubenswrapper[4756]: I1205 17:29:29.722176 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 17:29:30 crc kubenswrapper[4756]: I1205 17:29:30.728921 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzd8" event={"ID":"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91","Type":"ContainerStarted","Data":"c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe"} Dec 05 17:29:31 crc kubenswrapper[4756]: I1205 17:29:31.746848 4756 generic.go:334] "Generic (PLEG): container finished" podID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerID="c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe" exitCode=0 Dec 05 17:29:31 crc kubenswrapper[4756]: I1205 17:29:31.746923 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzd8" event={"ID":"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91","Type":"ContainerDied","Data":"c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe"} Dec 05 17:29:32 crc kubenswrapper[4756]: I1205 17:29:32.758126 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzd8" event={"ID":"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91","Type":"ContainerStarted","Data":"191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055"} Dec 05 17:29:32 crc kubenswrapper[4756]: I1205 17:29:32.787587 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjzd8" podStartSLOduration=3.332821413 podStartE2EDuration="5.787564829s" podCreationTimestamp="2025-12-05 17:29:27 +0000 UTC" firstStartedPulling="2025-12-05 17:29:29.721870151 +0000 UTC m=+4429.960543663" lastFinishedPulling="2025-12-05 17:29:32.176613527 +0000 UTC m=+4432.415287079" observedRunningTime="2025-12-05 17:29:32.782363887 +0000 UTC m=+4433.021037399" watchObservedRunningTime="2025-12-05 17:29:32.787564829 +0000 UTC m=+4433.026238351" Dec 05 17:29:38 crc kubenswrapper[4756]: I1205 17:29:38.105213 4756 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:38 crc kubenswrapper[4756]: I1205 17:29:38.106049 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:38 crc kubenswrapper[4756]: I1205 17:29:38.175331 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:38 crc kubenswrapper[4756]: I1205 17:29:38.877066 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:38 crc kubenswrapper[4756]: I1205 17:29:38.929590 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjzd8"] Dec 05 17:29:40 crc kubenswrapper[4756]: I1205 17:29:40.838933 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjzd8" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerName="registry-server" containerID="cri-o://191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055" gracePeriod=2 Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.318102 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.491334 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-utilities\") pod \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.491532 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-catalog-content\") pod \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.491681 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9j5s\" (UniqueName: \"kubernetes.io/projected/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-kube-api-access-h9j5s\") pod \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\" (UID: \"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91\") " Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.492841 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-utilities" (OuterVolumeSpecName: "utilities") pod "ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" (UID: "ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.503769 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-kube-api-access-h9j5s" (OuterVolumeSpecName: "kube-api-access-h9j5s") pod "ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" (UID: "ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91"). InnerVolumeSpecName "kube-api-access-h9j5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.592982 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9j5s\" (UniqueName: \"kubernetes.io/projected/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-kube-api-access-h9j5s\") on node \"crc\" DevicePath \"\"" Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.593022 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.851495 4756 generic.go:334] "Generic (PLEG): container finished" podID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerID="191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055" exitCode=0 Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.851541 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzd8" event={"ID":"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91","Type":"ContainerDied","Data":"191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055"} Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.851568 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzd8" event={"ID":"ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91","Type":"ContainerDied","Data":"e743943f6d5c861300ff9db91fc53da8883ba663a9fbbcc49afd8b4c36a919b5"} Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.851553 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjzd8" Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.851587 4756 scope.go:117] "RemoveContainer" containerID="191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055" Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.882411 4756 scope.go:117] "RemoveContainer" containerID="c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe" Dec 05 17:29:41 crc kubenswrapper[4756]: I1205 17:29:41.952671 4756 scope.go:117] "RemoveContainer" containerID="4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.062761 4756 scope.go:117] "RemoveContainer" containerID="191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055" Dec 05 17:29:42 crc kubenswrapper[4756]: E1205 17:29:42.063174 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055\": container with ID starting with 191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055 not found: ID does not exist" containerID="191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.063214 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055"} err="failed to get container status \"191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055\": rpc error: code = NotFound desc = could not find container \"191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055\": container with ID starting with 191ec6ac05fbe1ee9d63279dafc296f301a0c04f1d100a0dc7721257ecc63055 not found: ID does not exist" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.063241 4756 scope.go:117] 
"RemoveContainer" containerID="c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe" Dec 05 17:29:42 crc kubenswrapper[4756]: E1205 17:29:42.063640 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe\": container with ID starting with c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe not found: ID does not exist" containerID="c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.063668 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe"} err="failed to get container status \"c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe\": rpc error: code = NotFound desc = could not find container \"c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe\": container with ID starting with c7ad91abebdbc7c2b5d63b615e46be81cfa585406d322cd4e10014f3e74e7abe not found: ID does not exist" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.063684 4756 scope.go:117] "RemoveContainer" containerID="4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b" Dec 05 17:29:42 crc kubenswrapper[4756]: E1205 17:29:42.064085 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b\": container with ID starting with 4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b not found: ID does not exist" containerID="4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.064167 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b"} err="failed to get container status \"4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b\": rpc error: code = NotFound desc = could not find container \"4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b\": container with ID starting with 4049982241cd6870310d10b602220362167006fe1bfc23711c711a993acaa19b not found: ID does not exist" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.552106 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" (UID: "ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.609145 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.794144 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjzd8"] Dec 05 17:29:42 crc kubenswrapper[4756]: I1205 17:29:42.801362 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjzd8"] Dec 05 17:29:44 crc kubenswrapper[4756]: I1205 17:29:44.061904 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" path="/var/lib/kubelet/pods/ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91/volumes" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.244047 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mv4pq"] Dec 05 17:29:49 crc kubenswrapper[4756]: E1205 17:29:49.249196 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerName="extract-content" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.249361 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerName="extract-content" Dec 05 17:29:49 crc kubenswrapper[4756]: E1205 17:29:49.257801 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerName="registry-server" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.258144 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerName="registry-server" Dec 05 17:29:49 crc kubenswrapper[4756]: E1205 17:29:49.258297 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerName="extract-utilities" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.258473 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerName="extract-utilities" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.259000 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4bbc76-6a86-4ec6-8ee7-e6859f6baf91" containerName="registry-server" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.271665 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mv4pq"] Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.271858 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.276917 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8hfn\" (UniqueName: \"kubernetes.io/projected/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-kube-api-access-b8hfn\") pod \"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.277119 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-catalog-content\") pod \"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.277263 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-utilities\") pod \"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.378571 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-catalog-content\") pod \"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.378969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-utilities\") pod \"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.379166 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8hfn\" (UniqueName: \"kubernetes.io/projected/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-kube-api-access-b8hfn\") pod \"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.379365 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-utilities\") pod \"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.379591 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-catalog-content\") pod \"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.405835 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8hfn\" (UniqueName: \"kubernetes.io/projected/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-kube-api-access-b8hfn\") pod 
\"community-operators-mv4pq\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:49 crc kubenswrapper[4756]: I1205 17:29:49.593816 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:50 crc kubenswrapper[4756]: I1205 17:29:50.078311 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mv4pq"] Dec 05 17:29:50 crc kubenswrapper[4756]: I1205 17:29:50.920625 4756 generic.go:334] "Generic (PLEG): container finished" podID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerID="ef4d890a9daaf578cb0949d41f0de5b4772ed1cab47eaf303a260d71affed1ef" exitCode=0 Dec 05 17:29:50 crc kubenswrapper[4756]: I1205 17:29:50.920680 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv4pq" event={"ID":"c2a3514d-c642-4e86-a9fa-dc27fa6bca81","Type":"ContainerDied","Data":"ef4d890a9daaf578cb0949d41f0de5b4772ed1cab47eaf303a260d71affed1ef"} Dec 05 17:29:50 crc kubenswrapper[4756]: I1205 17:29:50.920716 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv4pq" event={"ID":"c2a3514d-c642-4e86-a9fa-dc27fa6bca81","Type":"ContainerStarted","Data":"c2b10663ae97956e9b7fa7cdc38af5837492bd23fa0b5971a5b615653cb203c4"} Dec 05 17:29:51 crc kubenswrapper[4756]: I1205 17:29:51.930855 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv4pq" event={"ID":"c2a3514d-c642-4e86-a9fa-dc27fa6bca81","Type":"ContainerStarted","Data":"f5b7486e4c867bf5c23e5e51e2b818cde66b7640e82901d86e47858e76456383"} Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.295483 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.295562 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.295618 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.296320 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.296394 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" gracePeriod=600 Dec 05 17:29:52 crc kubenswrapper[4756]: E1205 
17:29:52.430030 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.940992 4756 generic.go:334] "Generic (PLEG): container finished" podID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerID="f5b7486e4c867bf5c23e5e51e2b818cde66b7640e82901d86e47858e76456383" exitCode=0 Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.941073 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv4pq" event={"ID":"c2a3514d-c642-4e86-a9fa-dc27fa6bca81","Type":"ContainerDied","Data":"f5b7486e4c867bf5c23e5e51e2b818cde66b7640e82901d86e47858e76456383"} Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.944665 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" exitCode=0 Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.944714 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e"} Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.944749 4756 scope.go:117] "RemoveContainer" containerID="559d99bb2b2903a0e0d08c03f3e2cdf229933fe33baa698fb9bf73438ee152f5" Dec 05 17:29:52 crc kubenswrapper[4756]: I1205 17:29:52.945373 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:29:52 crc kubenswrapper[4756]: E1205 17:29:52.945701 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:29:53 crc kubenswrapper[4756]: I1205 17:29:53.955057 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv4pq" event={"ID":"c2a3514d-c642-4e86-a9fa-dc27fa6bca81","Type":"ContainerStarted","Data":"f406ad7327ed728f91d2ed9ae789e7b45ce88012aa35ba1292404f5841f44b3a"} Dec 05 17:29:53 crc kubenswrapper[4756]: I1205 17:29:53.981622 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mv4pq" podStartSLOduration=2.582283528 podStartE2EDuration="4.981601805s" podCreationTimestamp="2025-12-05 17:29:49 +0000 UTC" firstStartedPulling="2025-12-05 17:29:50.923001709 +0000 UTC m=+4451.161675231" lastFinishedPulling="2025-12-05 17:29:53.322319996 +0000 UTC m=+4453.560993508" observedRunningTime="2025-12-05 17:29:53.972612257 +0000 UTC m=+4454.211285779" watchObservedRunningTime="2025-12-05 17:29:53.981601805 +0000 UTC m=+4454.220275337" Dec 05 17:29:59 crc kubenswrapper[4756]: I1205 17:29:59.594961 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:59 crc kubenswrapper[4756]: I1205 17:29:59.595278 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:29:59 crc kubenswrapper[4756]: I1205 17:29:59.634772 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.065950 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.181293 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp"] Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.182492 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.184895 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.184901 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.192136 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp"] Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.230459 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlgxt\" (UniqueName: \"kubernetes.io/projected/10262f71-e744-4bd2-9a0a-0645feca3a76-kube-api-access-vlgxt\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.230584 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10262f71-e744-4bd2-9a0a-0645feca3a76-config-volume\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.230613 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10262f71-e744-4bd2-9a0a-0645feca3a76-secret-volume\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.331614 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10262f71-e744-4bd2-9a0a-0645feca3a76-config-volume\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.331872 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/10262f71-e744-4bd2-9a0a-0645feca3a76-secret-volume\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.331932 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlgxt\" (UniqueName: \"kubernetes.io/projected/10262f71-e744-4bd2-9a0a-0645feca3a76-kube-api-access-vlgxt\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.332434 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10262f71-e744-4bd2-9a0a-0645feca3a76-config-volume\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.347398 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10262f71-e744-4bd2-9a0a-0645feca3a76-secret-volume\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.347567 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlgxt\" (UniqueName: \"kubernetes.io/projected/10262f71-e744-4bd2-9a0a-0645feca3a76-kube-api-access-vlgxt\") pod \"collect-profiles-29415930-nzhkp\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.514258 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.866874 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mv4pq"] Dec 05 17:30:00 crc kubenswrapper[4756]: I1205 17:30:00.934251 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp"] Dec 05 17:30:01 crc kubenswrapper[4756]: I1205 17:30:01.015329 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" event={"ID":"10262f71-e744-4bd2-9a0a-0645feca3a76","Type":"ContainerStarted","Data":"c2ecaed0c7c7d9d4d1b1ce7a0d715922fc771dbbd2fc2b1caa15624036b6794a"} Dec 05 17:30:02 crc kubenswrapper[4756]: I1205 17:30:02.024120 4756 generic.go:334] "Generic (PLEG): container finished" podID="10262f71-e744-4bd2-9a0a-0645feca3a76" containerID="19646591add41f3b387f29aa12e936e59ae3ed7296ea1fa7a249eb04af334055" exitCode=0 Dec 05 17:30:02 crc kubenswrapper[4756]: I1205 17:30:02.024237 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" event={"ID":"10262f71-e744-4bd2-9a0a-0645feca3a76","Type":"ContainerDied","Data":"19646591add41f3b387f29aa12e936e59ae3ed7296ea1fa7a249eb04af334055"} Dec 05 17:30:02 crc kubenswrapper[4756]: I1205 17:30:02.024718 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mv4pq" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerName="registry-server" containerID="cri-o://f406ad7327ed728f91d2ed9ae789e7b45ce88012aa35ba1292404f5841f44b3a" gracePeriod=2 Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.039816 4756 generic.go:334] "Generic (PLEG): container finished" podID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerID="f406ad7327ed728f91d2ed9ae789e7b45ce88012aa35ba1292404f5841f44b3a" exitCode=0 Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.039908 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv4pq" event={"ID":"c2a3514d-c642-4e86-a9fa-dc27fa6bca81","Type":"ContainerDied","Data":"f406ad7327ed728f91d2ed9ae789e7b45ce88012aa35ba1292404f5841f44b3a"} Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.039998 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv4pq" event={"ID":"c2a3514d-c642-4e86-a9fa-dc27fa6bca81","Type":"ContainerDied","Data":"c2b10663ae97956e9b7fa7cdc38af5837492bd23fa0b5971a5b615653cb203c4"} Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.040020 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b10663ae97956e9b7fa7cdc38af5837492bd23fa0b5971a5b615653cb203c4" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.045064 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.181398 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8hfn\" (UniqueName: \"kubernetes.io/projected/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-kube-api-access-b8hfn\") pod \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.182790 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-catalog-content\") pod \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.182907 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-utilities\") pod \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\" (UID: \"c2a3514d-c642-4e86-a9fa-dc27fa6bca81\") " Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.183774 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-utilities" (OuterVolumeSpecName: "utilities") pod "c2a3514d-c642-4e86-a9fa-dc27fa6bca81" (UID: "c2a3514d-c642-4e86-a9fa-dc27fa6bca81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.201941 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-kube-api-access-b8hfn" (OuterVolumeSpecName: "kube-api-access-b8hfn") pod "c2a3514d-c642-4e86-a9fa-dc27fa6bca81" (UID: "c2a3514d-c642-4e86-a9fa-dc27fa6bca81"). InnerVolumeSpecName "kube-api-access-b8hfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.303800 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8hfn\" (UniqueName: \"kubernetes.io/projected/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-kube-api-access-b8hfn\") on node \"crc\" DevicePath \"\"" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.303834 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.342759 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2a3514d-c642-4e86-a9fa-dc27fa6bca81" (UID: "c2a3514d-c642-4e86-a9fa-dc27fa6bca81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.405323 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a3514d-c642-4e86-a9fa-dc27fa6bca81-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.618826 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.810189 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10262f71-e744-4bd2-9a0a-0645feca3a76-secret-volume\") pod \"10262f71-e744-4bd2-9a0a-0645feca3a76\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.810282 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlgxt\" (UniqueName: \"kubernetes.io/projected/10262f71-e744-4bd2-9a0a-0645feca3a76-kube-api-access-vlgxt\") pod \"10262f71-e744-4bd2-9a0a-0645feca3a76\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.810371 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10262f71-e744-4bd2-9a0a-0645feca3a76-config-volume\") pod \"10262f71-e744-4bd2-9a0a-0645feca3a76\" (UID: \"10262f71-e744-4bd2-9a0a-0645feca3a76\") " Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.811236 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10262f71-e744-4bd2-9a0a-0645feca3a76-config-volume" (OuterVolumeSpecName: "config-volume") pod "10262f71-e744-4bd2-9a0a-0645feca3a76" (UID: "10262f71-e744-4bd2-9a0a-0645feca3a76"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.813999 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10262f71-e744-4bd2-9a0a-0645feca3a76-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10262f71-e744-4bd2-9a0a-0645feca3a76" (UID: "10262f71-e744-4bd2-9a0a-0645feca3a76"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.814110 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10262f71-e744-4bd2-9a0a-0645feca3a76-kube-api-access-vlgxt" (OuterVolumeSpecName: "kube-api-access-vlgxt") pod "10262f71-e744-4bd2-9a0a-0645feca3a76" (UID: "10262f71-e744-4bd2-9a0a-0645feca3a76"). InnerVolumeSpecName "kube-api-access-vlgxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.912134 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10262f71-e744-4bd2-9a0a-0645feca3a76-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.912182 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlgxt\" (UniqueName: \"kubernetes.io/projected/10262f71-e744-4bd2-9a0a-0645feca3a76-kube-api-access-vlgxt\") on node \"crc\" DevicePath \"\"" Dec 05 17:30:03 crc kubenswrapper[4756]: I1205 17:30:03.912196 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10262f71-e744-4bd2-9a0a-0645feca3a76-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:30:04 crc kubenswrapper[4756]: I1205 17:30:04.052999 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mv4pq" Dec 05 17:30:04 crc kubenswrapper[4756]: I1205 17:30:04.053507 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" Dec 05 17:30:04 crc kubenswrapper[4756]: I1205 17:30:04.062431 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp" event={"ID":"10262f71-e744-4bd2-9a0a-0645feca3a76","Type":"ContainerDied","Data":"c2ecaed0c7c7d9d4d1b1ce7a0d715922fc771dbbd2fc2b1caa15624036b6794a"} Dec 05 17:30:04 crc kubenswrapper[4756]: I1205 17:30:04.062481 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2ecaed0c7c7d9d4d1b1ce7a0d715922fc771dbbd2fc2b1caa15624036b6794a" Dec 05 17:30:04 crc kubenswrapper[4756]: I1205 17:30:04.093113 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mv4pq"] Dec 05 17:30:04 crc kubenswrapper[4756]: I1205 17:30:04.098350 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mv4pq"] Dec 05 17:30:04 crc kubenswrapper[4756]: I1205 17:30:04.697526 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k"] Dec 05 17:30:04 crc kubenswrapper[4756]: I1205 17:30:04.705121 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415885-ptx2k"] Dec 05 17:30:06 crc kubenswrapper[4756]: I1205 17:30:06.062830 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c8f6798-be3d-499c-98fb-ab59439176ff" path="/var/lib/kubelet/pods/1c8f6798-be3d-499c-98fb-ab59439176ff/volumes" Dec 05 17:30:06 crc kubenswrapper[4756]: I1205 17:30:06.063673 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" path="/var/lib/kubelet/pods/c2a3514d-c642-4e86-a9fa-dc27fa6bca81/volumes" Dec 05 17:30:07 crc kubenswrapper[4756]: I1205 17:30:07.050006 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:30:07 crc kubenswrapper[4756]: E1205 17:30:07.050575 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:30:22 crc kubenswrapper[4756]: I1205 17:30:22.049889 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:30:22 crc kubenswrapper[4756]: E1205 17:30:22.050914 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:30:36 crc kubenswrapper[4756]: I1205 17:30:36.049601 4756 scope.go:117] "RemoveContainer" 
containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:30:36 crc kubenswrapper[4756]: E1205 17:30:36.050575 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:30:50 crc kubenswrapper[4756]: I1205 17:30:50.054503 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:30:50 crc kubenswrapper[4756]: E1205 17:30:50.055391 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:30:58 crc kubenswrapper[4756]: I1205 17:30:58.178551 4756 scope.go:117] "RemoveContainer" containerID="3cf229a4daba2cbc5b3f8d04c99533c9d834eeac3bcf221d39628223dd960629" Dec 05 17:31:05 crc kubenswrapper[4756]: I1205 17:31:05.049938 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:31:05 crc kubenswrapper[4756]: E1205 17:31:05.050489 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:31:18 crc kubenswrapper[4756]: I1205 17:31:18.050687 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:31:18 crc kubenswrapper[4756]: E1205 17:31:18.052043 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:31:33 crc kubenswrapper[4756]: I1205 17:31:33.049674 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:31:33 crc kubenswrapper[4756]: E1205 17:31:33.050622 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:31:44 crc kubenswrapper[4756]: I1205 17:31:44.050111 4756 scope.go:117] "RemoveContainer" 
containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:31:44 crc kubenswrapper[4756]: E1205 17:31:44.050879 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:31:58 crc kubenswrapper[4756]: I1205 17:31:58.050576 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:31:58 crc kubenswrapper[4756]: E1205 17:31:58.051591 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:32:12 crc kubenswrapper[4756]: I1205 17:32:12.050282 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:32:12 crc kubenswrapper[4756]: E1205 17:32:12.051264 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:32:27 crc kubenswrapper[4756]: I1205 17:32:27.050060 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:32:27 crc kubenswrapper[4756]: E1205 17:32:27.051587 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:32:39 crc kubenswrapper[4756]: I1205 17:32:39.050365 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:32:39 crc kubenswrapper[4756]: E1205 17:32:39.051020 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:32:52 crc kubenswrapper[4756]: I1205 17:32:52.050349 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:32:52 crc kubenswrapper[4756]: E1205 17:32:52.051435 4756 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:33:03 crc kubenswrapper[4756]: I1205 17:33:03.049528 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:33:03 crc kubenswrapper[4756]: E1205 17:33:03.050403 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:33:15 crc kubenswrapper[4756]: I1205 17:33:15.050187 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:33:15 crc kubenswrapper[4756]: E1205 17:33:15.051058 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:33:26 crc kubenswrapper[4756]: I1205 17:33:26.050345 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:33:26 crc kubenswrapper[4756]: E1205 17:33:26.051224 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:33:40 crc kubenswrapper[4756]: I1205 17:33:40.058454 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:33:40 crc kubenswrapper[4756]: E1205 17:33:40.059155 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:33:54 crc kubenswrapper[4756]: I1205 17:33:54.049777 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:33:54 crc kubenswrapper[4756]: E1205 17:33:54.050357 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:34:06 crc kubenswrapper[4756]: I1205 17:34:06.050017 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:34:06 crc kubenswrapper[4756]: E1205 17:34:06.050972 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:34:20 crc kubenswrapper[4756]: I1205 17:34:20.057971 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:34:20 crc kubenswrapper[4756]: E1205 17:34:20.058972 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:34:32 crc kubenswrapper[4756]: I1205 17:34:32.050208 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:34:32 crc kubenswrapper[4756]: E1205 17:34:32.050988 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:34:46 crc kubenswrapper[4756]: I1205 17:34:46.051621 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:34:46 crc kubenswrapper[4756]: E1205 17:34:46.054681 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:34:58 crc kubenswrapper[4756]: I1205 17:34:58.049705 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e" Dec 05 17:34:58 crc kubenswrapper[4756]: I1205 17:34:58.612883 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"f1c4de2bc9063f8e3fc76ba02b88979769bea7b75c8852b80a31670bb4de72f9"} Dec 05 17:35:58 crc kubenswrapper[4756]: I1205 17:35:58.335792 4756 scope.go:117] "RemoveContainer" 
containerID="f406ad7327ed728f91d2ed9ae789e7b45ce88012aa35ba1292404f5841f44b3a" Dec 05 17:35:58 crc kubenswrapper[4756]: I1205 17:35:58.360575 4756 scope.go:117] "RemoveContainer" containerID="ef4d890a9daaf578cb0949d41f0de5b4772ed1cab47eaf303a260d71affed1ef" Dec 05 17:35:58 crc kubenswrapper[4756]: I1205 17:35:58.385061 4756 scope.go:117] "RemoveContainer" containerID="f5b7486e4c867bf5c23e5e51e2b818cde66b7640e82901d86e47858e76456383" Dec 05 17:37:22 crc kubenswrapper[4756]: I1205 17:37:22.295411 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:37:22 crc kubenswrapper[4756]: I1205 17:37:22.295912 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.019994 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ch5kc"] Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.030110 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ch5kc"] Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.204520 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-99xvt"] Dec 05 17:37:39 crc kubenswrapper[4756]: E1205 17:37:39.205008 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerName="extract-utilities" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.205041 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerName="extract-utilities" Dec 05 17:37:39 crc kubenswrapper[4756]: E1205 17:37:39.205061 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10262f71-e744-4bd2-9a0a-0645feca3a76" containerName="collect-profiles" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.205075 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="10262f71-e744-4bd2-9a0a-0645feca3a76" containerName="collect-profiles" Dec 05 17:37:39 crc kubenswrapper[4756]: E1205 17:37:39.205110 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerName="extract-content" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.205123 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerName="extract-content" Dec 05 17:37:39 crc kubenswrapper[4756]: E1205 17:37:39.205152 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerName="registry-server" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.205163 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerName="registry-server" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.205510 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a3514d-c642-4e86-a9fa-dc27fa6bca81" containerName="registry-server" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.205573 4756 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="10262f71-e744-4bd2-9a0a-0645feca3a76" containerName="collect-profiles" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.206651 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.211040 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.211469 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.211721 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.211951 4756 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-7bf44" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.230698 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-99xvt"] Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.289283 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d8vn\" (UniqueName: \"kubernetes.io/projected/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-kube-api-access-4d8vn\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.289370 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-node-mnt\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.289471 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-crc-storage\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.390644 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d8vn\" (UniqueName: \"kubernetes.io/projected/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-kube-api-access-4d8vn\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.390694 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-node-mnt\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.390729 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-crc-storage\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.391139 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-node-mnt\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.391360 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-crc-storage\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.415819 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d8vn\" (UniqueName: \"kubernetes.io/projected/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-kube-api-access-4d8vn\") pod \"crc-storage-crc-99xvt\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:39 crc kubenswrapper[4756]: I1205 17:37:39.572551 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:40 crc kubenswrapper[4756]: I1205 17:37:40.094559 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 17:37:40 crc kubenswrapper[4756]: I1205 17:37:40.105624 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac6ba13-41b0-4db9-9dfd-9280afe331f0" path="/var/lib/kubelet/pods/aac6ba13-41b0-4db9-9dfd-9280afe331f0/volumes" Dec 05 17:37:40 crc kubenswrapper[4756]: I1205 17:37:40.107205 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-99xvt"] Dec 05 17:37:40 crc kubenswrapper[4756]: I1205 17:37:40.128353 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-99xvt" event={"ID":"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e","Type":"ContainerStarted","Data":"64a012b098dde5e75cd89386d5120098969ccea0ed49c7cc124bab849d548af2"} Dec 05 17:37:41 crc kubenswrapper[4756]: I1205 17:37:41.141882 4756 generic.go:334] "Generic (PLEG): container finished" podID="33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e" containerID="3842b3b1d1f9299be117cda3bccba4c0a417bef95e44d334f78b1621ce8832d3" exitCode=0 Dec 05 17:37:41 crc kubenswrapper[4756]: I1205 17:37:41.142077 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-99xvt" event={"ID":"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e","Type":"ContainerDied","Data":"3842b3b1d1f9299be117cda3bccba4c0a417bef95e44d334f78b1621ce8832d3"} Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.529868 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.647629 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d8vn\" (UniqueName: \"kubernetes.io/projected/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-kube-api-access-4d8vn\") pod \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.647698 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-node-mnt\") pod \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.647737 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-crc-storage\") pod \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\" (UID: \"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e\") " Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.647827 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e" (UID: "33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.648065 4756 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.653136 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-kube-api-access-4d8vn" (OuterVolumeSpecName: "kube-api-access-4d8vn") pod "33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e" (UID: "33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e"). InnerVolumeSpecName "kube-api-access-4d8vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.680825 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e" (UID: "33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.749174 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d8vn\" (UniqueName: \"kubernetes.io/projected/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-kube-api-access-4d8vn\") on node \"crc\" DevicePath \"\"" Dec 05 17:37:42 crc kubenswrapper[4756]: I1205 17:37:42.749204 4756 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 05 17:37:43 crc kubenswrapper[4756]: I1205 17:37:43.160842 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-99xvt" event={"ID":"33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e","Type":"ContainerDied","Data":"64a012b098dde5e75cd89386d5120098969ccea0ed49c7cc124bab849d548af2"} Dec 05 17:37:43 crc kubenswrapper[4756]: I1205 17:37:43.160877 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64a012b098dde5e75cd89386d5120098969ccea0ed49c7cc124bab849d548af2" Dec 05 17:37:43 crc kubenswrapper[4756]: I1205 17:37:43.160904 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-99xvt" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.604542 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-99xvt"] Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.611083 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-99xvt"] Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.749439 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-5vq2c"] Dec 05 17:37:45 crc kubenswrapper[4756]: E1205 17:37:45.749848 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e" containerName="storage" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.749868 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e" containerName="storage" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.750063 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e" containerName="storage" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.750710 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.762348 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.762956 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.763166 4756 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-7bf44" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.763805 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.774250 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5vq2c"] Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.820863 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/347fdc4e-7621-4d4e-86a6-5fee4174e38b-crc-storage\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.820999 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/347fdc4e-7621-4d4e-86a6-5fee4174e38b-node-mnt\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.821043 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln4v6\" (UniqueName: \"kubernetes.io/projected/347fdc4e-7621-4d4e-86a6-5fee4174e38b-kube-api-access-ln4v6\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.922496 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/347fdc4e-7621-4d4e-86a6-5fee4174e38b-crc-storage\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.923116 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/347fdc4e-7621-4d4e-86a6-5fee4174e38b-node-mnt\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.923206 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln4v6\" (UniqueName: \"kubernetes.io/projected/347fdc4e-7621-4d4e-86a6-5fee4174e38b-kube-api-access-ln4v6\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.923554 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/347fdc4e-7621-4d4e-86a6-5fee4174e38b-node-mnt\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " 
pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.923861 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/347fdc4e-7621-4d4e-86a6-5fee4174e38b-crc-storage\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:45 crc kubenswrapper[4756]: I1205 17:37:45.961251 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln4v6\" (UniqueName: \"kubernetes.io/projected/347fdc4e-7621-4d4e-86a6-5fee4174e38b-kube-api-access-ln4v6\") pod \"crc-storage-crc-5vq2c\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:46 crc kubenswrapper[4756]: I1205 17:37:46.066976 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e" path="/var/lib/kubelet/pods/33e404f8-d3c3-4eae-a89c-3f47bb0bbc9e/volumes" Dec 05 17:37:46 crc kubenswrapper[4756]: I1205 17:37:46.076037 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:46 crc kubenswrapper[4756]: I1205 17:37:46.604191 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5vq2c"] Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.203043 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5vq2c" event={"ID":"347fdc4e-7621-4d4e-86a6-5fee4174e38b","Type":"ContainerStarted","Data":"707c99c24d32b986621b2be75ee32d677374071163ba43bd2052688290b31c75"} Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.241750 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8n4f5"] Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.244147 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.247861 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8n4f5"] Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.343766 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-catalog-content\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.343811 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hk5r\" (UniqueName: \"kubernetes.io/projected/7c026d80-1703-4e34-9469-5e5f90266f94-kube-api-access-8hk5r\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.343832 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-utilities\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.449760 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-catalog-content\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.449805 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hk5r\" (UniqueName: \"kubernetes.io/projected/7c026d80-1703-4e34-9469-5e5f90266f94-kube-api-access-8hk5r\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.449830 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-utilities\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.450275 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-utilities\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.450273 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-catalog-content\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.478307 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8hk5r\" (UniqueName: \"kubernetes.io/projected/7c026d80-1703-4e34-9469-5e5f90266f94-kube-api-access-8hk5r\") pod \"redhat-marketplace-8n4f5\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.729599 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:47 crc kubenswrapper[4756]: I1205 17:37:47.985249 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8n4f5"] Dec 05 17:37:47 crc kubenswrapper[4756]: W1205 17:37:47.991655 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c026d80_1703_4e34_9469_5e5f90266f94.slice/crio-a4e660ebcc5d02b235754f07c3cda837ed94d7befc1a71ee11b995bc9459376c WatchSource:0}: Error finding container a4e660ebcc5d02b235754f07c3cda837ed94d7befc1a71ee11b995bc9459376c: Status 404 returned error can't find the container with id a4e660ebcc5d02b235754f07c3cda837ed94d7befc1a71ee11b995bc9459376c Dec 05 17:37:48 crc kubenswrapper[4756]: I1205 17:37:48.213723 4756 generic.go:334] "Generic (PLEG): container finished" podID="347fdc4e-7621-4d4e-86a6-5fee4174e38b" containerID="0387349233dedf204c9d1a598685ad5fc41931d42da74d758d74a18d5b76a791" exitCode=0 Dec 05 17:37:48 crc kubenswrapper[4756]: I1205 17:37:48.213984 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5vq2c" event={"ID":"347fdc4e-7621-4d4e-86a6-5fee4174e38b","Type":"ContainerDied","Data":"0387349233dedf204c9d1a598685ad5fc41931d42da74d758d74a18d5b76a791"} Dec 05 17:37:48 crc kubenswrapper[4756]: I1205 17:37:48.216864 4756 generic.go:334] "Generic (PLEG): container finished" podID="7c026d80-1703-4e34-9469-5e5f90266f94" containerID="cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f" exitCode=0 Dec 05 17:37:48 crc kubenswrapper[4756]: I1205 17:37:48.216923 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n4f5" event={"ID":"7c026d80-1703-4e34-9469-5e5f90266f94","Type":"ContainerDied","Data":"cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f"} Dec 05 17:37:48 crc kubenswrapper[4756]: I1205 17:37:48.216971 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n4f5" event={"ID":"7c026d80-1703-4e34-9469-5e5f90266f94","Type":"ContainerStarted","Data":"a4e660ebcc5d02b235754f07c3cda837ed94d7befc1a71ee11b995bc9459376c"} Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.230668 4756 generic.go:334] "Generic (PLEG): container finished" podID="7c026d80-1703-4e34-9469-5e5f90266f94" containerID="5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2" exitCode=0 Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.230738 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n4f5" event={"ID":"7c026d80-1703-4e34-9469-5e5f90266f94","Type":"ContainerDied","Data":"5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2"} Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.618981 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.762797 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/347fdc4e-7621-4d4e-86a6-5fee4174e38b-crc-storage\") pod \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.762851 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/347fdc4e-7621-4d4e-86a6-5fee4174e38b-node-mnt\") pod \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.762875 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln4v6\" (UniqueName: \"kubernetes.io/projected/347fdc4e-7621-4d4e-86a6-5fee4174e38b-kube-api-access-ln4v6\") pod \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\" (UID: \"347fdc4e-7621-4d4e-86a6-5fee4174e38b\") " Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.763953 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/347fdc4e-7621-4d4e-86a6-5fee4174e38b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "347fdc4e-7621-4d4e-86a6-5fee4174e38b" (UID: "347fdc4e-7621-4d4e-86a6-5fee4174e38b"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.770567 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/347fdc4e-7621-4d4e-86a6-5fee4174e38b-kube-api-access-ln4v6" (OuterVolumeSpecName: "kube-api-access-ln4v6") pod "347fdc4e-7621-4d4e-86a6-5fee4174e38b" (UID: "347fdc4e-7621-4d4e-86a6-5fee4174e38b"). InnerVolumeSpecName "kube-api-access-ln4v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.783788 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/347fdc4e-7621-4d4e-86a6-5fee4174e38b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "347fdc4e-7621-4d4e-86a6-5fee4174e38b" (UID: "347fdc4e-7621-4d4e-86a6-5fee4174e38b"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.866640 4756 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/347fdc4e-7621-4d4e-86a6-5fee4174e38b-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.866676 4756 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/347fdc4e-7621-4d4e-86a6-5fee4174e38b-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 05 17:37:49 crc kubenswrapper[4756]: I1205 17:37:49.866688 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln4v6\" (UniqueName: \"kubernetes.io/projected/347fdc4e-7621-4d4e-86a6-5fee4174e38b-kube-api-access-ln4v6\") on node \"crc\" DevicePath \"\"" Dec 05 17:37:50 crc kubenswrapper[4756]: I1205 17:37:50.243523 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n4f5" event={"ID":"7c026d80-1703-4e34-9469-5e5f90266f94","Type":"ContainerStarted","Data":"781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9"} Dec 05 17:37:50 crc kubenswrapper[4756]: I1205 17:37:50.246130 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5vq2c" event={"ID":"347fdc4e-7621-4d4e-86a6-5fee4174e38b","Type":"ContainerDied","Data":"707c99c24d32b986621b2be75ee32d677374071163ba43bd2052688290b31c75"} Dec 05 17:37:50 crc kubenswrapper[4756]: I1205 17:37:50.246192 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="707c99c24d32b986621b2be75ee32d677374071163ba43bd2052688290b31c75" Dec 05 17:37:50 crc kubenswrapper[4756]: I1205 17:37:50.246224 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5vq2c" Dec 05 17:37:50 crc kubenswrapper[4756]: I1205 17:37:50.265121 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8n4f5" podStartSLOduration=1.869167383 podStartE2EDuration="3.26509812s" podCreationTimestamp="2025-12-05 17:37:47 +0000 UTC" firstStartedPulling="2025-12-05 17:37:48.220436197 +0000 UTC m=+4928.459109719" lastFinishedPulling="2025-12-05 17:37:49.616366944 +0000 UTC m=+4929.855040456" observedRunningTime="2025-12-05 17:37:50.260007745 +0000 UTC m=+4930.498681297" watchObservedRunningTime="2025-12-05 17:37:50.26509812 +0000 UTC m=+4930.503771673" Dec 05 17:37:52 crc kubenswrapper[4756]: I1205 17:37:52.295222 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:37:52 crc kubenswrapper[4756]: I1205 17:37:52.295567 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:37:57 crc kubenswrapper[4756]: I1205 17:37:57.730363 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:57 crc kubenswrapper[4756]: I1205 17:37:57.731242 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:57 crc kubenswrapper[4756]: I1205 17:37:57.796810 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:58 crc kubenswrapper[4756]: I1205 17:37:58.390813 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:37:58 crc kubenswrapper[4756]: I1205 17:37:58.456271 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8n4f5"] Dec 05 17:37:58 crc kubenswrapper[4756]: I1205 17:37:58.460810 4756 scope.go:117] "RemoveContainer" containerID="00e826abeb1e4814f50006f7d5860ca14bfe871c096fbe9022b06a2651165e25" Dec 05 17:38:00 crc kubenswrapper[4756]: I1205 17:38:00.333453 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8n4f5" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" containerName="registry-server" containerID="cri-o://781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9" gracePeriod=2 Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.332556 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.383614 4756 generic.go:334] "Generic (PLEG): container finished" podID="7c026d80-1703-4e34-9469-5e5f90266f94" containerID="781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9" exitCode=0 Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.383658 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n4f5" event={"ID":"7c026d80-1703-4e34-9469-5e5f90266f94","Type":"ContainerDied","Data":"781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9"} Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.383684 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n4f5" event={"ID":"7c026d80-1703-4e34-9469-5e5f90266f94","Type":"ContainerDied","Data":"a4e660ebcc5d02b235754f07c3cda837ed94d7befc1a71ee11b995bc9459376c"} Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.383701 4756 scope.go:117] "RemoveContainer" containerID="781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.383911 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8n4f5" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.410678 4756 scope.go:117] "RemoveContainer" containerID="5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.427749 4756 scope.go:117] "RemoveContainer" containerID="cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.456476 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-catalog-content\") pod \"7c026d80-1703-4e34-9469-5e5f90266f94\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.456676 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hk5r\" (UniqueName: \"kubernetes.io/projected/7c026d80-1703-4e34-9469-5e5f90266f94-kube-api-access-8hk5r\") pod \"7c026d80-1703-4e34-9469-5e5f90266f94\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.456718 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-utilities\") pod \"7c026d80-1703-4e34-9469-5e5f90266f94\" (UID: \"7c026d80-1703-4e34-9469-5e5f90266f94\") " Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.457431 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-utilities" (OuterVolumeSpecName: "utilities") pod "7c026d80-1703-4e34-9469-5e5f90266f94" (UID: "7c026d80-1703-4e34-9469-5e5f90266f94"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.460698 4756 scope.go:117] "RemoveContainer" containerID="781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.462748 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c026d80-1703-4e34-9469-5e5f90266f94-kube-api-access-8hk5r" (OuterVolumeSpecName: "kube-api-access-8hk5r") pod "7c026d80-1703-4e34-9469-5e5f90266f94" (UID: "7c026d80-1703-4e34-9469-5e5f90266f94"). InnerVolumeSpecName "kube-api-access-8hk5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:38:01 crc kubenswrapper[4756]: E1205 17:38:01.468079 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9\": container with ID starting with 781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9 not found: ID does not exist" containerID="781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.468113 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9"} err="failed to get container status \"781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9\": rpc error: code = NotFound desc = could not find container \"781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9\": container with ID starting with 781d200768156412fe9342cf0daea7ac926307c4959b0b027dbb6d4d6c9733a9 not found: ID does not exist" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.468137 4756 scope.go:117] "RemoveContainer" containerID="5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2" Dec 05 17:38:01 crc kubenswrapper[4756]: E1205 17:38:01.469145 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2\": container with ID starting with 5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2 not found: ID does not exist" containerID="5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.469258 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2"} err="failed to get container status \"5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2\": rpc error: code = NotFound desc = could not find container \"5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2\": container with ID starting with 5eb2dd607e54e25a57bf8d7d38d4a049ea5941a6720db08711b136000f4b43b2 not found: ID does not exist" Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.469292 4756 scope.go:117] "RemoveContainer" containerID="cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f" Dec 05 17:38:01 crc kubenswrapper[4756]: E1205 17:38:01.469949 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f\": container with ID starting with cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f not found: ID does not 
exist" containerID="cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f"
Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.469992 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f"} err="failed to get container status \"cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f\": rpc error: code = NotFound desc = could not find container \"cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f\": container with ID starting with cd221104a39da5ab3501be9cf1d733f80c02bb731379a79d3ff26e983c45a99f not found: ID does not exist"
Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.496467 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c026d80-1703-4e34-9469-5e5f90266f94" (UID: "7c026d80-1703-4e34-9469-5e5f90266f94"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.559972 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hk5r\" (UniqueName: \"kubernetes.io/projected/7c026d80-1703-4e34-9469-5e5f90266f94-kube-api-access-8hk5r\") on node \"crc\" DevicePath \"\""
Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.560002 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.560014 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c026d80-1703-4e34-9469-5e5f90266f94-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.726553 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8n4f5"]
Dec 05 17:38:01 crc kubenswrapper[4756]: I1205 17:38:01.734147 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8n4f5"]
Dec 05 17:38:02 crc kubenswrapper[4756]: I1205 17:38:02.066446 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" path="/var/lib/kubelet/pods/7c026d80-1703-4e34-9469-5e5f90266f94/volumes"
Dec 05 17:38:22 crc kubenswrapper[4756]: I1205 17:38:22.295901 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 17:38:22 crc kubenswrapper[4756]: I1205 17:38:22.296632 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 17:38:22 crc kubenswrapper[4756]: I1205 17:38:22.296715 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 17:38:22 crc kubenswrapper[4756]: I1205 17:38:22.297624 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1c4de2bc9063f8e3fc76ba02b88979769bea7b75c8852b80a31670bb4de72f9"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 17:38:22 crc kubenswrapper[4756]: I1205 17:38:22.297719 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://f1c4de2bc9063f8e3fc76ba02b88979769bea7b75c8852b80a31670bb4de72f9" gracePeriod=600
Dec 05 17:38:22 crc kubenswrapper[4756]: I1205 17:38:22.600363 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="f1c4de2bc9063f8e3fc76ba02b88979769bea7b75c8852b80a31670bb4de72f9" exitCode=0
Dec 05 17:38:22 crc kubenswrapper[4756]: I1205 17:38:22.600410 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"f1c4de2bc9063f8e3fc76ba02b88979769bea7b75c8852b80a31670bb4de72f9"}
Dec 05 17:38:22 crc kubenswrapper[4756]: I1205 17:38:22.600461 4756 scope.go:117] "RemoveContainer" containerID="018dc95e3554740d0c2762c4f5c3c10d9130221083b834744135d7253826ab0e"
Dec 05 17:38:23 crc kubenswrapper[4756]: I1205 17:38:23.632900 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"}
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.274041 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gvq8d"]
Dec 05 17:39:27 crc kubenswrapper[4756]: E1205 17:39:27.275557 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" containerName="extract-utilities"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.275589 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" containerName="extract-utilities"
Dec 05 17:39:27 crc kubenswrapper[4756]: E1205 17:39:27.275658 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" containerName="registry-server"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.275677 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" containerName="registry-server"
Dec 05 17:39:27 crc kubenswrapper[4756]: E1205 17:39:27.275706 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" containerName="extract-content"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.275723 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" containerName="extract-content"
Dec 05 17:39:27 crc kubenswrapper[4756]: E1205 17:39:27.275749 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="347fdc4e-7621-4d4e-86a6-5fee4174e38b" containerName="storage"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.275765 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="347fdc4e-7621-4d4e-86a6-5fee4174e38b" containerName="storage"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.276094 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="347fdc4e-7621-4d4e-86a6-5fee4174e38b" containerName="storage"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.276164 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c026d80-1703-4e34-9469-5e5f90266f94" containerName="registry-server"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.278745 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.290658 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gvq8d"]
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.480067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6qrm\" (UniqueName: \"kubernetes.io/projected/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-kube-api-access-h6qrm\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.480117 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-utilities\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.480307 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-catalog-content\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.582077 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6qrm\" (UniqueName: \"kubernetes.io/projected/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-kube-api-access-h6qrm\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.582139 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-utilities\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.582209 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-catalog-content\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.582765 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-catalog-content\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.582985 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-utilities\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.606508 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6qrm\" (UniqueName: \"kubernetes.io/projected/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-kube-api-access-h6qrm\") pod \"redhat-operators-gvq8d\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") " pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:27 crc kubenswrapper[4756]: I1205 17:39:27.902552 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:28 crc kubenswrapper[4756]: I1205 17:39:28.154179 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gvq8d"]
Dec 05 17:39:28 crc kubenswrapper[4756]: I1205 17:39:28.233378 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvq8d" event={"ID":"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc","Type":"ContainerStarted","Data":"cd24e2798b27ef9f07b97250d3b5198985dd940c4333c8731f84a1ad069a55e1"}
Dec 05 17:39:29 crc kubenswrapper[4756]: I1205 17:39:29.241509 4756 generic.go:334] "Generic (PLEG): container finished" podID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerID="e29372e14ca96645c303d5c2caad411b1e49b9f7492bcbf00ec298ac3c35a83c" exitCode=0
Dec 05 17:39:29 crc kubenswrapper[4756]: I1205 17:39:29.241573 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvq8d" event={"ID":"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc","Type":"ContainerDied","Data":"e29372e14ca96645c303d5c2caad411b1e49b9f7492bcbf00ec298ac3c35a83c"}
Dec 05 17:39:30 crc kubenswrapper[4756]: I1205 17:39:30.249764 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvq8d" event={"ID":"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc","Type":"ContainerStarted","Data":"29286855d0ed2714b3ec72d9555cc544af755eed132dac12d866a94e9ed00a01"}
Dec 05 17:39:31 crc kubenswrapper[4756]: I1205 17:39:31.260243 4756 generic.go:334] "Generic (PLEG): container finished" podID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerID="29286855d0ed2714b3ec72d9555cc544af755eed132dac12d866a94e9ed00a01" exitCode=0
Dec 05 17:39:31 crc kubenswrapper[4756]: I1205 17:39:31.260291 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvq8d" event={"ID":"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc","Type":"ContainerDied","Data":"29286855d0ed2714b3ec72d9555cc544af755eed132dac12d866a94e9ed00a01"}
Dec 05 17:39:32 crc kubenswrapper[4756]: I1205 17:39:32.270291 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvq8d" event={"ID":"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc","Type":"ContainerStarted","Data":"a56680f7d8a0e7ad941b0786fe60a06c56a08f4912543e02faf304913ff34087"}
Dec 05 17:39:32 crc kubenswrapper[4756]: I1205 17:39:32.294313 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gvq8d" podStartSLOduration=2.884152678 podStartE2EDuration="5.29428787s" podCreationTimestamp="2025-12-05 17:39:27 +0000 UTC" firstStartedPulling="2025-12-05 17:39:29.243746302 +0000 UTC m=+5029.482419814" lastFinishedPulling="2025-12-05 17:39:31.653881494 +0000 UTC m=+5031.892555006" observedRunningTime="2025-12-05 17:39:32.289714919 +0000 UTC m=+5032.528388521" watchObservedRunningTime="2025-12-05 17:39:32.29428787 +0000 UTC m=+5032.532961402"
Dec 05 17:39:37 crc kubenswrapper[4756]: I1205 17:39:37.903604 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:37 crc kubenswrapper[4756]: I1205 17:39:37.904231 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:37 crc kubenswrapper[4756]: I1205 17:39:37.944124 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:38 crc kubenswrapper[4756]: I1205 17:39:38.430237 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:38 crc kubenswrapper[4756]: I1205 17:39:38.506087 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gvq8d"]
Dec 05 17:39:40 crc kubenswrapper[4756]: I1205 17:39:40.350185 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gvq8d" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerName="registry-server" containerID="cri-o://a56680f7d8a0e7ad941b0786fe60a06c56a08f4912543e02faf304913ff34087" gracePeriod=2
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.377223 4756 generic.go:334] "Generic (PLEG): container finished" podID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerID="a56680f7d8a0e7ad941b0786fe60a06c56a08f4912543e02faf304913ff34087" exitCode=0
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.377326 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvq8d" event={"ID":"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc","Type":"ContainerDied","Data":"a56680f7d8a0e7ad941b0786fe60a06c56a08f4912543e02faf304913ff34087"}
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.377746 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvq8d" event={"ID":"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc","Type":"ContainerDied","Data":"cd24e2798b27ef9f07b97250d3b5198985dd940c4333c8731f84a1ad069a55e1"}
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.377774 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd24e2798b27ef9f07b97250d3b5198985dd940c4333c8731f84a1ad069a55e1"
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.422544 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.432132 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-catalog-content\") pod \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") "
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.432242 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6qrm\" (UniqueName: \"kubernetes.io/projected/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-kube-api-access-h6qrm\") pod \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") "
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.432318 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-utilities\") pod \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\" (UID: \"7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc\") "
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.433470 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-utilities" (OuterVolumeSpecName: "utilities") pod "7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" (UID: "7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.439533 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-kube-api-access-h6qrm" (OuterVolumeSpecName: "kube-api-access-h6qrm") pod "7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" (UID: "7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc"). InnerVolumeSpecName "kube-api-access-h6qrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.533327 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6qrm\" (UniqueName: \"kubernetes.io/projected/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-kube-api-access-h6qrm\") on node \"crc\" DevicePath \"\""
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.533361 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.608651 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" (UID: "7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:39:43 crc kubenswrapper[4756]: I1205 17:39:43.634869 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 17:39:44 crc kubenswrapper[4756]: I1205 17:39:44.387694 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gvq8d"
Dec 05 17:39:44 crc kubenswrapper[4756]: I1205 17:39:44.438821 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gvq8d"]
Dec 05 17:39:44 crc kubenswrapper[4756]: I1205 17:39:44.466025 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gvq8d"]
Dec 05 17:39:46 crc kubenswrapper[4756]: I1205 17:39:46.060297 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" path="/var/lib/kubelet/pods/7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc/volumes"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.644845 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-529t2"]
Dec 05 17:39:48 crc kubenswrapper[4756]: E1205 17:39:48.645495 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerName="registry-server"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.645510 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerName="registry-server"
Dec 05 17:39:48 crc kubenswrapper[4756]: E1205 17:39:48.645524 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerName="extract-utilities"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.645531 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerName="extract-utilities"
Dec 05 17:39:48 crc kubenswrapper[4756]: E1205 17:39:48.645563 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerName="extract-content"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.645569 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerName="extract-content"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.645698 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6a21d0-63d4-4213-9a8e-dfa1b6a76dcc" containerName="registry-server"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.646481 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.653119 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-zq75v"]
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.654453 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.656626 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.656804 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.656912 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.657012 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.657276 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pwzwg"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.678811 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-529t2"]
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.688491 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-zq75v"]
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.810882 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wxcj\" (UniqueName: \"kubernetes.io/projected/aa75605f-d081-44b7-88d4-ef580c4d2ee8-kube-api-access-7wxcj\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.810946 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-config\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.811010 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-dns-svc\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.811042 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-config\") pod \"dnsmasq-dns-7f5d88f885-zq75v\" (UID: \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\") " pod="openstack/dnsmasq-dns-7f5d88f885-zq75v"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.811175 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qrst\" (UniqueName: \"kubernetes.io/projected/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-kube-api-access-2qrst\") pod \"dnsmasq-dns-7f5d88f885-zq75v\" (UID: \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\") " pod="openstack/dnsmasq-dns-7f5d88f885-zq75v"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.912147 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wxcj\" (UniqueName: \"kubernetes.io/projected/aa75605f-d081-44b7-88d4-ef580c4d2ee8-kube-api-access-7wxcj\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.912212 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-config\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.912252 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-dns-svc\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.912281 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-config\") pod \"dnsmasq-dns-7f5d88f885-zq75v\" (UID: \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\") " pod="openstack/dnsmasq-dns-7f5d88f885-zq75v"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.912307 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qrst\" (UniqueName: \"kubernetes.io/projected/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-kube-api-access-2qrst\") pod \"dnsmasq-dns-7f5d88f885-zq75v\" (UID: \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\") " pod="openstack/dnsmasq-dns-7f5d88f885-zq75v"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.913505 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-config\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.914109 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-config\") pod \"dnsmasq-dns-7f5d88f885-zq75v\" (UID: \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\") " pod="openstack/dnsmasq-dns-7f5d88f885-zq75v"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.914395 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-dns-svc\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.928049 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-529t2"]
Dec 05 17:39:48 crc kubenswrapper[4756]: E1205 17:39:48.928678 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7wxcj], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-599f5d6f75-529t2" podUID="aa75605f-d081-44b7-88d4-ef580c4d2ee8"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.954676 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qrst\" (UniqueName: \"kubernetes.io/projected/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-kube-api-access-2qrst\") pod \"dnsmasq-dns-7f5d88f885-zq75v\" (UID: \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\") " pod="openstack/dnsmasq-dns-7f5d88f885-zq75v"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.959525 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-wthjz"]
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.963337 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.964853 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wxcj\" (UniqueName: \"kubernetes.io/projected/aa75605f-d081-44b7-88d4-ef580c4d2ee8-kube-api-access-7wxcj\") pod \"dnsmasq-dns-599f5d6f75-529t2\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") " pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.973018 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-wthjz"]
Dec 05 17:39:48 crc kubenswrapper[4756]: I1205 17:39:48.984219 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.122762 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-dns-svc\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.123059 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tjx\" (UniqueName: \"kubernetes.io/projected/ed17f1ff-6031-4c0e-ab80-fa1425029bde-kube-api-access-s7tjx\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.123085 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-config\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.225079 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7tjx\" (UniqueName: \"kubernetes.io/projected/ed17f1ff-6031-4c0e-ab80-fa1425029bde-kube-api-access-s7tjx\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.225156 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-config\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.225254 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-dns-svc\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.226719 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-config\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.226811 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-dns-svc\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.268438 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7tjx\" (UniqueName: \"kubernetes.io/projected/ed17f1ff-6031-4c0e-ab80-fa1425029bde-kube-api-access-s7tjx\") pod \"dnsmasq-dns-76d8c4d77f-wthjz\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.313070 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-wthjz"]
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.314955 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.333884 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-dzl29"]
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.337666 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.345114 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-dzl29"]
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.430193 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svdcd\" (UniqueName: \"kubernetes.io/projected/b2174c60-6954-4a66-9a3e-f16ead5d484e-kube-api-access-svdcd\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.430230 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-config\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.430288 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.440033 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.462404 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.475545 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t8kpb"]
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.477382 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.497746 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t8kpb"]
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531127 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wxcj\" (UniqueName: \"kubernetes.io/projected/aa75605f-d081-44b7-88d4-ef580c4d2ee8-kube-api-access-7wxcj\") pod \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") "
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531316 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-config\") pod \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") "
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531355 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-dns-svc\") pod \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\" (UID: \"aa75605f-d081-44b7-88d4-ef580c4d2ee8\") "
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531546 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531607 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-catalog-content\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531673 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svdcd\" (UniqueName: \"kubernetes.io/projected/b2174c60-6954-4a66-9a3e-f16ead5d484e-kube-api-access-svdcd\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531706 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-config\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531780 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjfkn\" (UniqueName: \"kubernetes.io/projected/b7e04838-1bc9-46f5-b43b-192f242ac804-kube-api-access-jjfkn\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.531845 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-utilities\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.532330 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-config" (OuterVolumeSpecName: "config") pod "aa75605f-d081-44b7-88d4-ef580c4d2ee8" (UID: "aa75605f-d081-44b7-88d4-ef580c4d2ee8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.532635 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa75605f-d081-44b7-88d4-ef580c4d2ee8" (UID: "aa75605f-d081-44b7-88d4-ef580c4d2ee8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.533268 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.533377 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-config\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.534763 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa75605f-d081-44b7-88d4-ef580c4d2ee8-kube-api-access-7wxcj" (OuterVolumeSpecName: "kube-api-access-7wxcj") pod "aa75605f-d081-44b7-88d4-ef580c4d2ee8" (UID: "aa75605f-d081-44b7-88d4-ef580c4d2ee8"). InnerVolumeSpecName "kube-api-access-7wxcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.547283 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svdcd\" (UniqueName: \"kubernetes.io/projected/b2174c60-6954-4a66-9a3e-f16ead5d484e-kube-api-access-svdcd\") pod \"dnsmasq-dns-7cbb4f659c-dzl29\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.579865 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-zq75v"]
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.638249 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-utilities\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.638791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-catalog-content\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.638855 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjfkn\" (UniqueName: \"kubernetes.io/projected/b7e04838-1bc9-46f5-b43b-192f242ac804-kube-api-access-jjfkn\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.638916 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-config\") on node \"crc\" DevicePath \"\""
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.638930 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa75605f-d081-44b7-88d4-ef580c4d2ee8-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.638943 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wxcj\" (UniqueName: \"kubernetes.io/projected/aa75605f-d081-44b7-88d4-ef580c4d2ee8-kube-api-access-7wxcj\") on node \"crc\" DevicePath \"\""
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.639086 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-utilities\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.639349 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-catalog-content\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.660616 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjfkn\" (UniqueName: \"kubernetes.io/projected/b7e04838-1bc9-46f5-b43b-192f242ac804-kube-api-access-jjfkn\") pod \"community-operators-t8kpb\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.671495 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.805279 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t8kpb"
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.823805 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-wthjz"]
Dec 05 17:39:49 crc kubenswrapper[4756]: W1205 17:39:49.853672 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded17f1ff_6031_4c0e_ab80_fa1425029bde.slice/crio-4c3036b805e0d12f24ea11d253cdfffbd46283c526e23e7999097418eb50b289 WatchSource:0}: Error finding container 4c3036b805e0d12f24ea11d253cdfffbd46283c526e23e7999097418eb50b289: Status 404 returned error can't find the container with id 4c3036b805e0d12f24ea11d253cdfffbd46283c526e23e7999097418eb50b289
Dec 05 17:39:49 crc kubenswrapper[4756]: I1205 17:39:49.941804 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-dzl29"]
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.128941 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.134029 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.139967 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.140339 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.140480 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2bgzw"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.140745 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.140862 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.141021 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.141125 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.169097 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247155 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247294 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-server-conf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247368 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/939b8b9a-81e1-4297-a82e-a590445d9049-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247403 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-config-data\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247498 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247588 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247612 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247634 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhcf\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-kube-api-access-8lhcf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247665 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247686 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.247785 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/939b8b9a-81e1-4297-a82e-a590445d9049-pod-info\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349512 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349574 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-server-conf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349602 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/939b8b9a-81e1-4297-a82e-a590445d9049-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349627 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-config-data\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349650 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349692 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349707 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349726 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhcf\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-kube-api-access-8lhcf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349755 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349775 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.349823 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/939b8b9a-81e1-4297-a82e-a590445d9049-pod-info\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.350821 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.350847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-config-data\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.350983 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.351284 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.352311 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-server-conf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.352593 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.352629 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0aeac3a6231cb0f4bc98e3a95c98c77fc8b23722041dd5c489b5b8ddfbe5e6bd/globalmount\"" pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.354443 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.354727 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/939b8b9a-81e1-4297-a82e-a590445d9049-pod-info\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.354771 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/939b8b9a-81e1-4297-a82e-a590445d9049-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.356163 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.368301 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhcf\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-kube-api-access-8lhcf\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.381183 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"rabbitmq-server-0\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.422476 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t8kpb"]
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.435325 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.436724 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.439531 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.439677 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-466mr"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.439734 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.439841 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.439950 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.439886 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.440217 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.457146 4756 generic.go:334] "Generic (PLEG): container finished" podID="ed17f1ff-6031-4c0e-ab80-fa1425029bde" containerID="82f8b660eba9abc746361302bd8c41e88d77e9e32ad46e5521e16d2186a4fdcf" exitCode=0
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.457210 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz" event={"ID":"ed17f1ff-6031-4c0e-ab80-fa1425029bde","Type":"ContainerDied","Data":"82f8b660eba9abc746361302bd8c41e88d77e9e32ad46e5521e16d2186a4fdcf"}
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.457237 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz" event={"ID":"ed17f1ff-6031-4c0e-ab80-fa1425029bde","Type":"ContainerStarted","Data":"4c3036b805e0d12f24ea11d253cdfffbd46283c526e23e7999097418eb50b289"}
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.459775 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.464413 4756 generic.go:334] "Generic (PLEG): container finished" podID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" containerID="37159267aed5141b5f2ff0e4d4c1f409a6c5d4d8dc9daa73825a68b43887c454" exitCode=0
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.464668 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" event={"ID":"a9d9bdec-a1a5-4504-ad70-5f198fb210a8","Type":"ContainerDied","Data":"37159267aed5141b5f2ff0e4d4c1f409a6c5d4d8dc9daa73825a68b43887c454"}
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.464781 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" event={"ID":"a9d9bdec-a1a5-4504-ad70-5f198fb210a8","Type":"ContainerStarted","Data":"b6354aab8aee1bf9738e5fda1532c75f04a25a484f3a01e2536513da92c2f1b0"}
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.468628 4756 generic.go:334] "Generic (PLEG): container finished" podID="b2174c60-6954-4a66-9a3e-f16ead5d484e" containerID="ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93" exitCode=0
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.468739 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" event={"ID":"b2174c60-6954-4a66-9a3e-f16ead5d484e","Type":"ContainerDied","Data":"ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93"}
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.468773 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" event={"ID":"b2174c60-6954-4a66-9a3e-f16ead5d484e","Type":"ContainerStarted","Data":"f6746e791d5bbd11b7de76618baf050d2bcb37a3ebbfe673f8b3d56f29cc085e"}
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.477071 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599f5d6f75-529t2"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.478303 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8kpb" event={"ID":"b7e04838-1bc9-46f5-b43b-192f242ac804","Type":"ContainerStarted","Data":"040a9ef8e1abdebed245fd6b70de444d53b8133a9f6db1da19eba36f1179d955"}
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.487719 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553352 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553434 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553462 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p42ph\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-kube-api-access-p42ph\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553485 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553515 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553537 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553564 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.553590 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.559867 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.578027 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-529t2"]
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.585149 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-599f5d6f75-529t2"]
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661134 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661182 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661213 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID:
\"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661239 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661258 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661306 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661323 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661349 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661372 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661391 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p42ph\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-kube-api-access-p42ph\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661414 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.661838 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc 
kubenswrapper[4756]: I1205 17:39:50.662472 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.662989 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.663734 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.665563 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.665591 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/34ae40e872a478d717326fdb9340b30a4e079c9fae3013f926bff3ee4eb0b4d3/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.666291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.669975 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.670433 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.670493 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.670557 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.681992 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p42ph\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-kube-api-access-p42ph\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.695222 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.754524 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.758570 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.863552 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-config\") pod \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.863622 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7tjx\" (UniqueName: \"kubernetes.io/projected/ed17f1ff-6031-4c0e-ab80-fa1425029bde-kube-api-access-s7tjx\") pod \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.863663 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-dns-svc\") pod \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\" (UID: \"ed17f1ff-6031-4c0e-ab80-fa1425029bde\") " Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.867789 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed17f1ff-6031-4c0e-ab80-fa1425029bde-kube-api-access-s7tjx" (OuterVolumeSpecName: "kube-api-access-s7tjx") pod "ed17f1ff-6031-4c0e-ab80-fa1425029bde" (UID: "ed17f1ff-6031-4c0e-ab80-fa1425029bde"). InnerVolumeSpecName "kube-api-access-s7tjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.883161 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed17f1ff-6031-4c0e-ab80-fa1425029bde" (UID: "ed17f1ff-6031-4c0e-ab80-fa1425029bde"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.889410 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-config" (OuterVolumeSpecName: "config") pod "ed17f1ff-6031-4c0e-ab80-fa1425029bde" (UID: "ed17f1ff-6031-4c0e-ab80-fa1425029bde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.965748 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.965789 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7tjx\" (UniqueName: \"kubernetes.io/projected/ed17f1ff-6031-4c0e-ab80-fa1425029bde-kube-api-access-s7tjx\") on node \"crc\" DevicePath \"\"" Dec 05 17:39:50 crc kubenswrapper[4756]: I1205 17:39:50.965802 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed17f1ff-6031-4c0e-ab80-fa1425029bde-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.029371 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 17:39:51 crc kubenswrapper[4756]: W1205 17:39:51.036641 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod939b8b9a_81e1_4297_a82e_a590445d9049.slice/crio-5045542884f12b9cc1ee7ae541ab2600543571f67ad3008d7575e0596194ff15 WatchSource:0}: Error finding container 5045542884f12b9cc1ee7ae541ab2600543571f67ad3008d7575e0596194ff15: Status 404 returned error can't find the container with id 5045542884f12b9cc1ee7ae541ab2600543571f67ad3008d7575e0596194ff15 Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.187074 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 17:39:51 crc kubenswrapper[4756]: W1205 17:39:51.189679 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1ce2df7_97eb_43b0_a5c7_37d930a3240c.slice/crio-f07b7a77c7a5f54ff40b2e0507226885451ae6c34524d2fb72d6ad55cddd5a0d WatchSource:0}: Error finding container f07b7a77c7a5f54ff40b2e0507226885451ae6c34524d2fb72d6ad55cddd5a0d: Status 404 returned error can't find the container with id f07b7a77c7a5f54ff40b2e0507226885451ae6c34524d2fb72d6ad55cddd5a0d Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.285382 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 17:39:51 crc kubenswrapper[4756]: E1205 17:39:51.285728 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed17f1ff-6031-4c0e-ab80-fa1425029bde" containerName="init" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.285740 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed17f1ff-6031-4c0e-ab80-fa1425029bde" containerName="init" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.285879 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed17f1ff-6031-4c0e-ab80-fa1425029bde" containerName="init" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.289214 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.293928 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.294188 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cxzlm" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.294353 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.294690 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.301120 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.303810 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.370303 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/969a4b86-3398-4066-9515-34123347f863-config-data-generated\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.370487 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-operator-scripts\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.370604 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/969a4b86-3398-4066-9515-34123347f863-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.370637 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-config-data-default\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.370763 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-kolla-config\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.370840 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969a4b86-3398-4066-9515-34123347f863-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.370905 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stqxw\" (UniqueName: \"kubernetes.io/projected/969a4b86-3398-4066-9515-34123347f863-kube-api-access-stqxw\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.370935 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d78217bc-9f42-40e0-8742-0835e0d76620\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d78217bc-9f42-40e0-8742-0835e0d76620\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472264 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969a4b86-3398-4066-9515-34123347f863-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472336 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stqxw\" (UniqueName: \"kubernetes.io/projected/969a4b86-3398-4066-9515-34123347f863-kube-api-access-stqxw\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472365 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d78217bc-9f42-40e0-8742-0835e0d76620\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d78217bc-9f42-40e0-8742-0835e0d76620\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472434 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/969a4b86-3398-4066-9515-34123347f863-config-data-generated\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-operator-scripts\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472529 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-config-data-default\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472554 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/969a4b86-3398-4066-9515-34123347f863-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472608 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-kolla-config\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.472952 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/969a4b86-3398-4066-9515-34123347f863-config-data-generated\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.473528 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-kolla-config\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.474037 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-config-data-default\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.474995 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969a4b86-3398-4066-9515-34123347f863-operator-scripts\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.476649 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969a4b86-3398-4066-9515-34123347f863-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.477828 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/969a4b86-3398-4066-9515-34123347f863-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.479190 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.479218 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d78217bc-9f42-40e0-8742-0835e0d76620\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d78217bc-9f42-40e0-8742-0835e0d76620\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/44e6586d35bfe8ef5bc2defc566bbc291a6b05063b74b7338d5c2c0070d39d82/globalmount\"" pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.489615 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1ce2df7-97eb-43b0-a5c7-37d930a3240c","Type":"ContainerStarted","Data":"f07b7a77c7a5f54ff40b2e0507226885451ae6c34524d2fb72d6ad55cddd5a0d"} Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.494434 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz" event={"ID":"ed17f1ff-6031-4c0e-ab80-fa1425029bde","Type":"ContainerDied","Data":"4c3036b805e0d12f24ea11d253cdfffbd46283c526e23e7999097418eb50b289"} Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.494499 4756 scope.go:117] "RemoveContainer" containerID="82f8b660eba9abc746361302bd8c41e88d77e9e32ad46e5521e16d2186a4fdcf" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.494656 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76d8c4d77f-wthjz" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.497208 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" event={"ID":"a9d9bdec-a1a5-4504-ad70-5f198fb210a8","Type":"ContainerStarted","Data":"c6034ba412122077a53de6ece70a15b916d9f50f014b5bf0fbc1b84dd5f79412"} Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.497307 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.501110 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" event={"ID":"b2174c60-6954-4a66-9a3e-f16ead5d484e","Type":"ContainerStarted","Data":"ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392"} Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.501184 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.505664 4756 generic.go:334] "Generic (PLEG): container finished" podID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerID="6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1" exitCode=0 Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.505750 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8kpb" event={"ID":"b7e04838-1bc9-46f5-b43b-192f242ac804","Type":"ContainerDied","Data":"6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1"} Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.506889 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stqxw\" (UniqueName: \"kubernetes.io/projected/969a4b86-3398-4066-9515-34123347f863-kube-api-access-stqxw\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: 
I1205 17:39:51.507631 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"939b8b9a-81e1-4297-a82e-a590445d9049","Type":"ContainerStarted","Data":"5045542884f12b9cc1ee7ae541ab2600543571f67ad3008d7575e0596194ff15"} Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.517867 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d78217bc-9f42-40e0-8742-0835e0d76620\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d78217bc-9f42-40e0-8742-0835e0d76620\") pod \"openstack-galera-0\" (UID: \"969a4b86-3398-4066-9515-34123347f863\") " pod="openstack/openstack-galera-0" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.528466 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" podStartSLOduration=2.528416864 podStartE2EDuration="2.528416864s" podCreationTimestamp="2025-12-05 17:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:39:51.522392844 +0000 UTC m=+5051.761066356" watchObservedRunningTime="2025-12-05 17:39:51.528416864 +0000 UTC m=+5051.767090376" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.553707 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" podStartSLOduration=3.553683414 podStartE2EDuration="3.553683414s" podCreationTimestamp="2025-12-05 17:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:39:51.548170048 +0000 UTC m=+5051.786843600" watchObservedRunningTime="2025-12-05 17:39:51.553683414 +0000 UTC m=+5051.792356936" Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.621686 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-wthjz"] Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.636516 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-wthjz"] Dec 05 17:39:51 crc kubenswrapper[4756]: I1205 17:39:51.656044 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.061917 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa75605f-d081-44b7-88d4-ef580c4d2ee8" path="/var/lib/kubelet/pods/aa75605f-d081-44b7-88d4-ef580c4d2ee8/volumes" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.062560 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed17f1ff-6031-4c0e-ab80-fa1425029bde" path="/var/lib/kubelet/pods/ed17f1ff-6031-4c0e-ab80-fa1425029bde/volumes" Dec 05 17:39:52 crc kubenswrapper[4756]: W1205 17:39:52.139071 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod969a4b86_3398_4066_9515_34123347f863.slice/crio-23e5b6223b256f016e637ab5120f23e2a86d14b276bb5ff354b5d758152d94fe WatchSource:0}: Error finding container 23e5b6223b256f016e637ab5120f23e2a86d14b276bb5ff354b5d758152d94fe: Status 404 returned error can't find the container with id 23e5b6223b256f016e637ab5120f23e2a86d14b276bb5ff354b5d758152d94fe Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.157842 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.516525 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1ce2df7-97eb-43b0-a5c7-37d930a3240c","Type":"ContainerStarted","Data":"51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3"} Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.519637 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"969a4b86-3398-4066-9515-34123347f863","Type":"ContainerStarted","Data":"bc8e97ff9a952f465dfa781000bee9d7d989139c8b93f7bbb3eb46933791567b"} Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.519670 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"969a4b86-3398-4066-9515-34123347f863","Type":"ContainerStarted","Data":"23e5b6223b256f016e637ab5120f23e2a86d14b276bb5ff354b5d758152d94fe"} Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.521780 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8kpb" event={"ID":"b7e04838-1bc9-46f5-b43b-192f242ac804","Type":"ContainerStarted","Data":"ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087"} Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.526095 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"939b8b9a-81e1-4297-a82e-a590445d9049","Type":"ContainerStarted","Data":"d392465fd0efb61ea44648c6fda43046358aa06fb08b6458c16cf97d36218a5c"} Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.883295 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.884962 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.890371 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.891040 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.891281 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.899876 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9vz56" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.910753 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.910849 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.910921 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.910969 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-219e28be-1360-4504-9569-1aab9d2ae5ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219e28be-1360-4504-9569-1aab9d2ae5ff\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.911040 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jvjw\" (UniqueName: \"kubernetes.io/projected/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-kube-api-access-2jvjw\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.911095 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.911205 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.911242 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:52 crc kubenswrapper[4756]: I1205 17:39:52.912731 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.012811 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.012887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.012983 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-219e28be-1360-4504-9569-1aab9d2ae5ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219e28be-1360-4504-9569-1aab9d2ae5ff\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.013049 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jvjw\" (UniqueName: \"kubernetes.io/projected/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-kube-api-access-2jvjw\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.013094 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.013168 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.013192 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.013231 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.013373 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.014113 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.014978 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.015347 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.015371 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-219e28be-1360-4504-9569-1aab9d2ae5ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219e28be-1360-4504-9569-1aab9d2ae5ff\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/969172cd9fe5836fe92959cdea6dbe382840f65d2e9f1996be2fb11df9e57926/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.015896 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.018798 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.019030 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.031462 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jvjw\" (UniqueName: 
\"kubernetes.io/projected/dbd67003-7ac6-402a-84ff-91aa8e5c7b37-kube-api-access-2jvjw\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.044946 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-219e28be-1360-4504-9569-1aab9d2ae5ff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219e28be-1360-4504-9569-1aab9d2ae5ff\") pod \"openstack-cell1-galera-0\" (UID: \"dbd67003-7ac6-402a-84ff-91aa8e5c7b37\") " pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.121590 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.122595 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.124763 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.125761 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.126596 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nc2qh" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.140312 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.204466 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.216120 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f97483-1670-469a-afec-002e63d0de2f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.216322 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5f97483-1670-469a-afec-002e63d0de2f-kolla-config\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.216530 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j59hq\" (UniqueName: \"kubernetes.io/projected/f5f97483-1670-469a-afec-002e63d0de2f-kube-api-access-j59hq\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.216634 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f97483-1670-469a-afec-002e63d0de2f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.216684 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/f5f97483-1670-469a-afec-002e63d0de2f-config-data\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.317856 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5f97483-1670-469a-afec-002e63d0de2f-kolla-config\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.317914 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j59hq\" (UniqueName: \"kubernetes.io/projected/f5f97483-1670-469a-afec-002e63d0de2f-kube-api-access-j59hq\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.317935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f97483-1670-469a-afec-002e63d0de2f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.317969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5f97483-1670-469a-afec-002e63d0de2f-config-data\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.318008 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f97483-1670-469a-afec-002e63d0de2f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.319889 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5f97483-1670-469a-afec-002e63d0de2f-config-data\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.320027 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5f97483-1670-469a-afec-002e63d0de2f-kolla-config\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.332380 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f97483-1670-469a-afec-002e63d0de2f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.335488 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f97483-1670-469a-afec-002e63d0de2f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.336037 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j59hq\" (UniqueName: 
\"kubernetes.io/projected/f5f97483-1670-469a-afec-002e63d0de2f-kube-api-access-j59hq\") pod \"memcached-0\" (UID: \"f5f97483-1670-469a-afec-002e63d0de2f\") " pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.440370 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.554037 4756 generic.go:334] "Generic (PLEG): container finished" podID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerID="ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087" exitCode=0 Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.554091 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8kpb" event={"ID":"b7e04838-1bc9-46f5-b43b-192f242ac804","Type":"ContainerDied","Data":"ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087"} Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.643682 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 17:39:53 crc kubenswrapper[4756]: W1205 17:39:53.649894 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbd67003_7ac6_402a_84ff_91aa8e5c7b37.slice/crio-4c686856301f1a38177ec2f7fd85edc4eb79c12ed52e90817e944b652b87aef9 WatchSource:0}: Error finding container 4c686856301f1a38177ec2f7fd85edc4eb79c12ed52e90817e944b652b87aef9: Status 404 returned error can't find the container with id 4c686856301f1a38177ec2f7fd85edc4eb79c12ed52e90817e944b652b87aef9 Dec 05 17:39:53 crc kubenswrapper[4756]: I1205 17:39:53.880231 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 17:39:54 crc kubenswrapper[4756]: I1205 17:39:54.564159 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f5f97483-1670-469a-afec-002e63d0de2f","Type":"ContainerStarted","Data":"e3df4cbd781f76b6ae2e288ac59247369fc0c32509d5ee85da676e4ec0c2b64c"} Dec 05 17:39:54 crc kubenswrapper[4756]: I1205 17:39:54.564207 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f5f97483-1670-469a-afec-002e63d0de2f","Type":"ContainerStarted","Data":"d802c406d3fe1f86abd392e7f43522d42abf46c1eb0618dadadd0fe1f65a793a"} Dec 05 17:39:54 crc kubenswrapper[4756]: I1205 17:39:54.564336 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 17:39:54 crc kubenswrapper[4756]: I1205 17:39:54.566847 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8kpb" event={"ID":"b7e04838-1bc9-46f5-b43b-192f242ac804","Type":"ContainerStarted","Data":"35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6"} Dec 05 17:39:54 crc kubenswrapper[4756]: I1205 17:39:54.568401 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dbd67003-7ac6-402a-84ff-91aa8e5c7b37","Type":"ContainerStarted","Data":"8c5bf9addb06fe32a987c46a3f3cdf1d141ea8635e035c29690c57d887bfac9b"} Dec 05 17:39:54 crc kubenswrapper[4756]: I1205 17:39:54.568476 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dbd67003-7ac6-402a-84ff-91aa8e5c7b37","Type":"ContainerStarted","Data":"4c686856301f1a38177ec2f7fd85edc4eb79c12ed52e90817e944b652b87aef9"} Dec 05 17:39:54 crc kubenswrapper[4756]: I1205 17:39:54.595275 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.5952365240000002 podStartE2EDuration="1.595236524s" podCreationTimestamp="2025-12-05 17:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:39:54.584001896 +0000 UTC m=+5054.822675418" watchObservedRunningTime="2025-12-05 17:39:54.595236524 +0000 UTC m=+5054.833910126" Dec 05 17:39:54 crc kubenswrapper[4756]: I1205 17:39:54.625285 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t8kpb" podStartSLOduration=3.150609347 podStartE2EDuration="5.625261571s" podCreationTimestamp="2025-12-05 17:39:49 +0000 UTC" firstStartedPulling="2025-12-05 17:39:51.525745363 +0000 UTC m=+5051.764418875" lastFinishedPulling="2025-12-05 17:39:54.000397577 +0000 UTC m=+5054.239071099" observedRunningTime="2025-12-05 17:39:54.615685307 +0000 UTC m=+5054.854358849" watchObservedRunningTime="2025-12-05 17:39:54.625261571 +0000 UTC m=+5054.863935113" Dec 05 17:39:56 crc kubenswrapper[4756]: I1205 17:39:56.601059 4756 generic.go:334] "Generic (PLEG): container finished" podID="969a4b86-3398-4066-9515-34123347f863" containerID="bc8e97ff9a952f465dfa781000bee9d7d989139c8b93f7bbb3eb46933791567b" exitCode=0 Dec 05 17:39:56 crc kubenswrapper[4756]: I1205 17:39:56.601153 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"969a4b86-3398-4066-9515-34123347f863","Type":"ContainerDied","Data":"bc8e97ff9a952f465dfa781000bee9d7d989139c8b93f7bbb3eb46933791567b"} Dec 05 17:39:57 crc kubenswrapper[4756]: I1205 17:39:57.615563 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"969a4b86-3398-4066-9515-34123347f863","Type":"ContainerStarted","Data":"afb3cc4313382c53b0cb76fc18e918b5124a7ba22a547b58f3da560cd19bd3b9"} Dec 05 17:39:57 crc kubenswrapper[4756]: I1205 17:39:57.617830 4756 generic.go:334] "Generic (PLEG): container finished" podID="dbd67003-7ac6-402a-84ff-91aa8e5c7b37" containerID="8c5bf9addb06fe32a987c46a3f3cdf1d141ea8635e035c29690c57d887bfac9b" exitCode=0 Dec 05 17:39:57 crc kubenswrapper[4756]: I1205 17:39:57.617904 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dbd67003-7ac6-402a-84ff-91aa8e5c7b37","Type":"ContainerDied","Data":"8c5bf9addb06fe32a987c46a3f3cdf1d141ea8635e035c29690c57d887bfac9b"} Dec 05 17:39:57 crc kubenswrapper[4756]: I1205 17:39:57.645602 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.645578377 podStartE2EDuration="7.645578377s" podCreationTimestamp="2025-12-05 17:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:39:57.644315343 +0000 UTC m=+5057.882988855" watchObservedRunningTime="2025-12-05 17:39:57.645578377 +0000 UTC m=+5057.884251909" Dec 05 17:39:58 crc kubenswrapper[4756]: I1205 17:39:58.625241 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dbd67003-7ac6-402a-84ff-91aa8e5c7b37","Type":"ContainerStarted","Data":"e77b122c70c3dc43d7cd81f31123e6842e6b3c2931704feb375704d199015f8e"} Dec 05 17:39:58 crc kubenswrapper[4756]: I1205 17:39:58.655291 4756 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.655274693 podStartE2EDuration="7.655274693s" podCreationTimestamp="2025-12-05 17:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:39:58.653308121 +0000 UTC m=+5058.891981643" watchObservedRunningTime="2025-12-05 17:39:58.655274693 +0000 UTC m=+5058.893948195" Dec 05 17:39:58 crc kubenswrapper[4756]: I1205 17:39:58.986401 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" Dec 05 17:39:59 crc kubenswrapper[4756]: I1205 17:39:59.673663 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" Dec 05 17:39:59 crc kubenswrapper[4756]: I1205 17:39:59.735370 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-zq75v"] Dec 05 17:39:59 crc kubenswrapper[4756]: I1205 17:39:59.735688 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" podUID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" containerName="dnsmasq-dns" containerID="cri-o://c6034ba412122077a53de6ece70a15b916d9f50f014b5bf0fbc1b84dd5f79412" gracePeriod=10 Dec 05 17:39:59 crc kubenswrapper[4756]: I1205 17:39:59.807739 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t8kpb" Dec 05 17:39:59 crc kubenswrapper[4756]: I1205 17:39:59.808030 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t8kpb" Dec 05 17:39:59 crc kubenswrapper[4756]: I1205 17:39:59.868970 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t8kpb" Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.642188 4756 generic.go:334] "Generic (PLEG): container finished" podID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" containerID="c6034ba412122077a53de6ece70a15b916d9f50f014b5bf0fbc1b84dd5f79412" exitCode=0 Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.642265 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" event={"ID":"a9d9bdec-a1a5-4504-ad70-5f198fb210a8","Type":"ContainerDied","Data":"c6034ba412122077a53de6ece70a15b916d9f50f014b5bf0fbc1b84dd5f79412"} Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.713907 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t8kpb" Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.768197 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t8kpb"] Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.792271 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.942851 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qrst\" (UniqueName: \"kubernetes.io/projected/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-kube-api-access-2qrst\") pod \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\" (UID: \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\") " Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.943079 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-config\") pod \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\" (UID: \"a9d9bdec-a1a5-4504-ad70-5f198fb210a8\") " Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.949134 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-kube-api-access-2qrst" (OuterVolumeSpecName: "kube-api-access-2qrst") pod "a9d9bdec-a1a5-4504-ad70-5f198fb210a8" (UID: "a9d9bdec-a1a5-4504-ad70-5f198fb210a8"). InnerVolumeSpecName "kube-api-access-2qrst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:00 crc kubenswrapper[4756]: I1205 17:40:00.984804 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-config" (OuterVolumeSpecName: "config") pod "a9d9bdec-a1a5-4504-ad70-5f198fb210a8" (UID: "a9d9bdec-a1a5-4504-ad70-5f198fb210a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.045603 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.045660 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qrst\" (UniqueName: \"kubernetes.io/projected/a9d9bdec-a1a5-4504-ad70-5f198fb210a8-kube-api-access-2qrst\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.650956 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" event={"ID":"a9d9bdec-a1a5-4504-ad70-5f198fb210a8","Type":"ContainerDied","Data":"b6354aab8aee1bf9738e5fda1532c75f04a25a484f3a01e2536513da92c2f1b0"} Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.650990 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f5d88f885-zq75v" Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.651045 4756 scope.go:117] "RemoveContainer" containerID="c6034ba412122077a53de6ece70a15b916d9f50f014b5bf0fbc1b84dd5f79412" Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.656125 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.656706 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.678987 4756 scope.go:117] "RemoveContainer" containerID="37159267aed5141b5f2ff0e4d4c1f409a6c5d4d8dc9daa73825a68b43887c454" Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.689914 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-zq75v"] Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.698344 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f5d88f885-zq75v"] Dec 05 17:40:01 crc kubenswrapper[4756]: I1205 17:40:01.736648 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 17:40:02 crc kubenswrapper[4756]: I1205 17:40:02.068359 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" path="/var/lib/kubelet/pods/a9d9bdec-a1a5-4504-ad70-5f198fb210a8/volumes" Dec 05 17:40:02 crc kubenswrapper[4756]: I1205 17:40:02.664302 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t8kpb" podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerName="registry-server" containerID="cri-o://35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6" gracePeriod=2 Dec 05 17:40:02 crc kubenswrapper[4756]: I1205 17:40:02.762139 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.141703 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8kpb" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.180502 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-catalog-content\") pod \"b7e04838-1bc9-46f5-b43b-192f242ac804\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.180603 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-utilities\") pod \"b7e04838-1bc9-46f5-b43b-192f242ac804\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.180713 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjfkn\" (UniqueName: \"kubernetes.io/projected/b7e04838-1bc9-46f5-b43b-192f242ac804-kube-api-access-jjfkn\") pod \"b7e04838-1bc9-46f5-b43b-192f242ac804\" (UID: \"b7e04838-1bc9-46f5-b43b-192f242ac804\") " Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.181748 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-utilities" (OuterVolumeSpecName: "utilities") pod "b7e04838-1bc9-46f5-b43b-192f242ac804" (UID: "b7e04838-1bc9-46f5-b43b-192f242ac804"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.186782 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e04838-1bc9-46f5-b43b-192f242ac804-kube-api-access-jjfkn" (OuterVolumeSpecName: "kube-api-access-jjfkn") pod "b7e04838-1bc9-46f5-b43b-192f242ac804" (UID: "b7e04838-1bc9-46f5-b43b-192f242ac804"). InnerVolumeSpecName "kube-api-access-jjfkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.205224 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.205522 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.258750 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7e04838-1bc9-46f5-b43b-192f242ac804" (UID: "b7e04838-1bc9-46f5-b43b-192f242ac804"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.283253 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.283292 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjfkn\" (UniqueName: \"kubernetes.io/projected/b7e04838-1bc9-46f5-b43b-192f242ac804-kube-api-access-jjfkn\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.283307 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e04838-1bc9-46f5-b43b-192f242ac804-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.442315 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.688284 4756 generic.go:334] "Generic (PLEG): container finished" podID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerID="35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6" exitCode=0 Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.688447 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t8kpb" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.688501 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8kpb" event={"ID":"b7e04838-1bc9-46f5-b43b-192f242ac804","Type":"ContainerDied","Data":"35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6"} Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.689117 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8kpb" event={"ID":"b7e04838-1bc9-46f5-b43b-192f242ac804","Type":"ContainerDied","Data":"040a9ef8e1abdebed245fd6b70de444d53b8133a9f6db1da19eba36f1179d955"} Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.689162 4756 scope.go:117] "RemoveContainer" containerID="35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.740672 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t8kpb"] Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.749478 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t8kpb"] Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.750035 4756 scope.go:117] "RemoveContainer" containerID="ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.779961 4756 scope.go:117] "RemoveContainer" containerID="6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.815898 4756 scope.go:117] "RemoveContainer" containerID="35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6" Dec 05 17:40:03 crc kubenswrapper[4756]: E1205 17:40:03.816632 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6\": container with ID starting with 35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6 not 
found: ID does not exist" containerID="35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.816701 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6"} err="failed to get container status \"35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6\": rpc error: code = NotFound desc = could not find container \"35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6\": container with ID starting with 35cf1f7f888ff332e44703ab90889d93c9bad19d424022169d986942a21673b6 not found: ID does not exist" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.816733 4756 scope.go:117] "RemoveContainer" containerID="ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087" Dec 05 17:40:03 crc kubenswrapper[4756]: E1205 17:40:03.817303 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087\": container with ID starting with ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087 not found: ID does not exist" containerID="ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.817358 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087"} err="failed to get container status \"ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087\": rpc error: code = NotFound desc = could not find container \"ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087\": container with ID starting with ff455624c4dde2c04e056ac7a1418d917898b3139fc5e7de69a1c3b7d10c1087 not found: ID does not exist" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.817396 4756 scope.go:117] "RemoveContainer" containerID="6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1" Dec 05 17:40:03 crc kubenswrapper[4756]: E1205 17:40:03.817753 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1\": container with ID starting with 6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1 not found: ID does not exist" containerID="6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1" Dec 05 17:40:03 crc kubenswrapper[4756]: I1205 17:40:03.817794 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1"} err="failed to get container status \"6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1\": rpc error: code = NotFound desc = could not find container \"6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1\": container with ID starting with 6a81fe96c8e7fe24f6ce83f37f186bc61d3a1e7e374e7c3aa65e0065ec5bbfe1 not found: ID does not exist" Dec 05 17:40:04 crc kubenswrapper[4756]: I1205 17:40:04.065004 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" path="/var/lib/kubelet/pods/b7e04838-1bc9-46f5-b43b-192f242ac804/volumes" Dec 05 17:40:05 crc kubenswrapper[4756]: I1205 17:40:05.379510 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 17:40:05 crc kubenswrapper[4756]: I1205 17:40:05.464182 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 17:40:22 crc kubenswrapper[4756]: I1205 17:40:22.295641 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:40:22 crc kubenswrapper[4756]: I1205 17:40:22.296835 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:40:24 crc kubenswrapper[4756]: I1205 17:40:24.866425 4756 generic.go:334] "Generic (PLEG): container finished" podID="939b8b9a-81e1-4297-a82e-a590445d9049" containerID="d392465fd0efb61ea44648c6fda43046358aa06fb08b6458c16cf97d36218a5c" exitCode=0 Dec 05 17:40:24 crc kubenswrapper[4756]: I1205 17:40:24.866547 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"939b8b9a-81e1-4297-a82e-a590445d9049","Type":"ContainerDied","Data":"d392465fd0efb61ea44648c6fda43046358aa06fb08b6458c16cf97d36218a5c"} Dec 05 17:40:24 crc kubenswrapper[4756]: I1205 17:40:24.868946 4756 generic.go:334] "Generic (PLEG): container finished" podID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" containerID="51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3" exitCode=0 Dec 05 17:40:24 crc kubenswrapper[4756]: I1205 17:40:24.868984 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1ce2df7-97eb-43b0-a5c7-37d930a3240c","Type":"ContainerDied","Data":"51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3"} Dec 05 17:40:25 crc kubenswrapper[4756]: I1205 17:40:25.879810 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"939b8b9a-81e1-4297-a82e-a590445d9049","Type":"ContainerStarted","Data":"2e673948e342d136e4d3c0f12eda415a160eb3633c467acf6db1d569bc51cd48"} Dec 05 17:40:25 crc kubenswrapper[4756]: I1205 17:40:25.880227 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 17:40:25 crc kubenswrapper[4756]: I1205 17:40:25.882214 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1ce2df7-97eb-43b0-a5c7-37d930a3240c","Type":"ContainerStarted","Data":"584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce"} Dec 05 17:40:25 crc kubenswrapper[4756]: I1205 17:40:25.882495 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:25 crc kubenswrapper[4756]: I1205 17:40:25.918956 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.918931409 podStartE2EDuration="36.918931409s" podCreationTimestamp="2025-12-05 17:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:40:25.913781013 +0000 UTC m=+5086.152454565" 
watchObservedRunningTime="2025-12-05 17:40:25.918931409 +0000 UTC m=+5086.157604951" Dec 05 17:40:25 crc kubenswrapper[4756]: I1205 17:40:25.941119 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.941096048 podStartE2EDuration="36.941096048s" podCreationTimestamp="2025-12-05 17:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:40:25.938687304 +0000 UTC m=+5086.177360816" watchObservedRunningTime="2025-12-05 17:40:25.941096048 +0000 UTC m=+5086.179769560" Dec 05 17:40:40 crc kubenswrapper[4756]: I1205 17:40:40.491607 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 17:40:40 crc kubenswrapper[4756]: I1205 17:40:40.761588 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:42 crc kubenswrapper[4756]: E1205 17:40:42.105217 4756 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.189:34302->38.102.83.189:33157: write tcp 38.102.83.189:34302->38.102.83.189:33157: write: connection reset by peer Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.625547 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-9dmcz"] Dec 05 17:40:45 crc kubenswrapper[4756]: E1205 17:40:45.626842 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" containerName="init" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.626862 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" containerName="init" Dec 05 17:40:45 crc kubenswrapper[4756]: E1205 17:40:45.626882 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerName="extract-utilities" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.626890 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerName="extract-utilities" Dec 05 17:40:45 crc kubenswrapper[4756]: E1205 17:40:45.626910 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" containerName="dnsmasq-dns" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.626920 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" containerName="dnsmasq-dns" Dec 05 17:40:45 crc kubenswrapper[4756]: E1205 17:40:45.626927 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerName="registry-server" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.626934 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerName="registry-server" Dec 05 17:40:45 crc kubenswrapper[4756]: E1205 17:40:45.626949 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerName="extract-content" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.626957 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerName="extract-content" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.627152 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b7e04838-1bc9-46f5-b43b-192f242ac804" containerName="registry-server" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.627171 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d9bdec-a1a5-4504-ad70-5f198fb210a8" containerName="dnsmasq-dns" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.628001 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.641456 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-9dmcz"] Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.711939 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-dns-svc\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.712011 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-config\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.712137 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptj7h\" (UniqueName: \"kubernetes.io/projected/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-kube-api-access-ptj7h\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.813760 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-dns-svc\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.813837 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-config\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.813895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptj7h\" (UniqueName: \"kubernetes.io/projected/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-kube-api-access-ptj7h\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.814640 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-dns-svc\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.815118 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-config\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.847376 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptj7h\" (UniqueName: \"kubernetes.io/projected/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-kube-api-access-ptj7h\") pod \"dnsmasq-dns-f79bf7859-9dmcz\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:45 crc kubenswrapper[4756]: I1205 17:40:45.991081 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:46 crc kubenswrapper[4756]: I1205 17:40:46.476553 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-9dmcz"] Dec 05 17:40:46 crc kubenswrapper[4756]: I1205 17:40:46.516497 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 17:40:47 crc kubenswrapper[4756]: I1205 17:40:47.076980 4756 generic.go:334] "Generic (PLEG): container finished" podID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" containerID="398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856" exitCode=0 Dec 05 17:40:47 crc kubenswrapper[4756]: I1205 17:40:47.077300 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" event={"ID":"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4","Type":"ContainerDied","Data":"398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856"} Dec 05 17:40:47 crc kubenswrapper[4756]: I1205 17:40:47.077329 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" event={"ID":"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4","Type":"ContainerStarted","Data":"c73cd581e88cf8778370eb47b0e84ad05eeedbc2078685d1a78b2ec51cc2ca90"} Dec 05 17:40:47 crc kubenswrapper[4756]: I1205 17:40:47.283658 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 17:40:48 crc kubenswrapper[4756]: I1205 17:40:48.085668 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" event={"ID":"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4","Type":"ContainerStarted","Data":"45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21"} Dec 05 17:40:48 crc kubenswrapper[4756]: I1205 17:40:48.085995 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:48 crc kubenswrapper[4756]: I1205 17:40:48.106442 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" podStartSLOduration=3.106407192 podStartE2EDuration="3.106407192s" podCreationTimestamp="2025-12-05 17:40:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:40:48.101069291 +0000 UTC m=+5108.339742823" watchObservedRunningTime="2025-12-05 17:40:48.106407192 +0000 UTC m=+5108.345080704" Dec 05 17:40:50 crc kubenswrapper[4756]: I1205 17:40:50.741703 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="939b8b9a-81e1-4297-a82e-a590445d9049" containerName="rabbitmq" containerID="cri-o://2e673948e342d136e4d3c0f12eda415a160eb3633c467acf6db1d569bc51cd48" gracePeriod=604796 Dec 05 
17:40:51 crc kubenswrapper[4756]: I1205 17:40:51.532907 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" containerName="rabbitmq" containerID="cri-o://584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce" gracePeriod=604796 Dec 05 17:40:52 crc kubenswrapper[4756]: I1205 17:40:52.296062 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:40:52 crc kubenswrapper[4756]: I1205 17:40:52.296730 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:40:55 crc kubenswrapper[4756]: I1205 17:40:55.992625 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.072023 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-dzl29"] Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.072405 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" podUID="b2174c60-6954-4a66-9a3e-f16ead5d484e" containerName="dnsmasq-dns" containerID="cri-o://ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392" gracePeriod=10 Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.520851 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.583435 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-dns-svc\") pod \"b2174c60-6954-4a66-9a3e-f16ead5d484e\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.583580 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-config\") pod \"b2174c60-6954-4a66-9a3e-f16ead5d484e\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.583633 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svdcd\" (UniqueName: \"kubernetes.io/projected/b2174c60-6954-4a66-9a3e-f16ead5d484e-kube-api-access-svdcd\") pod \"b2174c60-6954-4a66-9a3e-f16ead5d484e\" (UID: \"b2174c60-6954-4a66-9a3e-f16ead5d484e\") " Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.593899 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2174c60-6954-4a66-9a3e-f16ead5d484e-kube-api-access-svdcd" (OuterVolumeSpecName: "kube-api-access-svdcd") pod "b2174c60-6954-4a66-9a3e-f16ead5d484e" (UID: "b2174c60-6954-4a66-9a3e-f16ead5d484e"). InnerVolumeSpecName "kube-api-access-svdcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.621569 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b2174c60-6954-4a66-9a3e-f16ead5d484e" (UID: "b2174c60-6954-4a66-9a3e-f16ead5d484e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.624525 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-config" (OuterVolumeSpecName: "config") pod "b2174c60-6954-4a66-9a3e-f16ead5d484e" (UID: "b2174c60-6954-4a66-9a3e-f16ead5d484e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.685058 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.685095 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2174c60-6954-4a66-9a3e-f16ead5d484e-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:56 crc kubenswrapper[4756]: I1205 17:40:56.685106 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svdcd\" (UniqueName: \"kubernetes.io/projected/b2174c60-6954-4a66-9a3e-f16ead5d484e-kube-api-access-svdcd\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.182847 4756 generic.go:334] "Generic (PLEG): container finished" podID="b2174c60-6954-4a66-9a3e-f16ead5d484e" containerID="ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392" exitCode=0 Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.182972 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" event={"ID":"b2174c60-6954-4a66-9a3e-f16ead5d484e","Type":"ContainerDied","Data":"ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392"} Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.183011 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" event={"ID":"b2174c60-6954-4a66-9a3e-f16ead5d484e","Type":"ContainerDied","Data":"f6746e791d5bbd11b7de76618baf050d2bcb37a3ebbfe673f8b3d56f29cc085e"} Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.183083 4756 scope.go:117] "RemoveContainer" containerID="ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.183206 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-dzl29" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.189893 4756 generic.go:334] "Generic (PLEG): container finished" podID="939b8b9a-81e1-4297-a82e-a590445d9049" containerID="2e673948e342d136e4d3c0f12eda415a160eb3633c467acf6db1d569bc51cd48" exitCode=0 Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.189934 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"939b8b9a-81e1-4297-a82e-a590445d9049","Type":"ContainerDied","Data":"2e673948e342d136e4d3c0f12eda415a160eb3633c467acf6db1d569bc51cd48"} Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.268358 4756 scope.go:117] "RemoveContainer" containerID="ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.273642 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-dzl29"] Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.288733 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-dzl29"] Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.293206 4756 scope.go:117] "RemoveContainer" containerID="ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392" Dec 05 17:40:57 crc kubenswrapper[4756]: E1205 17:40:57.293789 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392\": container with ID starting with ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392 not found: ID does not exist" containerID="ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.293823 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392"} err="failed to get container status \"ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392\": rpc error: code = NotFound desc = could not find container \"ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392\": container with ID starting with ae014ad50fa9deff75b63a46671b7e79bcdd19eb9e54e4471f07d73c04701392 not found: ID does not exist" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.293843 4756 scope.go:117] "RemoveContainer" containerID="ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93" Dec 05 17:40:57 crc kubenswrapper[4756]: E1205 17:40:57.294155 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93\": container with ID starting with ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93 not found: ID does not exist" containerID="ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.294201 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93"} err="failed to get container status \"ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93\": rpc error: code = NotFound desc = could not find container \"ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93\": container with ID starting with 
ba10013dd32b41236b556e5e89bc898d0a551f5e3daf8af4e9adc37f75224a93 not found: ID does not exist" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.503899 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.598490 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-erlang-cookie\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.598564 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-confd\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.599057 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.601697 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/939b8b9a-81e1-4297-a82e-a590445d9049-erlang-cookie-secret\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.601751 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lhcf\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-kube-api-access-8lhcf\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.601881 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.601934 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-plugins\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.601988 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-server-conf\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.602006 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-plugins-conf\") pod 
\"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.602025 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-config-data\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.602045 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/939b8b9a-81e1-4297-a82e-a590445d9049-pod-info\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.602066 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-tls\") pod \"939b8b9a-81e1-4297-a82e-a590445d9049\" (UID: \"939b8b9a-81e1-4297-a82e-a590445d9049\") " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.602404 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.603798 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.604960 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.610031 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/939b8b9a-81e1-4297-a82e-a590445d9049-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.612058 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/939b8b9a-81e1-4297-a82e-a590445d9049-pod-info" (OuterVolumeSpecName: "pod-info") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.617669 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-kube-api-access-8lhcf" (OuterVolumeSpecName: "kube-api-access-8lhcf") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). 
InnerVolumeSpecName "kube-api-access-8lhcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.624542 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.634535 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-config-data" (OuterVolumeSpecName: "config-data") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.635318 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38" (OuterVolumeSpecName: "persistence") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.667920 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-server-conf" (OuterVolumeSpecName: "server-conf") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.686440 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "939b8b9a-81e1-4297-a82e-a590445d9049" (UID: "939b8b9a-81e1-4297-a82e-a590445d9049"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704328 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704364 4756 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/939b8b9a-81e1-4297-a82e-a590445d9049-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704374 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lhcf\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-kube-api-access-8lhcf\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704583 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") on node \"crc\" " Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704596 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704606 4756 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704614 4756 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704623 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/939b8b9a-81e1-4297-a82e-a590445d9049-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704631 4756 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/939b8b9a-81e1-4297-a82e-a590445d9049-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.704638 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/939b8b9a-81e1-4297-a82e-a590445d9049-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.721129 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.721263 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38") on node "crc" Dec 05 17:40:57 crc kubenswrapper[4756]: I1205 17:40:57.806644 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.067218 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2174c60-6954-4a66-9a3e-f16ead5d484e" path="/var/lib/kubelet/pods/b2174c60-6954-4a66-9a3e-f16ead5d484e/volumes" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.100980 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212444 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-plugins-conf\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212559 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p42ph\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-kube-api-access-p42ph\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212580 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-confd\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212637 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-config-data\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212666 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-erlang-cookie-secret\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212710 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-server-conf\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212743 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-erlang-cookie\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 
17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212824 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-pod-info\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.212990 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.213061 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-plugins\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.213111 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-tls\") pod \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\" (UID: \"e1ce2df7-97eb-43b0-a5c7-37d930a3240c\") " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.214563 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.216616 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.217472 4756 generic.go:334] "Generic (PLEG): container finished" podID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" containerID="584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce" exitCode=0 Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.217567 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1ce2df7-97eb-43b0-a5c7-37d930a3240c","Type":"ContainerDied","Data":"584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce"} Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.217598 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e1ce2df7-97eb-43b0-a5c7-37d930a3240c","Type":"ContainerDied","Data":"f07b7a77c7a5f54ff40b2e0507226885451ae6c34524d2fb72d6ad55cddd5a0d"} Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.217590 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.217617 4756 scope.go:117] "RemoveContainer" containerID="584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.220263 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.225380 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.225627 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.227769 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-pod-info" (OuterVolumeSpecName: "pod-info") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.227896 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"939b8b9a-81e1-4297-a82e-a590445d9049","Type":"ContainerDied","Data":"5045542884f12b9cc1ee7ae541ab2600543571f67ad3008d7575e0596194ff15"} Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.227974 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.227785 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-kube-api-access-p42ph" (OuterVolumeSpecName: "kube-api-access-p42ph") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "kube-api-access-p42ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.234346 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc" (OuterVolumeSpecName: "persistence") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "pvc-5fde1dc8-f147-4848-860f-3639768765dc". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.259484 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-config-data" (OuterVolumeSpecName: "config-data") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.270770 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-server-conf" (OuterVolumeSpecName: "server-conf") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.310564 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e1ce2df7-97eb-43b0-a5c7-37d930a3240c" (UID: "e1ce2df7-97eb-43b0-a5c7-37d930a3240c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.314827 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p42ph\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-kube-api-access-p42ph\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.314850 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.314860 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.314869 4756 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.314876 4756 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.314885 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.314892 4756 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.315107 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") 
on node \"crc\" " Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.315117 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.315126 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.315134 4756 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e1ce2df7-97eb-43b0-a5c7-37d930a3240c-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.331052 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.331429 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5fde1dc8-f147-4848-860f-3639768765dc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc") on node "crc" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.367936 4756 scope.go:117] "RemoveContainer" containerID="51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.381725 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.388619 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.401253 4756 scope.go:117] "RemoveContainer" containerID="584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce" Dec 05 17:40:58 crc kubenswrapper[4756]: E1205 17:40:58.403626 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce\": container with ID starting with 584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce not found: ID does not exist" containerID="584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.403662 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce"} err="failed to get container status \"584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce\": rpc error: code = NotFound desc = could not find container \"584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce\": container with ID starting with 584b7f8915428d5bc7dc66db35076c84fa787f100fd02b3c92d3a791e19efdce not found: ID does not exist" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.403718 4756 scope.go:117] "RemoveContainer" containerID="51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3" Dec 05 17:40:58 crc kubenswrapper[4756]: E1205 17:40:58.403928 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3\": container with ID starting with 
51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3 not found: ID does not exist" containerID="51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.403953 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3"} err="failed to get container status \"51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3\": rpc error: code = NotFound desc = could not find container \"51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3\": container with ID starting with 51e382dbc6027a5dcaf06b8cec47ef3782406571bc91378fc85269e786c699b3 not found: ID does not exist" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.403966 4756 scope.go:117] "RemoveContainer" containerID="2e673948e342d136e4d3c0f12eda415a160eb3633c467acf6db1d569bc51cd48" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.407573 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 17:40:58 crc kubenswrapper[4756]: E1205 17:40:58.407869 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" containerName="rabbitmq" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.407887 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" containerName="rabbitmq" Dec 05 17:40:58 crc kubenswrapper[4756]: E1205 17:40:58.407911 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939b8b9a-81e1-4297-a82e-a590445d9049" containerName="setup-container" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.407917 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="939b8b9a-81e1-4297-a82e-a590445d9049" containerName="setup-container" Dec 05 17:40:58 crc kubenswrapper[4756]: E1205 17:40:58.407928 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2174c60-6954-4a66-9a3e-f16ead5d484e" containerName="dnsmasq-dns" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.407935 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2174c60-6954-4a66-9a3e-f16ead5d484e" containerName="dnsmasq-dns" Dec 05 17:40:58 crc kubenswrapper[4756]: E1205 17:40:58.407941 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" containerName="setup-container" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.407947 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" containerName="setup-container" Dec 05 17:40:58 crc kubenswrapper[4756]: E1205 17:40:58.407963 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939b8b9a-81e1-4297-a82e-a590445d9049" containerName="rabbitmq" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.407968 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="939b8b9a-81e1-4297-a82e-a590445d9049" containerName="rabbitmq" Dec 05 17:40:58 crc kubenswrapper[4756]: E1205 17:40:58.407982 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2174c60-6954-4a66-9a3e-f16ead5d484e" containerName="init" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.407988 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2174c60-6954-4a66-9a3e-f16ead5d484e" containerName="init" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.408145 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" containerName="rabbitmq" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.408158 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="939b8b9a-81e1-4297-a82e-a590445d9049" containerName="rabbitmq" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.408173 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2174c60-6954-4a66-9a3e-f16ead5d484e" containerName="dnsmasq-dns" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.409521 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.415087 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.415105 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.415144 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2bgzw" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.415193 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.415098 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.415288 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.415150 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.415925 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") on node \"crc\" DevicePath \"\"" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.422612 4756 scope.go:117] "RemoveContainer" containerID="d392465fd0efb61ea44648c6fda43046358aa06fb08b6458c16cf97d36218a5c" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.444190 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.516980 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517041 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czqc2\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-kube-api-access-czqc2\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517073 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517101 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517122 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517149 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517173 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517196 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517220 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517264 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.517282 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.545355 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.552084 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 17:40:58 crc 
kubenswrapper[4756]: I1205 17:40:58.573285 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.577399 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.580110 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-466mr" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.583517 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.585486 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.585508 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.585750 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.586018 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.586255 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.586690 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618395 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618464 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618493 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618537 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " 
pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618756 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618872 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618892 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.618933 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.619669 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.619727 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2f77107-b6d1-42d0-afbd-482dfefc3524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.619900 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.619966 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " 
pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.619988 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.620038 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2f77107-b6d1-42d0-afbd-482dfefc3524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.620075 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.620283 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.620339 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.620372 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czqc2\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-kube-api-access-czqc2\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.620400 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.620437 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmj89\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-kube-api-access-fmj89\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.622163 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc 
kubenswrapper[4756]: I1205 17:40:58.622492 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.622728 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.623010 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.623256 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.625438 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.625574 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0aeac3a6231cb0f4bc98e3a95c98c77fc8b23722041dd5c489b5b8ddfbe5e6bd/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.625487 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.628095 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.629041 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.632326 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.643769 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czqc2\" (UniqueName: \"kubernetes.io/projected/2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc-kube-api-access-czqc2\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.670508 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08d8e5b3-f151-430e-a4c9-5e1337ac2e38\") pod \"rabbitmq-server-0\" (UID: \"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc\") " pod="openstack/rabbitmq-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723350 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723397 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723427 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723464 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723487 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2f77107-b6d1-42d0-afbd-482dfefc3524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723516 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723539 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2f77107-b6d1-42d0-afbd-482dfefc3524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 
crc kubenswrapper[4756]: I1205 17:40:58.723564 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723589 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723622 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723643 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmj89\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-kube-api-access-fmj89\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.723872 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.724160 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.724500 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.725328 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.725452 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f2f77107-b6d1-42d0-afbd-482dfefc3524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.726926 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice...
Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.726959 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/34ae40e872a478d717326fdb9340b30a4e079c9fae3013f926bff3ee4eb0b4d3/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.727314 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f2f77107-b6d1-42d0-afbd-482dfefc3524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.727435 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.727613 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f2f77107-b6d1-42d0-afbd-482dfefc3524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.729144 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.741809 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmj89\" (UniqueName: \"kubernetes.io/projected/f2f77107-b6d1-42d0-afbd-482dfefc3524-kube-api-access-fmj89\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.747444 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 05 17:40:58 crc kubenswrapper[4756]: I1205 17:40:58.756294 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5fde1dc8-f147-4848-860f-3639768765dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5fde1dc8-f147-4848-860f-3639768765dc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f2f77107-b6d1-42d0-afbd-482dfefc3524\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:40:59 crc kubenswrapper[4756]: I1205 17:40:58.944883 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:40:59 crc kubenswrapper[4756]: I1205 17:40:59.214673 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 17:40:59 crc kubenswrapper[4756]: I1205 17:40:59.257167 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc","Type":"ContainerStarted","Data":"8f32710d28df71ff27d8a6abd6e6463b4cf5e18021a343abdad0285b9638f40f"}
Dec 05 17:40:59 crc kubenswrapper[4756]: W1205 17:40:59.938764 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2f77107_b6d1_42d0_afbd_482dfefc3524.slice/crio-99127b4cb3e9312557bec7794d11aaf8b3843d8b77d484cd00478310fe3595ac WatchSource:0}: Error finding container 99127b4cb3e9312557bec7794d11aaf8b3843d8b77d484cd00478310fe3595ac: Status 404 returned error can't find the container with id 99127b4cb3e9312557bec7794d11aaf8b3843d8b77d484cd00478310fe3595ac
Dec 05 17:40:59 crc kubenswrapper[4756]: I1205 17:40:59.943264 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 17:41:00 crc kubenswrapper[4756]: I1205 17:41:00.061893 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="939b8b9a-81e1-4297-a82e-a590445d9049" path="/var/lib/kubelet/pods/939b8b9a-81e1-4297-a82e-a590445d9049/volumes"
Dec 05 17:41:00 crc kubenswrapper[4756]: I1205 17:41:00.063860 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ce2df7-97eb-43b0-a5c7-37d930a3240c" path="/var/lib/kubelet/pods/e1ce2df7-97eb-43b0-a5c7-37d930a3240c/volumes"
Dec 05 17:41:00 crc kubenswrapper[4756]: I1205 17:41:00.269649 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f77107-b6d1-42d0-afbd-482dfefc3524","Type":"ContainerStarted","Data":"99127b4cb3e9312557bec7794d11aaf8b3843d8b77d484cd00478310fe3595ac"}
Dec 05 17:41:01 crc kubenswrapper[4756]: I1205 17:41:01.279734 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc","Type":"ContainerStarted","Data":"7a9e910c605445af65460d1a7af6f5a9671a48b3a7e3b1ba359e011b253f3a37"}
Dec 05 17:41:02 crc kubenswrapper[4756]: I1205 17:41:02.294374 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f77107-b6d1-42d0-afbd-482dfefc3524","Type":"ContainerStarted","Data":"b0804cf812f6fea1cd3fb5966b67b201f4e2179c906dbbbd8a4925e07aeca2a5"}
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.295463 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.296051 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.296093 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.296735 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.296800 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" gracePeriod=600
Dec 05 17:41:22 crc kubenswrapper[4756]: E1205 17:41:22.427942 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.495390 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" exitCode=0
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.495464 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"}
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.495806 4756 scope.go:117] "RemoveContainer" containerID="f1c4de2bc9063f8e3fc76ba02b88979769bea7b75c8852b80a31670bb4de72f9"
Dec 05 17:41:22 crc kubenswrapper[4756]: I1205 17:41:22.496579 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:41:22 crc kubenswrapper[4756]: E1205 17:41:22.497122 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:41:33 crc kubenswrapper[4756]: I1205 17:41:33.611962 4756 generic.go:334] "Generic (PLEG): container finished" podID="2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc" containerID="7a9e910c605445af65460d1a7af6f5a9671a48b3a7e3b1ba359e011b253f3a37" exitCode=0
Dec 05 17:41:33 crc kubenswrapper[4756]: I1205 17:41:33.612219 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc","Type":"ContainerDied","Data":"7a9e910c605445af65460d1a7af6f5a9671a48b3a7e3b1ba359e011b253f3a37"}
Dec 05 17:41:34 crc kubenswrapper[4756]: I1205 17:41:34.621168 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc","Type":"ContainerStarted","Data":"f66da80e047a79b5425114453e46af8a6900376e58a16540f200226e41e2f335"}
Dec 05 17:41:34 crc kubenswrapper[4756]: I1205 17:41:34.621827 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 05 17:41:34 crc kubenswrapper[4756]: I1205 17:41:34.622660 4756 generic.go:334] "Generic (PLEG): container finished" podID="f2f77107-b6d1-42d0-afbd-482dfefc3524" containerID="b0804cf812f6fea1cd3fb5966b67b201f4e2179c906dbbbd8a4925e07aeca2a5" exitCode=0
Dec 05 17:41:34 crc kubenswrapper[4756]: I1205 17:41:34.622696 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f77107-b6d1-42d0-afbd-482dfefc3524","Type":"ContainerDied","Data":"b0804cf812f6fea1cd3fb5966b67b201f4e2179c906dbbbd8a4925e07aeca2a5"}
Dec 05 17:41:34 crc kubenswrapper[4756]: I1205 17:41:34.682323 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.682301266 podStartE2EDuration="36.682301266s" podCreationTimestamp="2025-12-05 17:40:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:41:34.647027289 +0000 UTC m=+5154.885700811" watchObservedRunningTime="2025-12-05 17:41:34.682301266 +0000 UTC m=+5154.920974778"
Dec 05 17:41:35 crc kubenswrapper[4756]: I1205 17:41:35.065363 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:41:35 crc kubenswrapper[4756]: E1205 17:41:35.065687 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:41:35 crc kubenswrapper[4756]: I1205 17:41:35.632305 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f2f77107-b6d1-42d0-afbd-482dfefc3524","Type":"ContainerStarted","Data":"d3c3dbe0a2ebbbd442159066739615df8b1d1dce647c6b5843e20db3a57ffc97"}
Dec 05 17:41:35 crc kubenswrapper[4756]: I1205 17:41:35.632965 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 17:41:35 crc kubenswrapper[4756]: I1205 17:41:35.662113 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.662099979 podStartE2EDuration="37.662099979s" podCreationTimestamp="2025-12-05 17:40:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:41:35.657483456 +0000 UTC m=+5155.896156988" watchObservedRunningTime="2025-12-05 17:41:35.662099979 +0000 UTC m=+5155.900773481"
Dec 05 17:41:46 crc kubenswrapper[4756]: I1205 17:41:46.049691 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:41:46 crc kubenswrapper[4756]: E1205 17:41:46.050770 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:41:48 crc kubenswrapper[4756]: I1205 17:41:48.753630 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 17:41:48 crc kubenswrapper[4756]: I1205 17:41:48.947639 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.028764 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.032665 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.039556 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-22h6q" Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.064156 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.113286 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dbp\" (UniqueName: \"kubernetes.io/projected/7875fd8d-cc12-4405-93ca-d3f2b86faacd-kube-api-access-g6dbp\") pod \"mariadb-client-1-default\" (UID: \"7875fd8d-cc12-4405-93ca-d3f2b86faacd\") " pod="openstack/mariadb-client-1-default" Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.215335 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dbp\" (UniqueName: \"kubernetes.io/projected/7875fd8d-cc12-4405-93ca-d3f2b86faacd-kube-api-access-g6dbp\") pod \"mariadb-client-1-default\" (UID: \"7875fd8d-cc12-4405-93ca-d3f2b86faacd\") " pod="openstack/mariadb-client-1-default" Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.241778 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dbp\" (UniqueName: \"kubernetes.io/projected/7875fd8d-cc12-4405-93ca-d3f2b86faacd-kube-api-access-g6dbp\") pod \"mariadb-client-1-default\" (UID: \"7875fd8d-cc12-4405-93ca-d3f2b86faacd\") " pod="openstack/mariadb-client-1-default" Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.373613 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.777898 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 17:41:54 crc kubenswrapper[4756]: W1205 17:41:54.780078 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7875fd8d_cc12_4405_93ca_d3f2b86faacd.slice/crio-303da3d59498bf1a9a4ed7bb71f5db440455bf75d434f41093cbf670ed6e47a7 WatchSource:0}: Error finding container 303da3d59498bf1a9a4ed7bb71f5db440455bf75d434f41093cbf670ed6e47a7: Status 404 returned error can't find the container with id 303da3d59498bf1a9a4ed7bb71f5db440455bf75d434f41093cbf670ed6e47a7 Dec 05 17:41:54 crc kubenswrapper[4756]: I1205 17:41:54.818286 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"7875fd8d-cc12-4405-93ca-d3f2b86faacd","Type":"ContainerStarted","Data":"303da3d59498bf1a9a4ed7bb71f5db440455bf75d434f41093cbf670ed6e47a7"} Dec 05 17:41:55 crc kubenswrapper[4756]: I1205 17:41:55.829176 4756 generic.go:334] "Generic (PLEG): container finished" podID="7875fd8d-cc12-4405-93ca-d3f2b86faacd" containerID="a5257ada9706b9ac3881b750ad36a18fbe7581f35160a25fe036a60436880e5c" exitCode=0 Dec 05 17:41:55 crc kubenswrapper[4756]: I1205 17:41:55.829231 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"7875fd8d-cc12-4405-93ca-d3f2b86faacd","Type":"ContainerDied","Data":"a5257ada9706b9ac3881b750ad36a18fbe7581f35160a25fe036a60436880e5c"} Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.264830 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.297766 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_7875fd8d-cc12-4405-93ca-d3f2b86faacd/mariadb-client-1-default/0.log" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.335720 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.364072 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.364363 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6dbp\" (UniqueName: \"kubernetes.io/projected/7875fd8d-cc12-4405-93ca-d3f2b86faacd-kube-api-access-g6dbp\") pod \"7875fd8d-cc12-4405-93ca-d3f2b86faacd\" (UID: \"7875fd8d-cc12-4405-93ca-d3f2b86faacd\") " Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.386746 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7875fd8d-cc12-4405-93ca-d3f2b86faacd-kube-api-access-g6dbp" (OuterVolumeSpecName: "kube-api-access-g6dbp") pod "7875fd8d-cc12-4405-93ca-d3f2b86faacd" (UID: "7875fd8d-cc12-4405-93ca-d3f2b86faacd"). InnerVolumeSpecName "kube-api-access-g6dbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.466738 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6dbp\" (UniqueName: \"kubernetes.io/projected/7875fd8d-cc12-4405-93ca-d3f2b86faacd-kube-api-access-g6dbp\") on node \"crc\" DevicePath \"\"" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.848883 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="303da3d59498bf1a9a4ed7bb71f5db440455bf75d434f41093cbf670ed6e47a7" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.848975 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.942113 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 17:41:57 crc kubenswrapper[4756]: E1205 17:41:57.943096 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7875fd8d-cc12-4405-93ca-d3f2b86faacd" containerName="mariadb-client-1-default" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.943147 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7875fd8d-cc12-4405-93ca-d3f2b86faacd" containerName="mariadb-client-1-default" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.943570 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7875fd8d-cc12-4405-93ca-d3f2b86faacd" containerName="mariadb-client-1-default" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.944943 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.948128 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-22h6q" Dec 05 17:41:57 crc kubenswrapper[4756]: I1205 17:41:57.954645 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 17:41:58 crc kubenswrapper[4756]: I1205 17:41:58.062697 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7875fd8d-cc12-4405-93ca-d3f2b86faacd" path="/var/lib/kubelet/pods/7875fd8d-cc12-4405-93ca-d3f2b86faacd/volumes" Dec 05 17:41:58 crc kubenswrapper[4756]: I1205 17:41:58.076676 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhxkr\" (UniqueName: \"kubernetes.io/projected/f1b7c42c-1055-4765-9ec9-f6933daf27fa-kube-api-access-nhxkr\") pod \"mariadb-client-2-default\" (UID: \"f1b7c42c-1055-4765-9ec9-f6933daf27fa\") " pod="openstack/mariadb-client-2-default" Dec 05 17:41:58 crc kubenswrapper[4756]: I1205 17:41:58.178553 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhxkr\" (UniqueName: \"kubernetes.io/projected/f1b7c42c-1055-4765-9ec9-f6933daf27fa-kube-api-access-nhxkr\") pod \"mariadb-client-2-default\" (UID: \"f1b7c42c-1055-4765-9ec9-f6933daf27fa\") " pod="openstack/mariadb-client-2-default" Dec 05 17:41:58 crc kubenswrapper[4756]: I1205 17:41:58.208802 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhxkr\" (UniqueName: \"kubernetes.io/projected/f1b7c42c-1055-4765-9ec9-f6933daf27fa-kube-api-access-nhxkr\") pod \"mariadb-client-2-default\" (UID: \"f1b7c42c-1055-4765-9ec9-f6933daf27fa\") " pod="openstack/mariadb-client-2-default" Dec 05 17:41:58 crc kubenswrapper[4756]: I1205 17:41:58.264751 4756 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 05 17:41:58 crc kubenswrapper[4756]: I1205 17:41:58.889216 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 17:41:59 crc kubenswrapper[4756]: I1205 17:41:59.049839 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:41:59 crc kubenswrapper[4756]: E1205 17:41:59.050113 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:41:59 crc kubenswrapper[4756]: I1205 17:41:59.873364 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"f1b7c42c-1055-4765-9ec9-f6933daf27fa","Type":"ContainerStarted","Data":"43071d64150832e1f7257f592891f10ccaa44dc7605cf77953ae6f6e489e3c57"} Dec 05 17:41:59 crc kubenswrapper[4756]: I1205 17:41:59.873926 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"f1b7c42c-1055-4765-9ec9-f6933daf27fa","Type":"ContainerStarted","Data":"379ac5bc9fc353dd657d30ef0ae23b264265a3ae5dcb902c1a2cba972a36f1d1"} Dec 05 17:41:59 crc kubenswrapper[4756]: I1205 17:41:59.900910 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=2.9008911360000003 podStartE2EDuration="2.900891136s" podCreationTimestamp="2025-12-05 17:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:41:59.895878663 +0000 UTC m=+5180.134552165" watchObservedRunningTime="2025-12-05 17:41:59.900891136 +0000 UTC m=+5180.139564648" Dec 05 17:41:59 crc kubenswrapper[4756]: I1205 17:41:59.961932 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_f1b7c42c-1055-4765-9ec9-f6933daf27fa/mariadb-client-2-default/0.log" Dec 05 17:42:00 crc kubenswrapper[4756]: I1205 17:42:00.885614 4756 generic.go:334] "Generic (PLEG): container finished" podID="f1b7c42c-1055-4765-9ec9-f6933daf27fa" containerID="43071d64150832e1f7257f592891f10ccaa44dc7605cf77953ae6f6e489e3c57" exitCode=1 Dec 05 17:42:00 crc kubenswrapper[4756]: I1205 17:42:00.885660 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"f1b7c42c-1055-4765-9ec9-f6933daf27fa","Type":"ContainerDied","Data":"43071d64150832e1f7257f592891f10ccaa44dc7605cf77953ae6f6e489e3c57"} Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.422471 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.468572 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.475913 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.552252 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhxkr\" (UniqueName: \"kubernetes.io/projected/f1b7c42c-1055-4765-9ec9-f6933daf27fa-kube-api-access-nhxkr\") pod \"f1b7c42c-1055-4765-9ec9-f6933daf27fa\" (UID: \"f1b7c42c-1055-4765-9ec9-f6933daf27fa\") " Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.561812 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1b7c42c-1055-4765-9ec9-f6933daf27fa-kube-api-access-nhxkr" (OuterVolumeSpecName: "kube-api-access-nhxkr") pod "f1b7c42c-1055-4765-9ec9-f6933daf27fa" (UID: "f1b7c42c-1055-4765-9ec9-f6933daf27fa"). InnerVolumeSpecName "kube-api-access-nhxkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.653886 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhxkr\" (UniqueName: \"kubernetes.io/projected/f1b7c42c-1055-4765-9ec9-f6933daf27fa-kube-api-access-nhxkr\") on node \"crc\" DevicePath \"\"" Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.912276 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="379ac5bc9fc353dd657d30ef0ae23b264265a3ae5dcb902c1a2cba972a36f1d1" Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.912358 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.938386 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 05 17:42:02 crc kubenswrapper[4756]: E1205 17:42:02.939163 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b7c42c-1055-4765-9ec9-f6933daf27fa" containerName="mariadb-client-2-default" Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.939219 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b7c42c-1055-4765-9ec9-f6933daf27fa" containerName="mariadb-client-2-default" Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.939554 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b7c42c-1055-4765-9ec9-f6933daf27fa" containerName="mariadb-client-2-default" Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.940493 4756 util.go:30] "No sandbox for pod can be found. 
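Each of these short-lived mariadb-client pods leaves the same PLEG trail: ContainerStarted for the sandbox and container, a "container finished" with an exit code, then ContainerDied. When auditing a long journal like this one, it helps to reduce the stream to just those lifecycle events. A small sketch that filters journal lines for the kubelet's "SyncLoop (PLEG)" records and prints pod, event type, and container ID; the regular expression is keyed to the kubenswrapper format visible above and should be treated as illustrative, not a stable interface:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// plegRe matches kubelet "SyncLoop (PLEG)" records in the form seen above, e.g.
//   kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default"
//   event={"ID":"...","Type":"ContainerDied","Data":"<container-id>"}
var plegRe = regexp.MustCompile(`"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=\{"ID":"[^"]+","Type":"([^"]+)","Data":"([^"]+)"\}`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // some journal lines are very long
	for sc.Scan() {
		if m := plegRe.FindStringSubmatch(sc.Text()); m != nil {
			// m[1] = namespace/pod, m[2] = event type, m[3] = container or sandbox ID
			fmt.Printf("%-45s %-18s %s\n", m[1], m[2], m[3])
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```

Fed something like `journalctl -u kubelet --no-pager` on standard input (assuming the unit is named kubelet, as this journal suggests), it emits one line per container state change instead of one multi-hundred-character record.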
Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.942746 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-22h6q"
Dec 05 17:42:02 crc kubenswrapper[4756]: I1205 17:42:02.949084 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Dec 05 17:42:03 crc kubenswrapper[4756]: I1205 17:42:03.065237 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d46p\" (UniqueName: \"kubernetes.io/projected/1a461663-9809-462b-a289-92e7a0c025ea-kube-api-access-7d46p\") pod \"mariadb-client-1\" (UID: \"1a461663-9809-462b-a289-92e7a0c025ea\") " pod="openstack/mariadb-client-1"
Dec 05 17:42:03 crc kubenswrapper[4756]: I1205 17:42:03.167550 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d46p\" (UniqueName: \"kubernetes.io/projected/1a461663-9809-462b-a289-92e7a0c025ea-kube-api-access-7d46p\") pod \"mariadb-client-1\" (UID: \"1a461663-9809-462b-a289-92e7a0c025ea\") " pod="openstack/mariadb-client-1"
Dec 05 17:42:03 crc kubenswrapper[4756]: I1205 17:42:03.205481 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d46p\" (UniqueName: \"kubernetes.io/projected/1a461663-9809-462b-a289-92e7a0c025ea-kube-api-access-7d46p\") pod \"mariadb-client-1\" (UID: \"1a461663-9809-462b-a289-92e7a0c025ea\") " pod="openstack/mariadb-client-1"
Dec 05 17:42:03 crc kubenswrapper[4756]: I1205 17:42:03.269173 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Dec 05 17:42:03 crc kubenswrapper[4756]: I1205 17:42:03.812610 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Dec 05 17:42:03 crc kubenswrapper[4756]: W1205 17:42:03.818884 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a461663_9809_462b_a289_92e7a0c025ea.slice/crio-92ef8de23e73a622e51f166a6bcc13ad12b87fa30864354a7f38462333dc95ca WatchSource:0}: Error finding container 92ef8de23e73a622e51f166a6bcc13ad12b87fa30864354a7f38462333dc95ca: Status 404 returned error can't find the container with id 92ef8de23e73a622e51f166a6bcc13ad12b87fa30864354a7f38462333dc95ca
Dec 05 17:42:03 crc kubenswrapper[4756]: I1205 17:42:03.923796 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"1a461663-9809-462b-a289-92e7a0c025ea","Type":"ContainerStarted","Data":"92ef8de23e73a622e51f166a6bcc13ad12b87fa30864354a7f38462333dc95ca"}
Dec 05 17:42:04 crc kubenswrapper[4756]: I1205 17:42:04.060228 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b7c42c-1055-4765-9ec9-f6933daf27fa" path="/var/lib/kubelet/pods/f1b7c42c-1055-4765-9ec9-f6933daf27fa/volumes"
Dec 05 17:42:04 crc kubenswrapper[4756]: I1205 17:42:04.935098 4756 generic.go:334] "Generic (PLEG): container finished" podID="1a461663-9809-462b-a289-92e7a0c025ea" containerID="142a5da11e0eb0e54b9412c5e504f31d6cdc1511a6de841c4168bd18837e9f9b" exitCode=0
Dec 05 17:42:04 crc kubenswrapper[4756]: I1205 17:42:04.935159 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"1a461663-9809-462b-a289-92e7a0c025ea","Type":"ContainerDied","Data":"142a5da11e0eb0e54b9412c5e504f31d6cdc1511a6de841c4168bd18837e9f9b"}
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.378339 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.401382 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_1a461663-9809-462b-a289-92e7a0c025ea/mariadb-client-1/0.log"
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.435900 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"]
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.441100 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"]
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.526531 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d46p\" (UniqueName: \"kubernetes.io/projected/1a461663-9809-462b-a289-92e7a0c025ea-kube-api-access-7d46p\") pod \"1a461663-9809-462b-a289-92e7a0c025ea\" (UID: \"1a461663-9809-462b-a289-92e7a0c025ea\") "
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.533411 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a461663-9809-462b-a289-92e7a0c025ea-kube-api-access-7d46p" (OuterVolumeSpecName: "kube-api-access-7d46p") pod "1a461663-9809-462b-a289-92e7a0c025ea" (UID: "1a461663-9809-462b-a289-92e7a0c025ea"). InnerVolumeSpecName "kube-api-access-7d46p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.628395 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d46p\" (UniqueName: \"kubernetes.io/projected/1a461663-9809-462b-a289-92e7a0c025ea-kube-api-access-7d46p\") on node \"crc\" DevicePath \"\""
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.906258 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 05 17:42:06 crc kubenswrapper[4756]: E1205 17:42:06.906969 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a461663-9809-462b-a289-92e7a0c025ea" containerName="mariadb-client-1"
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.907019 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a461663-9809-462b-a289-92e7a0c025ea" containerName="mariadb-client-1"
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.907408 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a461663-9809-462b-a289-92e7a0c025ea" containerName="mariadb-client-1"
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.908616 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.914335 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.960739 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ef8de23e73a622e51f166a6bcc13ad12b87fa30864354a7f38462333dc95ca"
Dec 05 17:42:06 crc kubenswrapper[4756]: I1205 17:42:06.960800 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Dec 05 17:42:07 crc kubenswrapper[4756]: I1205 17:42:07.035370 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lbnt\" (UniqueName: \"kubernetes.io/projected/7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7-kube-api-access-2lbnt\") pod \"mariadb-client-4-default\" (UID: \"7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7\") " pod="openstack/mariadb-client-4-default"
Dec 05 17:42:07 crc kubenswrapper[4756]: I1205 17:42:07.136804 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lbnt\" (UniqueName: \"kubernetes.io/projected/7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7-kube-api-access-2lbnt\") pod \"mariadb-client-4-default\" (UID: \"7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7\") " pod="openstack/mariadb-client-4-default"
Dec 05 17:42:07 crc kubenswrapper[4756]: I1205 17:42:07.164337 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lbnt\" (UniqueName: \"kubernetes.io/projected/7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7-kube-api-access-2lbnt\") pod \"mariadb-client-4-default\" (UID: \"7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7\") " pod="openstack/mariadb-client-4-default"
Dec 05 17:42:07 crc kubenswrapper[4756]: I1205 17:42:07.242668 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Dec 05 17:42:07 crc kubenswrapper[4756]: I1205 17:42:07.807186 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 05 17:42:07 crc kubenswrapper[4756]: W1205 17:42:07.815402 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e08833f_2c7c_47b3_aaf8_c3fa736ba6c7.slice/crio-31681ccf0af696a2569d26b4a26ad8ed929d492cd04436f41173a3856b4366db WatchSource:0}: Error finding container 31681ccf0af696a2569d26b4a26ad8ed929d492cd04436f41173a3856b4366db: Status 404 returned error can't find the container with id 31681ccf0af696a2569d26b4a26ad8ed929d492cd04436f41173a3856b4366db
Dec 05 17:42:07 crc kubenswrapper[4756]: I1205 17:42:07.973723 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7","Type":"ContainerStarted","Data":"31681ccf0af696a2569d26b4a26ad8ed929d492cd04436f41173a3856b4366db"}
Dec 05 17:42:08 crc kubenswrapper[4756]: I1205 17:42:08.070667 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a461663-9809-462b-a289-92e7a0c025ea" path="/var/lib/kubelet/pods/1a461663-9809-462b-a289-92e7a0c025ea/volumes"
Dec 05 17:42:08 crc kubenswrapper[4756]: I1205 17:42:08.982537 4756 generic.go:334] "Generic (PLEG): container finished" podID="7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7" containerID="9f4d2f47b98a59cb595396df6af312e4fdd7fe8112acc149c4f16bc9c161840d" exitCode=0
Dec 05 17:42:08 crc kubenswrapper[4756]: I1205 17:42:08.982620 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7","Type":"ContainerDied","Data":"9f4d2f47b98a59cb595396df6af312e4fdd7fe8112acc149c4f16bc9c161840d"}
Dec 05 17:42:10 crc kubenswrapper[4756]: I1205 17:42:10.439032 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Dec 05 17:42:10 crc kubenswrapper[4756]: I1205 17:42:10.480838 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7/mariadb-client-4-default/0.log"
Dec 05 17:42:10 crc kubenswrapper[4756]: I1205 17:42:10.520581 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 05 17:42:10 crc kubenswrapper[4756]: I1205 17:42:10.527044 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 05 17:42:10 crc kubenswrapper[4756]: I1205 17:42:10.595710 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lbnt\" (UniqueName: \"kubernetes.io/projected/7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7-kube-api-access-2lbnt\") pod \"7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7\" (UID: \"7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7\") "
Dec 05 17:42:10 crc kubenswrapper[4756]: I1205 17:42:10.606143 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7-kube-api-access-2lbnt" (OuterVolumeSpecName: "kube-api-access-2lbnt") pod "7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7" (UID: "7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7"). InnerVolumeSpecName "kube-api-access-2lbnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:42:10 crc kubenswrapper[4756]: I1205 17:42:10.697825 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lbnt\" (UniqueName: \"kubernetes.io/projected/7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7-kube-api-access-2lbnt\") on node \"crc\" DevicePath \"\""
Dec 05 17:42:11 crc kubenswrapper[4756]: I1205 17:42:11.009925 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31681ccf0af696a2569d26b4a26ad8ed929d492cd04436f41173a3856b4366db"
Dec 05 17:42:11 crc kubenswrapper[4756]: I1205 17:42:11.010001 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Dec 05 17:42:11 crc kubenswrapper[4756]: I1205 17:42:11.050198 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:42:11 crc kubenswrapper[4756]: E1205 17:42:11.051291 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:42:12 crc kubenswrapper[4756]: I1205 17:42:12.066087 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7" path="/var/lib/kubelet/pods/7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7/volumes"
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.172736 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 05 17:42:14 crc kubenswrapper[4756]: E1205 17:42:14.173584 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7" containerName="mariadb-client-4-default"
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.173605 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7" containerName="mariadb-client-4-default"
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.173881 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e08833f-2c7c-47b3-aaf8-c3fa736ba6c7" containerName="mariadb-client-4-default"
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.174715 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
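The interleaved "Cleaned up orphaned pod volumes dir" records show the kubelet's orphan reaper removing /var/lib/kubelet/pods/<pod-UID>/volumes once a deleted pod's volumes have been unmounted; the "Failed to delete cgroup paths" entry further down shows the matching cgroup cleanup racing systemd. A read-only sketch for listing which per-pod volume directories still exist on a node, using only the path layout visible in these records (everything beyond that layout is illustrative):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Per-pod state lives under /var/lib/kubelet/pods/<pod-UID>/, with
	// mounted volumes under .../volumes, matching the paths in the
	// "Cleaned up orphaned pod volumes dir" records above.
	base := "/var/lib/kubelet/pods"
	entries, err := os.ReadDir(base)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, e := range entries {
		if !e.IsDir() {
			continue
		}
		volumes := filepath.Join(base, e.Name(), "volumes")
		if info, err := os.Stat(volumes); err == nil && info.IsDir() {
			fmt.Println(volumes) // volume dir still present for this pod UID
		}
	}
}
```

Any UID printed here that no longer corresponds to a pod known to the API server is a candidate for the kubelet's next orphaned-volume cleanup pass, like the ones logged in this section.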
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.179884 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-22h6q"
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.189586 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.254579 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpqxn\" (UniqueName: \"kubernetes.io/projected/96232644-6788-4f5b-a7b7-48434b12f454-kube-api-access-wpqxn\") pod \"mariadb-client-5-default\" (UID: \"96232644-6788-4f5b-a7b7-48434b12f454\") " pod="openstack/mariadb-client-5-default"
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.355795 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpqxn\" (UniqueName: \"kubernetes.io/projected/96232644-6788-4f5b-a7b7-48434b12f454-kube-api-access-wpqxn\") pod \"mariadb-client-5-default\" (UID: \"96232644-6788-4f5b-a7b7-48434b12f454\") " pod="openstack/mariadb-client-5-default"
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.381318 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpqxn\" (UniqueName: \"kubernetes.io/projected/96232644-6788-4f5b-a7b7-48434b12f454-kube-api-access-wpqxn\") pod \"mariadb-client-5-default\" (UID: \"96232644-6788-4f5b-a7b7-48434b12f454\") " pod="openstack/mariadb-client-5-default"
Dec 05 17:42:14 crc kubenswrapper[4756]: I1205 17:42:14.510823 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Dec 05 17:42:15 crc kubenswrapper[4756]: I1205 17:42:15.111222 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 05 17:42:16 crc kubenswrapper[4756]: I1205 17:42:16.060249 4756 generic.go:334] "Generic (PLEG): container finished" podID="96232644-6788-4f5b-a7b7-48434b12f454" containerID="8e8f7a421a65d2e331ab773c0785a23d75d01d86309429c64fcec97cf2a0d5ef" exitCode=0
Dec 05 17:42:16 crc kubenswrapper[4756]: I1205 17:42:16.060300 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"96232644-6788-4f5b-a7b7-48434b12f454","Type":"ContainerDied","Data":"8e8f7a421a65d2e331ab773c0785a23d75d01d86309429c64fcec97cf2a0d5ef"}
Dec 05 17:42:16 crc kubenswrapper[4756]: I1205 17:42:16.060778 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"96232644-6788-4f5b-a7b7-48434b12f454","Type":"ContainerStarted","Data":"e1171366545d9bc95fc0a5eb69c0592544637251109faca20ba413d7d977bc43"}
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.560464 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.582180 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_96232644-6788-4f5b-a7b7-48434b12f454/mariadb-client-5-default/0.log"
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.620549 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.628623 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.710980 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpqxn\" (UniqueName: \"kubernetes.io/projected/96232644-6788-4f5b-a7b7-48434b12f454-kube-api-access-wpqxn\") pod \"96232644-6788-4f5b-a7b7-48434b12f454\" (UID: \"96232644-6788-4f5b-a7b7-48434b12f454\") "
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.718285 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96232644-6788-4f5b-a7b7-48434b12f454-kube-api-access-wpqxn" (OuterVolumeSpecName: "kube-api-access-wpqxn") pod "96232644-6788-4f5b-a7b7-48434b12f454" (UID: "96232644-6788-4f5b-a7b7-48434b12f454"). InnerVolumeSpecName "kube-api-access-wpqxn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.790233 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 05 17:42:17 crc kubenswrapper[4756]: E1205 17:42:17.790777 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96232644-6788-4f5b-a7b7-48434b12f454" containerName="mariadb-client-5-default"
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.790810 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="96232644-6788-4f5b-a7b7-48434b12f454" containerName="mariadb-client-5-default"
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.791125 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="96232644-6788-4f5b-a7b7-48434b12f454" containerName="mariadb-client-5-default"
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.792322 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.809574 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.815613 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpqxn\" (UniqueName: \"kubernetes.io/projected/96232644-6788-4f5b-a7b7-48434b12f454-kube-api-access-wpqxn\") on node \"crc\" DevicePath \"\""
Dec 05 17:42:17 crc kubenswrapper[4756]: I1205 17:42:17.917615 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jqg\" (UniqueName: \"kubernetes.io/projected/bae5f640-6ae3-45e6-a286-3381c66bf728-kube-api-access-d8jqg\") pod \"mariadb-client-6-default\" (UID: \"bae5f640-6ae3-45e6-a286-3381c66bf728\") " pod="openstack/mariadb-client-6-default"
Dec 05 17:42:18 crc kubenswrapper[4756]: I1205 17:42:18.019388 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jqg\" (UniqueName: \"kubernetes.io/projected/bae5f640-6ae3-45e6-a286-3381c66bf728-kube-api-access-d8jqg\") pod \"mariadb-client-6-default\" (UID: \"bae5f640-6ae3-45e6-a286-3381c66bf728\") " pod="openstack/mariadb-client-6-default"
Dec 05 17:42:18 crc kubenswrapper[4756]: I1205 17:42:18.049106 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jqg\" (UniqueName: \"kubernetes.io/projected/bae5f640-6ae3-45e6-a286-3381c66bf728-kube-api-access-d8jqg\") pod \"mariadb-client-6-default\" (UID: \"bae5f640-6ae3-45e6-a286-3381c66bf728\") " pod="openstack/mariadb-client-6-default"
Dec 05 17:42:18 crc kubenswrapper[4756]: I1205 17:42:18.062004 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96232644-6788-4f5b-a7b7-48434b12f454" path="/var/lib/kubelet/pods/96232644-6788-4f5b-a7b7-48434b12f454/volumes"
Dec 05 17:42:18 crc kubenswrapper[4756]: I1205 17:42:18.085324 4756 scope.go:117] "RemoveContainer" containerID="8e8f7a421a65d2e331ab773c0785a23d75d01d86309429c64fcec97cf2a0d5ef"
Dec 05 17:42:18 crc kubenswrapper[4756]: I1205 17:42:18.085457 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Dec 05 17:42:18 crc kubenswrapper[4756]: I1205 17:42:18.120556 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Dec 05 17:42:18 crc kubenswrapper[4756]: I1205 17:42:18.812591 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 05 17:42:19 crc kubenswrapper[4756]: I1205 17:42:19.094970 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"bae5f640-6ae3-45e6-a286-3381c66bf728","Type":"ContainerStarted","Data":"715e3a7ba1a866ee482d82e1a7b8a2faaaf1e706d783a24fc069fcb58f454e84"}
Dec 05 17:42:19 crc kubenswrapper[4756]: I1205 17:42:19.095026 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"bae5f640-6ae3-45e6-a286-3381c66bf728","Type":"ContainerStarted","Data":"423bfd25dcff8b5355cc85fbb4d103994053919ae53b529a3cd7922d3dd1f676"}
Dec 05 17:42:19 crc kubenswrapper[4756]: I1205 17:42:19.111550 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=2.111534808 podStartE2EDuration="2.111534808s" podCreationTimestamp="2025-12-05 17:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:42:19.109593926 +0000 UTC m=+5199.348267488" watchObservedRunningTime="2025-12-05 17:42:19.111534808 +0000 UTC m=+5199.350208320"
Dec 05 17:42:20 crc kubenswrapper[4756]: I1205 17:42:20.110312 4756 generic.go:334] "Generic (PLEG): container finished" podID="bae5f640-6ae3-45e6-a286-3381c66bf728" containerID="715e3a7ba1a866ee482d82e1a7b8a2faaaf1e706d783a24fc069fcb58f454e84" exitCode=1
Dec 05 17:42:20 crc kubenswrapper[4756]: I1205 17:42:20.110393 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"bae5f640-6ae3-45e6-a286-3381c66bf728","Type":"ContainerDied","Data":"715e3a7ba1a866ee482d82e1a7b8a2faaaf1e706d783a24fc069fcb58f454e84"}
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.518786 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.555245 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.563039 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.579129 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8jqg\" (UniqueName: \"kubernetes.io/projected/bae5f640-6ae3-45e6-a286-3381c66bf728-kube-api-access-d8jqg\") pod \"bae5f640-6ae3-45e6-a286-3381c66bf728\" (UID: \"bae5f640-6ae3-45e6-a286-3381c66bf728\") "
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.585216 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae5f640-6ae3-45e6-a286-3381c66bf728-kube-api-access-d8jqg" (OuterVolumeSpecName: "kube-api-access-d8jqg") pod "bae5f640-6ae3-45e6-a286-3381c66bf728" (UID: "bae5f640-6ae3-45e6-a286-3381c66bf728"). InnerVolumeSpecName "kube-api-access-d8jqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.681583 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8jqg\" (UniqueName: \"kubernetes.io/projected/bae5f640-6ae3-45e6-a286-3381c66bf728-kube-api-access-d8jqg\") on node \"crc\" DevicePath \"\""
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.715067 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 05 17:42:21 crc kubenswrapper[4756]: E1205 17:42:21.715550 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae5f640-6ae3-45e6-a286-3381c66bf728" containerName="mariadb-client-6-default"
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.715572 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae5f640-6ae3-45e6-a286-3381c66bf728" containerName="mariadb-client-6-default"
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.715748 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae5f640-6ae3-45e6-a286-3381c66bf728" containerName="mariadb-client-6-default"
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.716452 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.725819 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.783658 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fck7m\" (UniqueName: \"kubernetes.io/projected/37de2ccf-f20f-4714-9f06-8d9363a5e939-kube-api-access-fck7m\") pod \"mariadb-client-7-default\" (UID: \"37de2ccf-f20f-4714-9f06-8d9363a5e939\") " pod="openstack/mariadb-client-7-default"
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.884810 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fck7m\" (UniqueName: \"kubernetes.io/projected/37de2ccf-f20f-4714-9f06-8d9363a5e939-kube-api-access-fck7m\") pod \"mariadb-client-7-default\" (UID: \"37de2ccf-f20f-4714-9f06-8d9363a5e939\") " pod="openstack/mariadb-client-7-default"
Dec 05 17:42:21 crc kubenswrapper[4756]: I1205 17:42:21.916659 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fck7m\" (UniqueName: \"kubernetes.io/projected/37de2ccf-f20f-4714-9f06-8d9363a5e939-kube-api-access-fck7m\") pod \"mariadb-client-7-default\" (UID: \"37de2ccf-f20f-4714-9f06-8d9363a5e939\") " pod="openstack/mariadb-client-7-default"
Dec 05 17:42:22 crc kubenswrapper[4756]: I1205 17:42:22.046483 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Dec 05 17:42:22 crc kubenswrapper[4756]: I1205 17:42:22.066504 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae5f640-6ae3-45e6-a286-3381c66bf728" path="/var/lib/kubelet/pods/bae5f640-6ae3-45e6-a286-3381c66bf728/volumes"
Dec 05 17:42:22 crc kubenswrapper[4756]: I1205 17:42:22.134932 4756 scope.go:117] "RemoveContainer" containerID="715e3a7ba1a866ee482d82e1a7b8a2faaaf1e706d783a24fc069fcb58f454e84"
Dec 05 17:42:22 crc kubenswrapper[4756]: I1205 17:42:22.134983 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Dec 05 17:42:22 crc kubenswrapper[4756]: I1205 17:42:22.608262 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 05 17:42:23 crc kubenswrapper[4756]: I1205 17:42:23.145504 4756 generic.go:334] "Generic (PLEG): container finished" podID="37de2ccf-f20f-4714-9f06-8d9363a5e939" containerID="6ec4bbc0ea895b014c2cc25bc26cae3caad2fd43e852bec1c7a401ede64188b6" exitCode=0
Dec 05 17:42:23 crc kubenswrapper[4756]: I1205 17:42:23.145591 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"37de2ccf-f20f-4714-9f06-8d9363a5e939","Type":"ContainerDied","Data":"6ec4bbc0ea895b014c2cc25bc26cae3caad2fd43e852bec1c7a401ede64188b6"}
Dec 05 17:42:23 crc kubenswrapper[4756]: I1205 17:42:23.145919 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"37de2ccf-f20f-4714-9f06-8d9363a5e939","Type":"ContainerStarted","Data":"a2fb64cef8f16840f595f3a9f38220d9137de2ed587441699ec0640238641dc9"}
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.576930 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.594475 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_37de2ccf-f20f-4714-9f06-8d9363a5e939/mariadb-client-7-default/0.log"
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.621891 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.627555 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.629382 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fck7m\" (UniqueName: \"kubernetes.io/projected/37de2ccf-f20f-4714-9f06-8d9363a5e939-kube-api-access-fck7m\") pod \"37de2ccf-f20f-4714-9f06-8d9363a5e939\" (UID: \"37de2ccf-f20f-4714-9f06-8d9363a5e939\") "
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.639452 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37de2ccf-f20f-4714-9f06-8d9363a5e939-kube-api-access-fck7m" (OuterVolumeSpecName: "kube-api-access-fck7m") pod "37de2ccf-f20f-4714-9f06-8d9363a5e939" (UID: "37de2ccf-f20f-4714-9f06-8d9363a5e939"). InnerVolumeSpecName "kube-api-access-fck7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.732222 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fck7m\" (UniqueName: \"kubernetes.io/projected/37de2ccf-f20f-4714-9f06-8d9363a5e939-kube-api-access-fck7m\") on node \"crc\" DevicePath \"\""
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.811353 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"]
Dec 05 17:42:24 crc kubenswrapper[4756]: E1205 17:42:24.811881 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37de2ccf-f20f-4714-9f06-8d9363a5e939" containerName="mariadb-client-7-default"
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.811911 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="37de2ccf-f20f-4714-9f06-8d9363a5e939" containerName="mariadb-client-7-default"
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.812352 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="37de2ccf-f20f-4714-9f06-8d9363a5e939" containerName="mariadb-client-7-default"
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.813193 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2"
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.834732 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"]
Dec 05 17:42:24 crc kubenswrapper[4756]: I1205 17:42:24.937254 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnd7m\" (UniqueName: \"kubernetes.io/projected/ef4b788d-53b5-41a3-a107-dbdf3fc6aa26-kube-api-access-mnd7m\") pod \"mariadb-client-2\" (UID: \"ef4b788d-53b5-41a3-a107-dbdf3fc6aa26\") " pod="openstack/mariadb-client-2"
Dec 05 17:42:25 crc kubenswrapper[4756]: I1205 17:42:25.038559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnd7m\" (UniqueName: \"kubernetes.io/projected/ef4b788d-53b5-41a3-a107-dbdf3fc6aa26-kube-api-access-mnd7m\") pod \"mariadb-client-2\" (UID: \"ef4b788d-53b5-41a3-a107-dbdf3fc6aa26\") " pod="openstack/mariadb-client-2"
Dec 05 17:42:25 crc kubenswrapper[4756]: I1205 17:42:25.049853 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:42:25 crc kubenswrapper[4756]: E1205 17:42:25.050378 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:42:25 crc kubenswrapper[4756]: I1205 17:42:25.061407 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnd7m\" (UniqueName: \"kubernetes.io/projected/ef4b788d-53b5-41a3-a107-dbdf3fc6aa26-kube-api-access-mnd7m\") pod \"mariadb-client-2\" (UID: \"ef4b788d-53b5-41a3-a107-dbdf3fc6aa26\") " pod="openstack/mariadb-client-2"
Dec 05 17:42:25 crc kubenswrapper[4756]: I1205 17:42:25.169254 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2fb64cef8f16840f595f3a9f38220d9137de2ed587441699ec0640238641dc9"
Dec 05 17:42:25 crc kubenswrapper[4756]: I1205 17:42:25.169338 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Dec 05 17:42:25 crc kubenswrapper[4756]: I1205 17:42:25.188309 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2"
Dec 05 17:42:25 crc kubenswrapper[4756]: I1205 17:42:25.744150 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"]
Dec 05 17:42:25 crc kubenswrapper[4756]: W1205 17:42:25.746719 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef4b788d_53b5_41a3_a107_dbdf3fc6aa26.slice/crio-e22c29f2ea8965f2ecbba0b5b04e55d1681cbe00211b43addc64fa92cd5c9268 WatchSource:0}: Error finding container e22c29f2ea8965f2ecbba0b5b04e55d1681cbe00211b43addc64fa92cd5c9268: Status 404 returned error can't find the container with id e22c29f2ea8965f2ecbba0b5b04e55d1681cbe00211b43addc64fa92cd5c9268
Dec 05 17:42:26 crc kubenswrapper[4756]: I1205 17:42:26.061597 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37de2ccf-f20f-4714-9f06-8d9363a5e939" path="/var/lib/kubelet/pods/37de2ccf-f20f-4714-9f06-8d9363a5e939/volumes"
Dec 05 17:42:26 crc kubenswrapper[4756]: I1205 17:42:26.180596 4756 generic.go:334] "Generic (PLEG): container finished" podID="ef4b788d-53b5-41a3-a107-dbdf3fc6aa26" containerID="dc950a936bd6999ca6ac22f26131182322b32933c6920346acac367bc88e8c2d" exitCode=0
Dec 05 17:42:26 crc kubenswrapper[4756]: I1205 17:42:26.180678 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"ef4b788d-53b5-41a3-a107-dbdf3fc6aa26","Type":"ContainerDied","Data":"dc950a936bd6999ca6ac22f26131182322b32933c6920346acac367bc88e8c2d"}
Dec 05 17:42:26 crc kubenswrapper[4756]: I1205 17:42:26.180736 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"ef4b788d-53b5-41a3-a107-dbdf3fc6aa26","Type":"ContainerStarted","Data":"e22c29f2ea8965f2ecbba0b5b04e55d1681cbe00211b43addc64fa92cd5c9268"}
Dec 05 17:42:27 crc kubenswrapper[4756]: I1205 17:42:27.554385 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2"
Dec 05 17:42:27 crc kubenswrapper[4756]: I1205 17:42:27.572989 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_ef4b788d-53b5-41a3-a107-dbdf3fc6aa26/mariadb-client-2/0.log"
Dec 05 17:42:27 crc kubenswrapper[4756]: I1205 17:42:27.603397 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"]
Dec 05 17:42:27 crc kubenswrapper[4756]: I1205 17:42:27.615687 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"]
Dec 05 17:42:27 crc kubenswrapper[4756]: I1205 17:42:27.685587 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnd7m\" (UniqueName: \"kubernetes.io/projected/ef4b788d-53b5-41a3-a107-dbdf3fc6aa26-kube-api-access-mnd7m\") pod \"ef4b788d-53b5-41a3-a107-dbdf3fc6aa26\" (UID: \"ef4b788d-53b5-41a3-a107-dbdf3fc6aa26\") "
Dec 05 17:42:27 crc kubenswrapper[4756]: I1205 17:42:27.691976 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4b788d-53b5-41a3-a107-dbdf3fc6aa26-kube-api-access-mnd7m" (OuterVolumeSpecName: "kube-api-access-mnd7m") pod "ef4b788d-53b5-41a3-a107-dbdf3fc6aa26" (UID: "ef4b788d-53b5-41a3-a107-dbdf3fc6aa26"). InnerVolumeSpecName "kube-api-access-mnd7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:42:27 crc kubenswrapper[4756]: I1205 17:42:27.787692 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnd7m\" (UniqueName: \"kubernetes.io/projected/ef4b788d-53b5-41a3-a107-dbdf3fc6aa26-kube-api-access-mnd7m\") on node \"crc\" DevicePath \"\""
Dec 05 17:42:28 crc kubenswrapper[4756]: I1205 17:42:28.066654 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4b788d-53b5-41a3-a107-dbdf3fc6aa26" path="/var/lib/kubelet/pods/ef4b788d-53b5-41a3-a107-dbdf3fc6aa26/volumes"
Dec 05 17:42:28 crc kubenswrapper[4756]: I1205 17:42:28.203156 4756 scope.go:117] "RemoveContainer" containerID="dc950a936bd6999ca6ac22f26131182322b32933c6920346acac367bc88e8c2d"
Dec 05 17:42:28 crc kubenswrapper[4756]: I1205 17:42:28.203254 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2"
Dec 05 17:42:37 crc kubenswrapper[4756]: I1205 17:42:37.051183 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:42:37 crc kubenswrapper[4756]: E1205 17:42:37.052277 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:42:48 crc kubenswrapper[4756]: I1205 17:42:48.050520 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:42:48 crc kubenswrapper[4756]: E1205 17:42:48.052178 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:42:48 crc kubenswrapper[4756]: I1205 17:42:48.103526 4756 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod96232644-6788-4f5b-a7b7-48434b12f454"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod96232644-6788-4f5b-a7b7-48434b12f454] : Timed out while waiting for systemd to remove kubepods-besteffort-pod96232644_6788_4f5b_a7b7_48434b12f454.slice"
Dec 05 17:43:03 crc kubenswrapper[4756]: I1205 17:43:03.050143 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:43:03 crc kubenswrapper[4756]: E1205 17:43:03.051244 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:43:16 crc kubenswrapper[4756]: I1205 17:43:16.049963 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:43:16 crc kubenswrapper[4756]: E1205 17:43:16.050894 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:43:31 crc kubenswrapper[4756]: I1205 17:43:31.051054 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:43:31 crc kubenswrapper[4756]: E1205 17:43:31.052037 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:43:46 crc kubenswrapper[4756]: I1205 17:43:46.051272 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:43:46 crc kubenswrapper[4756]: E1205 17:43:46.052652 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:43:57 crc kubenswrapper[4756]: I1205 17:43:57.050109 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:43:57 crc kubenswrapper[4756]: E1205 17:43:57.051170 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:43:58 crc kubenswrapper[4756]: I1205 17:43:58.755756 4756 scope.go:117] "RemoveContainer" containerID="3842b3b1d1f9299be117cda3bccba4c0a417bef95e44d334f78b1621ce8832d3"
Dec 05 17:44:11 crc kubenswrapper[4756]: I1205 17:44:11.050297 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2"
Dec 05 17:44:11 crc kubenswrapper[4756]: E1205 17:44:11.051678 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:44:25 crc kubenswrapper[4756]: I1205 17:44:25.050879 4756 scope.go:117] "RemoveContainer"
containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:44:25 crc kubenswrapper[4756]: E1205 17:44:25.052159 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:44:39 crc kubenswrapper[4756]: I1205 17:44:39.050768 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:44:39 crc kubenswrapper[4756]: E1205 17:44:39.052123 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:44:52 crc kubenswrapper[4756]: I1205 17:44:52.050976 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:44:52 crc kubenswrapper[4756]: E1205 17:44:52.052395 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.175964 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75"] Dec 05 17:45:00 crc kubenswrapper[4756]: E1205 17:45:00.177400 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4b788d-53b5-41a3-a107-dbdf3fc6aa26" containerName="mariadb-client-2" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.177477 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4b788d-53b5-41a3-a107-dbdf3fc6aa26" containerName="mariadb-client-2" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.177874 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4b788d-53b5-41a3-a107-dbdf3fc6aa26" containerName="mariadb-client-2" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.179050 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.183309 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.183918 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.185945 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75"] Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.296370 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-secret-volume\") pod \"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.296459 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbghm\" (UniqueName: \"kubernetes.io/projected/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-kube-api-access-jbghm\") pod \"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.296534 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-config-volume\") pod \"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.398454 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbghm\" (UniqueName: \"kubernetes.io/projected/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-kube-api-access-jbghm\") pod \"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.398531 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-config-volume\") pod \"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.398591 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-secret-volume\") pod \"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.399686 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-config-volume\") pod 
\"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.405132 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-secret-volume\") pod \"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.424084 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbghm\" (UniqueName: \"kubernetes.io/projected/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-kube-api-access-jbghm\") pod \"collect-profiles-29415945-9cs75\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.513651 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:00 crc kubenswrapper[4756]: I1205 17:45:00.933073 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75"] Dec 05 17:45:01 crc kubenswrapper[4756]: I1205 17:45:01.667653 4756 generic.go:334] "Generic (PLEG): container finished" podID="d9cde916-c76d-4c7b-8fa6-2e7632818f2c" containerID="7553c094e1595a33ff5d20362be6cc3e66fd159e6f376684479e298a7bde4070" exitCode=0 Dec 05 17:45:01 crc kubenswrapper[4756]: I1205 17:45:01.667720 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" event={"ID":"d9cde916-c76d-4c7b-8fa6-2e7632818f2c","Type":"ContainerDied","Data":"7553c094e1595a33ff5d20362be6cc3e66fd159e6f376684479e298a7bde4070"} Dec 05 17:45:01 crc kubenswrapper[4756]: I1205 17:45:01.667758 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" event={"ID":"d9cde916-c76d-4c7b-8fa6-2e7632818f2c","Type":"ContainerStarted","Data":"7e9e7117411b1412a47b01cbcc2b61d8705699a9f841478153497627698cef7b"} Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.049936 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:45:03 crc kubenswrapper[4756]: E1205 17:45:03.050671 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.158156 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.244683 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-secret-volume\") pod \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.245346 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-config-volume\") pod \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.245533 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbghm\" (UniqueName: \"kubernetes.io/projected/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-kube-api-access-jbghm\") pod \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\" (UID: \"d9cde916-c76d-4c7b-8fa6-2e7632818f2c\") " Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.246466 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9cde916-c76d-4c7b-8fa6-2e7632818f2c" (UID: "d9cde916-c76d-4c7b-8fa6-2e7632818f2c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.252361 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d9cde916-c76d-4c7b-8fa6-2e7632818f2c" (UID: "d9cde916-c76d-4c7b-8fa6-2e7632818f2c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.254677 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-kube-api-access-jbghm" (OuterVolumeSpecName: "kube-api-access-jbghm") pod "d9cde916-c76d-4c7b-8fa6-2e7632818f2c" (UID: "d9cde916-c76d-4c7b-8fa6-2e7632818f2c"). InnerVolumeSpecName "kube-api-access-jbghm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.348536 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.348591 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbghm\" (UniqueName: \"kubernetes.io/projected/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-kube-api-access-jbghm\") on node \"crc\" DevicePath \"\"" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.348613 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9cde916-c76d-4c7b-8fa6-2e7632818f2c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.685349 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" event={"ID":"d9cde916-c76d-4c7b-8fa6-2e7632818f2c","Type":"ContainerDied","Data":"7e9e7117411b1412a47b01cbcc2b61d8705699a9f841478153497627698cef7b"} Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.685386 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e9e7117411b1412a47b01cbcc2b61d8705699a9f841478153497627698cef7b" Dec 05 17:45:03 crc kubenswrapper[4756]: I1205 17:45:03.685874 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75" Dec 05 17:45:04 crc kubenswrapper[4756]: I1205 17:45:04.268897 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns"] Dec 05 17:45:04 crc kubenswrapper[4756]: I1205 17:45:04.276518 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415900-g8cns"] Dec 05 17:45:06 crc kubenswrapper[4756]: I1205 17:45:06.066633 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14bb829d-b636-4040-a51c-121ba1163192" path="/var/lib/kubelet/pods/14bb829d-b636-4040-a51c-121ba1163192/volumes" Dec 05 17:45:16 crc kubenswrapper[4756]: I1205 17:45:16.049916 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:45:16 crc kubenswrapper[4756]: E1205 17:45:16.052804 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:45:31 crc kubenswrapper[4756]: I1205 17:45:31.050635 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:45:31 crc kubenswrapper[4756]: E1205 17:45:31.051716 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:45:46 crc kubenswrapper[4756]: I1205 17:45:46.050257 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:45:46 crc kubenswrapper[4756]: E1205 17:45:46.051266 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:45:57 crc kubenswrapper[4756]: I1205 17:45:57.049816 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:45:57 crc kubenswrapper[4756]: E1205 17:45:57.050806 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:45:58 crc kubenswrapper[4756]: I1205 17:45:58.863566 4756 scope.go:117] "RemoveContainer" containerID="e29372e14ca96645c303d5c2caad411b1e49b9f7492bcbf00ec298ac3c35a83c" Dec 05 17:45:58 crc kubenswrapper[4756]: I1205 17:45:58.901562 4756 scope.go:117] "RemoveContainer" containerID="29286855d0ed2714b3ec72d9555cc544af755eed132dac12d866a94e9ed00a01" Dec 05 17:45:58 crc kubenswrapper[4756]: I1205 17:45:58.989109 4756 scope.go:117] "RemoveContainer" containerID="a56680f7d8a0e7ad941b0786fe60a06c56a08f4912543e02faf304913ff34087" Dec 05 17:45:59 crc kubenswrapper[4756]: I1205 17:45:59.021988 4756 scope.go:117] "RemoveContainer" containerID="685747f7f6066d7b8cf513d37c6955b3490ea1c3a2e22a5cecaa20cf2500928d" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.145303 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 05 17:46:09 crc kubenswrapper[4756]: E1205 17:46:09.146821 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cde916-c76d-4c7b-8fa6-2e7632818f2c" containerName="collect-profiles" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.146876 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cde916-c76d-4c7b-8fa6-2e7632818f2c" containerName="collect-profiles" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.147295 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cde916-c76d-4c7b-8fa6-2e7632818f2c" containerName="collect-profiles" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.148467 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.154377 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-22h6q" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.163843 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.227699 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhk54\" (UniqueName: \"kubernetes.io/projected/a7d9dfae-05d0-49c7-85d5-360812e2abb5-kube-api-access-fhk54\") pod \"mariadb-copy-data\" (UID: \"a7d9dfae-05d0-49c7-85d5-360812e2abb5\") " pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.227843 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1be34eff-2459-432f-8043-64371ab1dcb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1be34eff-2459-432f-8043-64371ab1dcb2\") pod \"mariadb-copy-data\" (UID: \"a7d9dfae-05d0-49c7-85d5-360812e2abb5\") " pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.330158 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1be34eff-2459-432f-8043-64371ab1dcb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1be34eff-2459-432f-8043-64371ab1dcb2\") pod \"mariadb-copy-data\" (UID: \"a7d9dfae-05d0-49c7-85d5-360812e2abb5\") " pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.330388 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhk54\" (UniqueName: \"kubernetes.io/projected/a7d9dfae-05d0-49c7-85d5-360812e2abb5-kube-api-access-fhk54\") pod \"mariadb-copy-data\" (UID: \"a7d9dfae-05d0-49c7-85d5-360812e2abb5\") " pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.334933 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.335007 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1be34eff-2459-432f-8043-64371ab1dcb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1be34eff-2459-432f-8043-64371ab1dcb2\") pod \"mariadb-copy-data\" (UID: \"a7d9dfae-05d0-49c7-85d5-360812e2abb5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bfe79b86ef10ec0605955d07e3fd640c87b17152b79436890acba216b6ecd479/globalmount\"" pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.368833 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhk54\" (UniqueName: \"kubernetes.io/projected/a7d9dfae-05d0-49c7-85d5-360812e2abb5-kube-api-access-fhk54\") pod \"mariadb-copy-data\" (UID: \"a7d9dfae-05d0-49c7-85d5-360812e2abb5\") " pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.387618 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1be34eff-2459-432f-8043-64371ab1dcb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1be34eff-2459-432f-8043-64371ab1dcb2\") pod \"mariadb-copy-data\" (UID: \"a7d9dfae-05d0-49c7-85d5-360812e2abb5\") " pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.486910 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 05 17:46:09 crc kubenswrapper[4756]: I1205 17:46:09.935803 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 05 17:46:09 crc kubenswrapper[4756]: W1205 17:46:09.945956 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d9dfae_05d0_49c7_85d5_360812e2abb5.slice/crio-f6a3ab852f6470ae750a7608984dae2dbd8587cbd1f1e9cefd2a72b60079c9e2 WatchSource:0}: Error finding container f6a3ab852f6470ae750a7608984dae2dbd8587cbd1f1e9cefd2a72b60079c9e2: Status 404 returned error can't find the container with id f6a3ab852f6470ae750a7608984dae2dbd8587cbd1f1e9cefd2a72b60079c9e2 Dec 05 17:46:10 crc kubenswrapper[4756]: I1205 17:46:10.055858 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:46:10 crc kubenswrapper[4756]: E1205 17:46:10.056199 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:46:10 crc kubenswrapper[4756]: I1205 17:46:10.347538 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a7d9dfae-05d0-49c7-85d5-360812e2abb5","Type":"ContainerStarted","Data":"12bdaefb9881d933b8044829a25a50122a3b3f8067aaabdedc65a4517cb9b33e"} Dec 05 17:46:10 crc kubenswrapper[4756]: I1205 17:46:10.347726 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a7d9dfae-05d0-49c7-85d5-360812e2abb5","Type":"ContainerStarted","Data":"f6a3ab852f6470ae750a7608984dae2dbd8587cbd1f1e9cefd2a72b60079c9e2"} Dec 05 17:46:10 crc kubenswrapper[4756]: I1205 
Dec 05 17:46:10 crc kubenswrapper[4756]: I1205 17:46:10.378750 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.37872524 podStartE2EDuration="2.37872524s" podCreationTimestamp="2025-12-05 17:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:46:10.370316667 +0000 UTC m=+5430.608990199" watchObservedRunningTime="2025-12-05 17:46:10.37872524 +0000 UTC m=+5430.617398782"
Dec 05 17:46:13 crc kubenswrapper[4756]: I1205 17:46:13.588089 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:13 crc kubenswrapper[4756]: I1205 17:46:13.591203 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 17:46:13 crc kubenswrapper[4756]: I1205 17:46:13.601321 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:13 crc kubenswrapper[4756]: I1205 17:46:13.729544 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9859f\" (UniqueName: \"kubernetes.io/projected/70548e47-42ec-4f52-9f1d-fe88d7a1763e-kube-api-access-9859f\") pod \"mariadb-client\" (UID: \"70548e47-42ec-4f52-9f1d-fe88d7a1763e\") " pod="openstack/mariadb-client"
Dec 05 17:46:13 crc kubenswrapper[4756]: I1205 17:46:13.831960 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9859f\" (UniqueName: \"kubernetes.io/projected/70548e47-42ec-4f52-9f1d-fe88d7a1763e-kube-api-access-9859f\") pod \"mariadb-client\" (UID: \"70548e47-42ec-4f52-9f1d-fe88d7a1763e\") " pod="openstack/mariadb-client"
Dec 05 17:46:13 crc kubenswrapper[4756]: I1205 17:46:13.856955 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9859f\" (UniqueName: \"kubernetes.io/projected/70548e47-42ec-4f52-9f1d-fe88d7a1763e-kube-api-access-9859f\") pod \"mariadb-client\" (UID: \"70548e47-42ec-4f52-9f1d-fe88d7a1763e\") " pod="openstack/mariadb-client"
Dec 05 17:46:13 crc kubenswrapper[4756]: I1205 17:46:13.925295 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 17:46:14 crc kubenswrapper[4756]: W1205 17:46:14.238617 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70548e47_42ec_4f52_9f1d_fe88d7a1763e.slice/crio-85f904b5b28540554212af2811d377aacb6ddf577ac6613012b22a5a39db0441 WatchSource:0}: Error finding container 85f904b5b28540554212af2811d377aacb6ddf577ac6613012b22a5a39db0441: Status 404 returned error can't find the container with id 85f904b5b28540554212af2811d377aacb6ddf577ac6613012b22a5a39db0441
Dec 05 17:46:14 crc kubenswrapper[4756]: I1205 17:46:14.240081 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:14 crc kubenswrapper[4756]: I1205 17:46:14.386016 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"70548e47-42ec-4f52-9f1d-fe88d7a1763e","Type":"ContainerStarted","Data":"85f904b5b28540554212af2811d377aacb6ddf577ac6613012b22a5a39db0441"}
Dec 05 17:46:15 crc kubenswrapper[4756]: I1205 17:46:15.396374 4756 generic.go:334] "Generic (PLEG): container finished" podID="70548e47-42ec-4f52-9f1d-fe88d7a1763e" containerID="432927b238449feb6c4705064a876afad9a5c0f8dcb13ad2a84b5a633b257239" exitCode=0
Dec 05 17:46:15 crc kubenswrapper[4756]: I1205 17:46:15.396456 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"70548e47-42ec-4f52-9f1d-fe88d7a1763e","Type":"ContainerDied","Data":"432927b238449feb6c4705064a876afad9a5c0f8dcb13ad2a84b5a633b257239"}
Dec 05 17:46:16 crc kubenswrapper[4756]: I1205 17:46:16.830974 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 17:46:16 crc kubenswrapper[4756]: I1205 17:46:16.858680 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_70548e47-42ec-4f52-9f1d-fe88d7a1763e/mariadb-client/0.log"
Dec 05 17:46:16 crc kubenswrapper[4756]: I1205 17:46:16.888449 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:16 crc kubenswrapper[4756]: I1205 17:46:16.888720 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9859f\" (UniqueName: \"kubernetes.io/projected/70548e47-42ec-4f52-9f1d-fe88d7a1763e-kube-api-access-9859f\") pod \"70548e47-42ec-4f52-9f1d-fe88d7a1763e\" (UID: \"70548e47-42ec-4f52-9f1d-fe88d7a1763e\") "
Dec 05 17:46:16 crc kubenswrapper[4756]: I1205 17:46:16.897889 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70548e47-42ec-4f52-9f1d-fe88d7a1763e-kube-api-access-9859f" (OuterVolumeSpecName: "kube-api-access-9859f") pod "70548e47-42ec-4f52-9f1d-fe88d7a1763e" (UID: "70548e47-42ec-4f52-9f1d-fe88d7a1763e"). InnerVolumeSpecName "kube-api-access-9859f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:46:16 crc kubenswrapper[4756]: I1205 17:46:16.901512 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:16 crc kubenswrapper[4756]: I1205 17:46:16.990973 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9859f\" (UniqueName: \"kubernetes.io/projected/70548e47-42ec-4f52-9f1d-fe88d7a1763e-kube-api-access-9859f\") on node \"crc\" DevicePath \"\""
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.068794 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:17 crc kubenswrapper[4756]: E1205 17:46:17.069265 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70548e47-42ec-4f52-9f1d-fe88d7a1763e" containerName="mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.069296 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="70548e47-42ec-4f52-9f1d-fe88d7a1763e" containerName="mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.069620 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="70548e47-42ec-4f52-9f1d-fe88d7a1763e" containerName="mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.070480 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.087909 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.195117 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cqb4\" (UniqueName: \"kubernetes.io/projected/34dcc531-58f7-41c4-bba5-61a0027f4d52-kube-api-access-7cqb4\") pod \"mariadb-client\" (UID: \"34dcc531-58f7-41c4-bba5-61a0027f4d52\") " pod="openstack/mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.297838 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cqb4\" (UniqueName: \"kubernetes.io/projected/34dcc531-58f7-41c4-bba5-61a0027f4d52-kube-api-access-7cqb4\") pod \"mariadb-client\" (UID: \"34dcc531-58f7-41c4-bba5-61a0027f4d52\") " pod="openstack/mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.345266 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cqb4\" (UniqueName: \"kubernetes.io/projected/34dcc531-58f7-41c4-bba5-61a0027f4d52-kube-api-access-7cqb4\") pod \"mariadb-client\" (UID: \"34dcc531-58f7-41c4-bba5-61a0027f4d52\") " pod="openstack/mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.412674 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.428686 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f904b5b28540554212af2811d377aacb6ddf577ac6613012b22a5a39db0441"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.428754 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.495018 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="70548e47-42ec-4f52-9f1d-fe88d7a1763e" podUID="34dcc531-58f7-41c4-bba5-61a0027f4d52"
Dec 05 17:46:17 crc kubenswrapper[4756]: I1205 17:46:17.928990 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:18 crc kubenswrapper[4756]: I1205 17:46:18.066870 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70548e47-42ec-4f52-9f1d-fe88d7a1763e" path="/var/lib/kubelet/pods/70548e47-42ec-4f52-9f1d-fe88d7a1763e/volumes"
Dec 05 17:46:18 crc kubenswrapper[4756]: I1205 17:46:18.442404 4756 generic.go:334] "Generic (PLEG): container finished" podID="34dcc531-58f7-41c4-bba5-61a0027f4d52" containerID="7c2b4348a438cd261d06337ee666610b473f9a5f2a1af197e1448d1e1c02c730" exitCode=0
Dec 05 17:46:18 crc kubenswrapper[4756]: I1205 17:46:18.442535 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"34dcc531-58f7-41c4-bba5-61a0027f4d52","Type":"ContainerDied","Data":"7c2b4348a438cd261d06337ee666610b473f9a5f2a1af197e1448d1e1c02c730"}
Dec 05 17:46:18 crc kubenswrapper[4756]: I1205 17:46:18.442580 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"34dcc531-58f7-41c4-bba5-61a0027f4d52","Type":"ContainerStarted","Data":"75dd5bc703c9395f8930e7170ad097a8525d26ce82df49652ccd06a92d99864c"}
Dec 05 17:46:19 crc kubenswrapper[4756]: I1205 17:46:19.813720 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 05 17:46:19 crc kubenswrapper[4756]: I1205 17:46:19.831390 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_34dcc531-58f7-41c4-bba5-61a0027f4d52/mariadb-client/0.log"
Dec 05 17:46:19 crc kubenswrapper[4756]: I1205 17:46:19.860034 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:19 crc kubenswrapper[4756]: I1205 17:46:19.864953 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Dec 05 17:46:19 crc kubenswrapper[4756]: I1205 17:46:19.949325 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cqb4\" (UniqueName: \"kubernetes.io/projected/34dcc531-58f7-41c4-bba5-61a0027f4d52-kube-api-access-7cqb4\") pod \"34dcc531-58f7-41c4-bba5-61a0027f4d52\" (UID: \"34dcc531-58f7-41c4-bba5-61a0027f4d52\") "
Dec 05 17:46:19 crc kubenswrapper[4756]: I1205 17:46:19.955805 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34dcc531-58f7-41c4-bba5-61a0027f4d52-kube-api-access-7cqb4" (OuterVolumeSpecName: "kube-api-access-7cqb4") pod "34dcc531-58f7-41c4-bba5-61a0027f4d52" (UID: "34dcc531-58f7-41c4-bba5-61a0027f4d52"). InnerVolumeSpecName "kube-api-access-7cqb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:46:20 crc kubenswrapper[4756]: I1205 17:46:20.050780 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cqb4\" (UniqueName: \"kubernetes.io/projected/34dcc531-58f7-41c4-bba5-61a0027f4d52-kube-api-access-7cqb4\") on node \"crc\" DevicePath \"\"" Dec 05 17:46:20 crc kubenswrapper[4756]: I1205 17:46:20.058278 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34dcc531-58f7-41c4-bba5-61a0027f4d52" path="/var/lib/kubelet/pods/34dcc531-58f7-41c4-bba5-61a0027f4d52/volumes" Dec 05 17:46:20 crc kubenswrapper[4756]: I1205 17:46:20.461471 4756 scope.go:117] "RemoveContainer" containerID="7c2b4348a438cd261d06337ee666610b473f9a5f2a1af197e1448d1e1c02c730" Dec 05 17:46:20 crc kubenswrapper[4756]: I1205 17:46:20.461881 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 05 17:46:24 crc kubenswrapper[4756]: I1205 17:46:24.050294 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:46:24 crc kubenswrapper[4756]: I1205 17:46:24.519929 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"6b5a9908bca7e41a64a674dbf6cbb63be6b207127fbfed3e63c4b2167cea9c0c"} Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.950002 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 17:46:51 crc kubenswrapper[4756]: E1205 17:46:51.951042 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dcc531-58f7-41c4-bba5-61a0027f4d52" containerName="mariadb-client" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.951059 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dcc531-58f7-41c4-bba5-61a0027f4d52" containerName="mariadb-client" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.951326 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="34dcc531-58f7-41c4-bba5-61a0027f4d52" containerName="mariadb-client" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.952783 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.955223 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.956032 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bw49g" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.956408 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.956630 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.958609 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 17:46:51 crc kubenswrapper[4756]: I1205 17:46:51.969493 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.004218 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.005901 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.011620 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.012970 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.018407 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.024634 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.062039 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.062083 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmf7t\" (UniqueName: \"kubernetes.io/projected/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-kube-api-access-hmf7t\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.062107 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.062181 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-470c903c-1da7-4a35-b0ca-b1142471e248\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-470c903c-1da7-4a35-b0ca-b1142471e248\") pod \"ovsdbserver-nb-0\" (UID: 
\"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.062211 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.062227 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.062241 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-config\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.062277 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164504 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ef50d87b-3afa-45e1-a575-5f50ca06943b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ef50d87b-3afa-45e1-a575-5f50ca06943b\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164618 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164647 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164671 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-config\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164707 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: 
I1205 17:46:52.164728 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164759 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f55cd61d-b3a2-456f-8b79-0d207405dd13-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164790 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164813 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164834 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk6kd\" (UniqueName: \"kubernetes.io/projected/f55cd61d-b3a2-456f-8b79-0d207405dd13-kube-api-access-fk6kd\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164855 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164876 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81682d31-ae36-4120-ab44-7f69483622d1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164902 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0adc65c-116c-4240-ab36-407e218be107\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0adc65c-116c-4240-ab36-407e218be107\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164931 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164955 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hmf7t\" (UniqueName: \"kubernetes.io/projected/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-kube-api-access-hmf7t\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.164982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81682d31-ae36-4120-ab44-7f69483622d1-config\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165011 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165032 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c68xz\" (UniqueName: \"kubernetes.io/projected/81682d31-ae36-4120-ab44-7f69483622d1-kube-api-access-c68xz\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f55cd61d-b3a2-456f-8b79-0d207405dd13-config\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165199 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165373 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f55cd61d-b3a2-456f-8b79-0d207405dd13-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165469 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165571 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81682d31-ae36-4120-ab44-7f69483622d1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165788 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-470c903c-1da7-4a35-b0ca-b1142471e248\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-470c903c-1da7-4a35-b0ca-b1142471e248\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.165841 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.166055 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-config\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.167460 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.173163 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.173221 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-470c903c-1da7-4a35-b0ca-b1142471e248\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-470c903c-1da7-4a35-b0ca-b1142471e248\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/497e4eab5d31e44263821816b16e8c6e4218d94ee2972c860fa00f7619eb5ab7/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.173255 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.177668 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.180145 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.186210 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmf7t\" (UniqueName: \"kubernetes.io/projected/2dfcfe1d-4220-400b-83fd-f35c88cf6fa0-kube-api-access-hmf7t\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc 
kubenswrapper[4756]: I1205 17:46:52.223267 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-470c903c-1da7-4a35-b0ca-b1142471e248\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-470c903c-1da7-4a35-b0ca-b1142471e248\") pod \"ovsdbserver-nb-0\" (UID: \"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0\") " pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.267628 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f55cd61d-b3a2-456f-8b79-0d207405dd13-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.267877 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.267999 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk6kd\" (UniqueName: \"kubernetes.io/projected/f55cd61d-b3a2-456f-8b79-0d207405dd13-kube-api-access-fk6kd\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.268079 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81682d31-ae36-4120-ab44-7f69483622d1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.268160 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0adc65c-116c-4240-ab36-407e218be107\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0adc65c-116c-4240-ab36-407e218be107\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.268262 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.268380 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81682d31-ae36-4120-ab44-7f69483622d1-config\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269136 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81682d31-ae36-4120-ab44-7f69483622d1-config\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269301 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c68xz\" (UniqueName: 
\"kubernetes.io/projected/81682d31-ae36-4120-ab44-7f69483622d1-kube-api-access-c68xz\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269403 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f55cd61d-b3a2-456f-8b79-0d207405dd13-config\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269541 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f55cd61d-b3a2-456f-8b79-0d207405dd13-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269619 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269704 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81682d31-ae36-4120-ab44-7f69483622d1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269787 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269860 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ef50d87b-3afa-45e1-a575-5f50ca06943b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ef50d87b-3afa-45e1-a575-5f50ca06943b\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269967 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.270032 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.269348 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81682d31-ae36-4120-ab44-7f69483622d1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: 
I1205 17:46:52.270638 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f55cd61d-b3a2-456f-8b79-0d207405dd13-config\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.270984 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81682d31-ae36-4120-ab44-7f69483622d1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.271004 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f55cd61d-b3a2-456f-8b79-0d207405dd13-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.268293 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f55cd61d-b3a2-456f-8b79-0d207405dd13-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.273108 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.273645 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.273677 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ef50d87b-3afa-45e1-a575-5f50ca06943b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ef50d87b-3afa-45e1-a575-5f50ca06943b\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/732b011e7bfe61735652518fb31b0a8b34aaf9ecf45a2d43cf023bf21102d77a/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.273757 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.274576 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.274642 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0adc65c-116c-4240-ab36-407e218be107\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0adc65c-116c-4240-ab36-407e218be107\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4e0ae25df6443eb39856876f5ec62b9c0c049f9e87d816b484f34f848c2391d3/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.276723 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.276834 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81682d31-ae36-4120-ab44-7f69483622d1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.279057 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.279502 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f55cd61d-b3a2-456f-8b79-0d207405dd13-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.289830 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.291167 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk6kd\" (UniqueName: \"kubernetes.io/projected/f55cd61d-b3a2-456f-8b79-0d207405dd13-kube-api-access-fk6kd\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.310873 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c68xz\" (UniqueName: \"kubernetes.io/projected/81682d31-ae36-4120-ab44-7f69483622d1-kube-api-access-c68xz\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.323929 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ef50d87b-3afa-45e1-a575-5f50ca06943b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ef50d87b-3afa-45e1-a575-5f50ca06943b\") pod \"ovsdbserver-nb-2\" (UID: \"f55cd61d-b3a2-456f-8b79-0d207405dd13\") " pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.336122 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0adc65c-116c-4240-ab36-407e218be107\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0adc65c-116c-4240-ab36-407e218be107\") pod \"ovsdbserver-nb-1\" (UID: \"81682d31-ae36-4120-ab44-7f69483622d1\") " pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.338007 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.626310 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.696864 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.806007 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f55cd61d-b3a2-456f-8b79-0d207405dd13","Type":"ContainerStarted","Data":"445175fa4254472d0be97cfcd52acf35a8fe1f4d97d8dcc1613ad9964b4b1514"} Dec 05 17:46:52 crc kubenswrapper[4756]: W1205 17:46:52.838272 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dfcfe1d_4220_400b_83fd_f35c88cf6fa0.slice/crio-77fb9e3817c2f8ee28ed592d921c27480497430b405be8c2ca4071274f869645 WatchSource:0}: Error finding container 77fb9e3817c2f8ee28ed592d921c27480497430b405be8c2ca4071274f869645: Status 404 returned error can't find the container with id 77fb9e3817c2f8ee28ed592d921c27480497430b405be8c2ca4071274f869645 Dec 05 17:46:52 crc kubenswrapper[4756]: I1205 17:46:52.846083 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.212352 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 05 17:46:53 crc kubenswrapper[4756]: W1205 17:46:53.225702 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81682d31_ae36_4120_ab44_7f69483622d1.slice/crio-7932cdaa26f2ff07b5cb33b6af5067ded1fef08d21a99cd6eda554a36944088c WatchSource:0}: Error finding container 7932cdaa26f2ff07b5cb33b6af5067ded1fef08d21a99cd6eda554a36944088c: Status 404 returned error can't find the container with id 7932cdaa26f2ff07b5cb33b6af5067ded1fef08d21a99cd6eda554a36944088c Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.767605 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lt2qs"] Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.769173 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.796845 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lt2qs"] Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.821098 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"81682d31-ae36-4120-ab44-7f69483622d1","Type":"ContainerStarted","Data":"2b6385a3f2d25efafedc6243b13bd7e78da12e44ea61cf2837e19059a225e1db"} Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.821166 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"81682d31-ae36-4120-ab44-7f69483622d1","Type":"ContainerStarted","Data":"831f45be1c053280b94c91242d13fed503897d203be3f5aca2f1dc6bc9b0becc"} Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.821188 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"81682d31-ae36-4120-ab44-7f69483622d1","Type":"ContainerStarted","Data":"7932cdaa26f2ff07b5cb33b6af5067ded1fef08d21a99cd6eda554a36944088c"} Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.834545 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0","Type":"ContainerStarted","Data":"693aee27a4a367b08a0e7ec6a5024ba83f01f836a3d536d682203e15055f1330"} Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.834601 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0","Type":"ContainerStarted","Data":"51e4cff1fbde50ef507a433608b880f2f94d9a21e0ce53581c6571dfc954ac94"} Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.834627 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2dfcfe1d-4220-400b-83fd-f35c88cf6fa0","Type":"ContainerStarted","Data":"77fb9e3817c2f8ee28ed592d921c27480497430b405be8c2ca4071274f869645"} Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.841832 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f55cd61d-b3a2-456f-8b79-0d207405dd13","Type":"ContainerStarted","Data":"cae64db9e1c0539b13f299feb764c2d48665cdd76c2f3616d836c8bf365eb2ac"} Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.841907 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f55cd61d-b3a2-456f-8b79-0d207405dd13","Type":"ContainerStarted","Data":"e7e37b1c8c907697adfb3082235ec006282202165a019802a88a21edfcb099ae"} Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.865980 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.865953856 podStartE2EDuration="3.865953856s" podCreationTimestamp="2025-12-05 17:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:46:53.859695959 +0000 UTC m=+5474.098369481" watchObservedRunningTime="2025-12-05 17:46:53.865953856 +0000 UTC m=+5474.104627388" Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.889060 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.889043439 podStartE2EDuration="3.889043439s" podCreationTimestamp="2025-12-05 17:46:50 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:46:53.886660155 +0000 UTC m=+5474.125333667" watchObservedRunningTime="2025-12-05 17:46:53.889043439 +0000 UTC m=+5474.127716951" Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.897832 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-utilities\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.898041 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw5j7\" (UniqueName: \"kubernetes.io/projected/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-kube-api-access-vw5j7\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.898387 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-catalog-content\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:53 crc kubenswrapper[4756]: I1205 17:46:53.908782 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.908752792 podStartE2EDuration="3.908752792s" podCreationTimestamp="2025-12-05 17:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:46:53.905868605 +0000 UTC m=+5474.144542127" watchObservedRunningTime="2025-12-05 17:46:53.908752792 +0000 UTC m=+5474.147426304" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.001215 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-catalog-content\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.001463 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-utilities\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.001490 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw5j7\" (UniqueName: \"kubernetes.io/projected/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-kube-api-access-vw5j7\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.003027 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-utilities\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") 
" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.003353 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-catalog-content\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.022124 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw5j7\" (UniqueName: \"kubernetes.io/projected/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-kube-api-access-vw5j7\") pod \"certified-operators-lt2qs\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.106442 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.106897 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.108451 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.111832 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.111998 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.116600 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qwpt6" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.123660 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.124803 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.126486 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.146775 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.148098 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.158488 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.166738 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.174042 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220261 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220327 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39409198-58b6-4300-a65c-911eb49c7455-config\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220369 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220436 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220474 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554b5c45-8327-4229-8c3e-294407a96bed-config\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220519 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr47f\" (UniqueName: \"kubernetes.io/projected/b7dfdb26-c3f2-4176-a880-f526a052983c-kube-api-access-gr47f\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220565 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39409198-58b6-4300-a65c-911eb49c7455-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220602 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: 
\"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220635 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7dfdb26-c3f2-4176-a880-f526a052983c-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220664 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220700 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-74c0a21c-6d6b-4681-8b7a-f700fd7ccdb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74c0a21c-6d6b-4681-8b7a-f700fd7ccdb8\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220744 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m5rn\" (UniqueName: \"kubernetes.io/projected/554b5c45-8327-4229-8c3e-294407a96bed-kube-api-access-9m5rn\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220792 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220834 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220892 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3bc5369f-e89b-4169-858f-3f1ec5fe7379\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3bc5369f-e89b-4169-858f-3f1ec5fe7379\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220933 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220964 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7dfdb26-c3f2-4176-a880-f526a052983c-config\") pod 
\"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.220997 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.221029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/39409198-58b6-4300-a65c-911eb49c7455-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.221066 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/554b5c45-8327-4229-8c3e-294407a96bed-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.221097 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/554b5c45-8327-4229-8c3e-294407a96bed-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.221170 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tzpz\" (UniqueName: \"kubernetes.io/projected/39409198-58b6-4300-a65c-911eb49c7455-kube-api-access-8tzpz\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.221214 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7dfdb26-c3f2-4176-a880-f526a052983c-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.221286 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-528a7285-f2e1-4db2-962d-a9c9396efed9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528a7285-f2e1-4db2-962d-a9c9396efed9\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323029 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m5rn\" (UniqueName: \"kubernetes.io/projected/554b5c45-8327-4229-8c3e-294407a96bed-kube-api-access-9m5rn\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323079 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " 
pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323101 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323134 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3bc5369f-e89b-4169-858f-3f1ec5fe7379\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3bc5369f-e89b-4169-858f-3f1ec5fe7379\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323155 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323173 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7dfdb26-c3f2-4176-a880-f526a052983c-config\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323192 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323216 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/39409198-58b6-4300-a65c-911eb49c7455-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323240 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/554b5c45-8327-4229-8c3e-294407a96bed-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323255 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/554b5c45-8327-4229-8c3e-294407a96bed-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323273 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tzpz\" (UniqueName: \"kubernetes.io/projected/39409198-58b6-4300-a65c-911eb49c7455-kube-api-access-8tzpz\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323289 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b7dfdb26-c3f2-4176-a880-f526a052983c-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323317 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-528a7285-f2e1-4db2-962d-a9c9396efed9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528a7285-f2e1-4db2-962d-a9c9396efed9\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323343 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323359 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39409198-58b6-4300-a65c-911eb49c7455-config\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323373 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323393 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323409 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554b5c45-8327-4229-8c3e-294407a96bed-config\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323445 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr47f\" (UniqueName: \"kubernetes.io/projected/b7dfdb26-c3f2-4176-a880-f526a052983c-kube-api-access-gr47f\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323468 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39409198-58b6-4300-a65c-911eb49c7455-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323487 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc 
kubenswrapper[4756]: I1205 17:46:54.323504 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7dfdb26-c3f2-4176-a880-f526a052983c-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.323536 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-74c0a21c-6d6b-4681-8b7a-f700fd7ccdb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74c0a21c-6d6b-4681-8b7a-f700fd7ccdb8\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.325498 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/39409198-58b6-4300-a65c-911eb49c7455-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.326497 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39409198-58b6-4300-a65c-911eb49c7455-config\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.326894 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39409198-58b6-4300-a65c-911eb49c7455-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.327870 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/554b5c45-8327-4229-8c3e-294407a96bed-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.328368 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554b5c45-8327-4229-8c3e-294407a96bed-config\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.329639 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/554b5c45-8327-4229-8c3e-294407a96bed-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.330642 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7dfdb26-c3f2-4176-a880-f526a052983c-config\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc 
kubenswrapper[4756]: I1205 17:46:54.330783 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.330866 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7dfdb26-c3f2-4176-a880-f526a052983c-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.331261 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.331319 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.331687 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7dfdb26-c3f2-4176-a880-f526a052983c-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.331815 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.331843 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3bc5369f-e89b-4169-858f-3f1ec5fe7379\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3bc5369f-e89b-4169-858f-3f1ec5fe7379\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e8a61b836a2e81e44f2110a55ec65cf586e450738cf9e3d2dc28ff4e6c0b237e/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.331909 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.331931 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-528a7285-f2e1-4db2-962d-a9c9396efed9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528a7285-f2e1-4db2-962d-a9c9396efed9\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a562ed317c27d0c30bf7e2b1d098478142d47bd2522a73bbf35bf312ec7e81e1/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.333284 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.333309 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-74c0a21c-6d6b-4681-8b7a-f700fd7ccdb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74c0a21c-6d6b-4681-8b7a-f700fd7ccdb8\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8ea861d0b1dd8b77ffe1704fd1b7d4623fade9143742314a28ad8cad86058204/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.339034 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.339119 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.339171 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7dfdb26-c3f2-4176-a880-f526a052983c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.339354 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/39409198-58b6-4300-a65c-911eb49c7455-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.339397 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.351227 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tzpz\" (UniqueName: \"kubernetes.io/projected/39409198-58b6-4300-a65c-911eb49c7455-kube-api-access-8tzpz\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.351396 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m5rn\" (UniqueName: \"kubernetes.io/projected/554b5c45-8327-4229-8c3e-294407a96bed-kube-api-access-9m5rn\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.356063 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr47f\" (UniqueName: \"kubernetes.io/projected/b7dfdb26-c3f2-4176-a880-f526a052983c-kube-api-access-gr47f\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.365229 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/554b5c45-8327-4229-8c3e-294407a96bed-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.433764 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-528a7285-f2e1-4db2-962d-a9c9396efed9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528a7285-f2e1-4db2-962d-a9c9396efed9\") pod \"ovsdbserver-sb-0\" (UID: \"554b5c45-8327-4229-8c3e-294407a96bed\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.442228 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-74c0a21c-6d6b-4681-8b7a-f700fd7ccdb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74c0a21c-6d6b-4681-8b7a-f700fd7ccdb8\") pod \"ovsdbserver-sb-2\" (UID: \"b7dfdb26-c3f2-4176-a880-f526a052983c\") " pod="openstack/ovsdbserver-sb-2"
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.468590 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3bc5369f-e89b-4169-858f-3f1ec5fe7379\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3bc5369f-e89b-4169-858f-3f1ec5fe7379\") pod \"ovsdbserver-sb-1\" (UID: \"39409198-58b6-4300-a65c-911eb49c7455\") " pod="openstack/ovsdbserver-sb-1"
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.538469 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.649997 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lt2qs"]
Dec 05 17:46:54 crc kubenswrapper[4756]: W1205 17:46:54.652260 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf19a770b_26e8_4073_ac04_4cbfd91aa9a1.slice/crio-90c9b559550bae8fa05d67dca827fa7a1f54ef137de771e49be42a6aac27975b WatchSource:0}: Error finding container 90c9b559550bae8fa05d67dca827fa7a1f54ef137de771e49be42a6aac27975b: Status 404 returned error can't find the container with id 90c9b559550bae8fa05d67dca827fa7a1f54ef137de771e49be42a6aac27975b
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.725862 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.758245 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.860320 4756 generic.go:334] "Generic (PLEG): container finished" podID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerID="61db25834d47f849d8d0d128c95accee651df8e62a5d649aa42ad9d447edd1e6" exitCode=0 Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.860398 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt2qs" event={"ID":"f19a770b-26e8-4073-ac04-4cbfd91aa9a1","Type":"ContainerDied","Data":"61db25834d47f849d8d0d128c95accee651df8e62a5d649aa42ad9d447edd1e6"} Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.860685 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt2qs" event={"ID":"f19a770b-26e8-4073-ac04-4cbfd91aa9a1","Type":"ContainerStarted","Data":"90c9b559550bae8fa05d67dca827fa7a1f54ef137de771e49be42a6aac27975b"} Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.863990 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 17:46:54 crc kubenswrapper[4756]: I1205 17:46:54.867648 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 05 17:46:54 crc kubenswrapper[4756]: W1205 17:46:54.890820 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39409198_58b6_4300_a65c_911eb49c7455.slice/crio-f1096a0310299d356e9d47115e35ae79a8d46dd75798581511149591caa72746 WatchSource:0}: Error finding container f1096a0310299d356e9d47115e35ae79a8d46dd75798581511149591caa72746: Status 404 returned error can't find the container with id f1096a0310299d356e9d47115e35ae79a8d46dd75798581511149591caa72746 Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.273230 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 17:46:55 crc kubenswrapper[4756]: W1205 17:46:55.279144 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod554b5c45_8327_4229_8c3e_294407a96bed.slice/crio-92ce283928fd829f21516b6f8b89e56114c00fa062fe49904a7d83ef536d1bf8 WatchSource:0}: Error finding container 92ce283928fd829f21516b6f8b89e56114c00fa062fe49904a7d83ef536d1bf8: Status 404 returned error can't find the container with id 92ce283928fd829f21516b6f8b89e56114c00fa062fe49904a7d83ef536d1bf8 Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.290254 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.339103 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.391141 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.404381 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 05 17:46:55 crc kubenswrapper[4756]: W1205 17:46:55.405228 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7dfdb26_c3f2_4176_a880_f526a052983c.slice/crio-73b8e4a4b0affd9074c4edb50363709073012496406a02f31b8d41ab18406f36 WatchSource:0}: Error finding container 
73b8e4a4b0affd9074c4edb50363709073012496406a02f31b8d41ab18406f36: Status 404 returned error can't find the container with id 73b8e4a4b0affd9074c4edb50363709073012496406a02f31b8d41ab18406f36 Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.627534 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.870025 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"554b5c45-8327-4229-8c3e-294407a96bed","Type":"ContainerStarted","Data":"fa825af60ac19ca32af921222b57de8f67e02a43f9bd5a098aba74e831a063e1"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.870076 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"554b5c45-8327-4229-8c3e-294407a96bed","Type":"ContainerStarted","Data":"bd723b19c39ecff9d833089257ee1cec3c9c2702d483593f8ea8be66649a8b05"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.870091 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"554b5c45-8327-4229-8c3e-294407a96bed","Type":"ContainerStarted","Data":"92ce283928fd829f21516b6f8b89e56114c00fa062fe49904a7d83ef536d1bf8"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.872215 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b7dfdb26-c3f2-4176-a880-f526a052983c","Type":"ContainerStarted","Data":"72f32d69331edf5a328a1ad40b79551b0429dfafc46717a59e5fe022ce75c2f2"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.872300 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b7dfdb26-c3f2-4176-a880-f526a052983c","Type":"ContainerStarted","Data":"7033079a09d15896f990af7aebf8f8e5fadcc868bd756e62ce6928211156572e"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.872311 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b7dfdb26-c3f2-4176-a880-f526a052983c","Type":"ContainerStarted","Data":"73b8e4a4b0affd9074c4edb50363709073012496406a02f31b8d41ab18406f36"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.874255 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"39409198-58b6-4300-a65c-911eb49c7455","Type":"ContainerStarted","Data":"042ec99a33c5a122111e6e25cc88684defd425b8d6e85e609265879bfb6db12f"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.874298 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"39409198-58b6-4300-a65c-911eb49c7455","Type":"ContainerStarted","Data":"14ca1b7aa70dd027820614dee1d2721ed5036f55d5db844d82f5fe6898c1cf24"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.874308 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"39409198-58b6-4300-a65c-911eb49c7455","Type":"ContainerStarted","Data":"f1096a0310299d356e9d47115e35ae79a8d46dd75798581511149591caa72746"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.876368 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt2qs" event={"ID":"f19a770b-26e8-4073-ac04-4cbfd91aa9a1","Type":"ContainerStarted","Data":"b2ceaf3375bf95ee89e75f3462ebf109f401fbb97454e58f41df50c4d8f9a8d0"} Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.877035 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.896260 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=2.896240187 podStartE2EDuration="2.896240187s" podCreationTimestamp="2025-12-05 17:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:46:55.89143181 +0000 UTC m=+5476.130105322" watchObservedRunningTime="2025-12-05 17:46:55.896240187 +0000 UTC m=+5476.134913709" Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.915226 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=2.915202791 podStartE2EDuration="2.915202791s" podCreationTimestamp="2025-12-05 17:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:46:55.909073398 +0000 UTC m=+5476.147746960" watchObservedRunningTime="2025-12-05 17:46:55.915202791 +0000 UTC m=+5476.153876333" Dec 05 17:46:55 crc kubenswrapper[4756]: I1205 17:46:55.938068 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=2.938048647 podStartE2EDuration="2.938048647s" podCreationTimestamp="2025-12-05 17:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:46:55.928479523 +0000 UTC m=+5476.167153075" watchObservedRunningTime="2025-12-05 17:46:55.938048647 +0000 UTC m=+5476.176722169" Dec 05 17:46:56 crc kubenswrapper[4756]: I1205 17:46:56.891964 4756 generic.go:334] "Generic (PLEG): container finished" podID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerID="b2ceaf3375bf95ee89e75f3462ebf109f401fbb97454e58f41df50c4d8f9a8d0" exitCode=0 Dec 05 17:46:56 crc kubenswrapper[4756]: I1205 17:46:56.892021 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt2qs" event={"ID":"f19a770b-26e8-4073-ac04-4cbfd91aa9a1","Type":"ContainerDied","Data":"b2ceaf3375bf95ee89e75f3462ebf109f401fbb97454e58f41df50c4d8f9a8d0"} Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.290250 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.404849 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.539225 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.643643 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.721519 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f76bf6bf9-ks5t4"] Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.726281 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.726809 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.731900 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.745193 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f76bf6bf9-ks5t4"] Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.759780 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.799074 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwt6p\" (UniqueName: \"kubernetes.io/projected/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-kube-api-access-kwt6p\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.799151 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-ovsdbserver-nb\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.799469 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-dns-svc\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.799530 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-config\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.900520 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-dns-svc\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.900573 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-config\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.900650 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwt6p\" (UniqueName: \"kubernetes.io/projected/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-kube-api-access-kwt6p\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 
17:46:57.900712 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-ovsdbserver-nb\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.901454 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt2qs" event={"ID":"f19a770b-26e8-4073-ac04-4cbfd91aa9a1","Type":"ContainerStarted","Data":"fa500bf70bff7d23c7493f7e17c9767fca1984aade720d0c3aa8974df582722f"} Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.902968 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-dns-svc\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.903861 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-ovsdbserver-nb\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.903950 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-config\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.923932 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lt2qs" podStartSLOduration=2.378036599 podStartE2EDuration="4.923907921s" podCreationTimestamp="2025-12-05 17:46:53 +0000 UTC" firstStartedPulling="2025-12-05 17:46:54.863671893 +0000 UTC m=+5475.102345415" lastFinishedPulling="2025-12-05 17:46:57.409543195 +0000 UTC m=+5477.648216737" observedRunningTime="2025-12-05 17:46:57.919824232 +0000 UTC m=+5478.158497764" watchObservedRunningTime="2025-12-05 17:46:57.923907921 +0000 UTC m=+5478.162581453" Dec 05 17:46:57 crc kubenswrapper[4756]: I1205 17:46:57.925295 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwt6p\" (UniqueName: \"kubernetes.io/projected/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-kube-api-access-kwt6p\") pod \"dnsmasq-dns-6f76bf6bf9-ks5t4\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.045097 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.339811 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.388525 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.516400 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f76bf6bf9-ks5t4"] Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.668895 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.707249 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.911997 4756 generic.go:334] "Generic (PLEG): container finished" podID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" containerID="22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c" exitCode=0 Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.912084 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" event={"ID":"f0fa3a1e-72ed-435f-8f8c-6d719d260e88","Type":"ContainerDied","Data":"22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c"} Dec 05 17:46:58 crc kubenswrapper[4756]: I1205 17:46:58.912135 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" event={"ID":"f0fa3a1e-72ed-435f-8f8c-6d719d260e88","Type":"ContainerStarted","Data":"24ab3efe481e3c1f53e7cc62b3d14da06cc33f400410a4e04fb6d5b006b57986"} Dec 05 17:46:59 crc kubenswrapper[4756]: I1205 17:46:59.540110 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 05 17:46:59 crc kubenswrapper[4756]: I1205 17:46:59.727259 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 17:46:59 crc kubenswrapper[4756]: I1205 17:46:59.759161 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 05 17:46:59 crc kubenswrapper[4756]: I1205 17:46:59.931794 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" event={"ID":"f0fa3a1e-72ed-435f-8f8c-6d719d260e88","Type":"ContainerStarted","Data":"85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54"} Dec 05 17:46:59 crc kubenswrapper[4756]: I1205 17:46:59.933461 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:46:59 crc kubenswrapper[4756]: I1205 17:46:59.968684 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" podStartSLOduration=2.968658017 podStartE2EDuration="2.968658017s" podCreationTimestamp="2025-12-05 17:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:46:59.957658105 +0000 UTC m=+5480.196331657" watchObservedRunningTime="2025-12-05 17:46:59.968658017 +0000 UTC m=+5480.207331569" Dec 05 17:47:00 crc kubenswrapper[4756]: I1205 17:47:00.615003 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 05 17:47:00 crc 
Dec 05 17:47:00 crc kubenswrapper[4756]: I1205 17:47:00.686940 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Dec 05 17:47:00 crc kubenswrapper[4756]: I1205 17:47:00.807325 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 05 17:47:00 crc kubenswrapper[4756]: I1205 17:47:00.822352 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Dec 05 17:47:00 crc kubenswrapper[4756]: I1205 17:47:00.890794 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 05 17:47:00 crc kubenswrapper[4756]: I1205 17:47:00.910083 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.015428 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f76bf6bf9-ks5t4"]
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.037492 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b455b6c4f-vhj69"]
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.039181 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.043397 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.050099 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b455b6c4f-vhj69"]
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.193332 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-nb\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.193581 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-sb\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.193892 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-config\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.193961 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh5sm\" (UniqueName: \"kubernetes.io/projected/8a87cfcd-63b6-4ef6-841e-a17de8443742-kube-api-access-nh5sm\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.194042 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-dns-svc\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.296769 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-nb\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.296878 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-sb\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.296976 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-config\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.297016 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh5sm\" (UniqueName: \"kubernetes.io/projected/8a87cfcd-63b6-4ef6-841e-a17de8443742-kube-api-access-nh5sm\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.297048 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-dns-svc\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.298559 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-dns-svc\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.298698 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-config\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.298698 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-nb\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.299016 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-sb\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.327319 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh5sm\" (UniqueName: \"kubernetes.io/projected/8a87cfcd-63b6-4ef6-841e-a17de8443742-kube-api-access-nh5sm\") pod \"dnsmasq-dns-b455b6c4f-vhj69\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.359208 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69"
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.671403 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b455b6c4f-vhj69"]
Dec 05 17:47:01 crc kubenswrapper[4756]: W1205 17:47:01.682889 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a87cfcd_63b6_4ef6_841e_a17de8443742.slice/crio-5af504504790d00a121659e3354712bf874ce069c8f979e6fe67896c3b2bc583 WatchSource:0}: Error finding container 5af504504790d00a121659e3354712bf874ce069c8f979e6fe67896c3b2bc583: Status 404 returned error can't find the container with id 5af504504790d00a121659e3354712bf874ce069c8f979e6fe67896c3b2bc583
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.955286 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" podUID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" containerName="dnsmasq-dns" containerID="cri-o://85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54" gracePeriod=10
Dec 05 17:47:01 crc kubenswrapper[4756]: I1205 17:47:01.955721 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" event={"ID":"8a87cfcd-63b6-4ef6-841e-a17de8443742","Type":"ContainerStarted","Data":"5af504504790d00a121659e3354712bf874ce069c8f979e6fe67896c3b2bc583"}
Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.504628 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4"
Need to start a new one" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.621168 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwt6p\" (UniqueName: \"kubernetes.io/projected/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-kube-api-access-kwt6p\") pod \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.621490 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-config\") pod \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.621558 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-dns-svc\") pod \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.621596 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-ovsdbserver-nb\") pod \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\" (UID: \"f0fa3a1e-72ed-435f-8f8c-6d719d260e88\") " Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.630688 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-kube-api-access-kwt6p" (OuterVolumeSpecName: "kube-api-access-kwt6p") pod "f0fa3a1e-72ed-435f-8f8c-6d719d260e88" (UID: "f0fa3a1e-72ed-435f-8f8c-6d719d260e88"). InnerVolumeSpecName "kube-api-access-kwt6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.674933 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0fa3a1e-72ed-435f-8f8c-6d719d260e88" (UID: "f0fa3a1e-72ed-435f-8f8c-6d719d260e88"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.697949 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-config" (OuterVolumeSpecName: "config") pod "f0fa3a1e-72ed-435f-8f8c-6d719d260e88" (UID: "f0fa3a1e-72ed-435f-8f8c-6d719d260e88"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.703335 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0fa3a1e-72ed-435f-8f8c-6d719d260e88" (UID: "f0fa3a1e-72ed-435f-8f8c-6d719d260e88"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.723702 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.723742 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.723759 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.723773 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwt6p\" (UniqueName: \"kubernetes.io/projected/f0fa3a1e-72ed-435f-8f8c-6d719d260e88-kube-api-access-kwt6p\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.965926 4756 generic.go:334] "Generic (PLEG): container finished" podID="8a87cfcd-63b6-4ef6-841e-a17de8443742" containerID="47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e" exitCode=0 Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.966037 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" event={"ID":"8a87cfcd-63b6-4ef6-841e-a17de8443742","Type":"ContainerDied","Data":"47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e"} Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.969485 4756 generic.go:334] "Generic (PLEG): container finished" podID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" containerID="85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54" exitCode=0 Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.969535 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" event={"ID":"f0fa3a1e-72ed-435f-8f8c-6d719d260e88","Type":"ContainerDied","Data":"85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54"} Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.969574 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" event={"ID":"f0fa3a1e-72ed-435f-8f8c-6d719d260e88","Type":"ContainerDied","Data":"24ab3efe481e3c1f53e7cc62b3d14da06cc33f400410a4e04fb6d5b006b57986"} Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.969585 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f76bf6bf9-ks5t4" Dec 05 17:47:02 crc kubenswrapper[4756]: I1205 17:47:02.969603 4756 scope.go:117] "RemoveContainer" containerID="85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:02.998921 4756 scope.go:117] "RemoveContainer" containerID="22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.031780 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f76bf6bf9-ks5t4"] Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.040754 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f76bf6bf9-ks5t4"] Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.041096 4756 scope.go:117] "RemoveContainer" containerID="85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54" Dec 05 17:47:03 crc kubenswrapper[4756]: E1205 17:47:03.041954 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54\": container with ID starting with 85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54 not found: ID does not exist" containerID="85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.042025 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54"} err="failed to get container status \"85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54\": rpc error: code = NotFound desc = could not find container \"85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54\": container with ID starting with 85838c36f0ad009335e0c89ee5233ed93fb6d016c8033045baabc755b2103d54 not found: ID does not exist" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.042053 4756 scope.go:117] "RemoveContainer" containerID="22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c" Dec 05 17:47:03 crc kubenswrapper[4756]: E1205 17:47:03.042529 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c\": container with ID starting with 22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c not found: ID does not exist" containerID="22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.042572 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c"} err="failed to get container status \"22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c\": rpc error: code = NotFound desc = could not find container \"22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c\": container with ID starting with 22518c51a51ea603737fcf4d96e1a49320c849ea895379a461816f1da62e803c not found: ID does not exist" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.675039 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 05 17:47:03 crc kubenswrapper[4756]: E1205 17:47:03.675617 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" 
containerName="dnsmasq-dns" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.675639 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" containerName="dnsmasq-dns" Dec 05 17:47:03 crc kubenswrapper[4756]: E1205 17:47:03.675669 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" containerName="init" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.675682 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" containerName="init" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.676003 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" containerName="dnsmasq-dns" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.677201 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.681808 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.716636 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.841760 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbnvn\" (UniqueName: \"kubernetes.io/projected/3772b5ca-1c29-49df-9dd8-6f15da9484a5-kube-api-access-vbnvn\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.841992 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a1fc566a-0eda-4388-924d-ad18e2929773\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1fc566a-0eda-4388-924d-ad18e2929773\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.842156 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/3772b5ca-1c29-49df-9dd8-6f15da9484a5-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.944216 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/3772b5ca-1c29-49df-9dd8-6f15da9484a5-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.945203 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbnvn\" (UniqueName: \"kubernetes.io/projected/3772b5ca-1c29-49df-9dd8-6f15da9484a5-kube-api-access-vbnvn\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.945322 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a1fc566a-0eda-4388-924d-ad18e2929773\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1fc566a-0eda-4388-924d-ad18e2929773\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " 
pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.950168 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/3772b5ca-1c29-49df-9dd8-6f15da9484a5-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.952471 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.952522 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a1fc566a-0eda-4388-924d-ad18e2929773\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1fc566a-0eda-4388-924d-ad18e2929773\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/847ac8002eccfcdba2f3b2d7191f9cfc7cfb900e4dbf23c8922dc0716eecde1c/globalmount\"" pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.964835 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbnvn\" (UniqueName: \"kubernetes.io/projected/3772b5ca-1c29-49df-9dd8-6f15da9484a5-kube-api-access-vbnvn\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " pod="openstack/ovn-copy-data" Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.985033 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" event={"ID":"8a87cfcd-63b6-4ef6-841e-a17de8443742","Type":"ContainerStarted","Data":"1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111"} Dec 05 17:47:03 crc kubenswrapper[4756]: I1205 17:47:03.985606 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" Dec 05 17:47:04 crc kubenswrapper[4756]: I1205 17:47:04.006237 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a1fc566a-0eda-4388-924d-ad18e2929773\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1fc566a-0eda-4388-924d-ad18e2929773\") pod \"ovn-copy-data\" (UID: \"3772b5ca-1c29-49df-9dd8-6f15da9484a5\") " pod="openstack/ovn-copy-data" Dec 05 17:47:04 crc kubenswrapper[4756]: I1205 17:47:04.019962 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 05 17:47:04 crc kubenswrapper[4756]: I1205 17:47:04.020602 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" podStartSLOduration=3.020571701 podStartE2EDuration="3.020571701s" podCreationTimestamp="2025-12-05 17:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:47:04.014153131 +0000 UTC m=+5484.252826703" watchObservedRunningTime="2025-12-05 17:47:04.020571701 +0000 UTC m=+5484.259245283" Dec 05 17:47:04 crc kubenswrapper[4756]: I1205 17:47:04.065949 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0fa3a1e-72ed-435f-8f8c-6d719d260e88" path="/var/lib/kubelet/pods/f0fa3a1e-72ed-435f-8f8c-6d719d260e88/volumes" Dec 05 17:47:04 crc kubenswrapper[4756]: I1205 17:47:04.108548 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:47:04 crc kubenswrapper[4756]: I1205 17:47:04.108727 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:47:04 crc kubenswrapper[4756]: I1205 17:47:04.191328 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:47:04 crc kubenswrapper[4756]: I1205 17:47:04.565060 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 05 17:47:04 crc kubenswrapper[4756]: W1205 17:47:04.567183 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3772b5ca_1c29_49df_9dd8_6f15da9484a5.slice/crio-ccc3d921de28c32230e29f6168576d9d4357f1932e3166eacca10b3deb58b157 WatchSource:0}: Error finding container ccc3d921de28c32230e29f6168576d9d4357f1932e3166eacca10b3deb58b157: Status 404 returned error can't find the container with id ccc3d921de28c32230e29f6168576d9d4357f1932e3166eacca10b3deb58b157 Dec 05 17:47:05 crc kubenswrapper[4756]: I1205 17:47:05.001078 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"3772b5ca-1c29-49df-9dd8-6f15da9484a5","Type":"ContainerStarted","Data":"ccc3d921de28c32230e29f6168576d9d4357f1932e3166eacca10b3deb58b157"} Dec 05 17:47:05 crc kubenswrapper[4756]: I1205 17:47:05.083400 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:47:05 crc kubenswrapper[4756]: I1205 17:47:05.140686 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lt2qs"] Dec 05 17:47:06 crc kubenswrapper[4756]: I1205 17:47:06.012175 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"3772b5ca-1c29-49df-9dd8-6f15da9484a5","Type":"ContainerStarted","Data":"b0234ac2db69a76b680a5d9e65f35f60efd19e2770220e47fe079eb2ac098cee"} Dec 05 17:47:06 crc kubenswrapper[4756]: I1205 17:47:06.038152 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.596408038 podStartE2EDuration="4.038129035s" podCreationTimestamp="2025-12-05 17:47:02 +0000 UTC" firstStartedPulling="2025-12-05 17:47:04.570534693 +0000 UTC m=+5484.809208245" lastFinishedPulling="2025-12-05 17:47:05.01225574 +0000 UTC m=+5485.250929242" 
observedRunningTime="2025-12-05 17:47:06.033337588 +0000 UTC m=+5486.272011130" watchObservedRunningTime="2025-12-05 17:47:06.038129035 +0000 UTC m=+5486.276802557" Dec 05 17:47:07 crc kubenswrapper[4756]: I1205 17:47:07.023033 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lt2qs" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerName="registry-server" containerID="cri-o://fa500bf70bff7d23c7493f7e17c9767fca1984aade720d0c3aa8974df582722f" gracePeriod=2 Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.034087 4756 generic.go:334] "Generic (PLEG): container finished" podID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerID="fa500bf70bff7d23c7493f7e17c9767fca1984aade720d0c3aa8974df582722f" exitCode=0 Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.034179 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt2qs" event={"ID":"f19a770b-26e8-4073-ac04-4cbfd91aa9a1","Type":"ContainerDied","Data":"fa500bf70bff7d23c7493f7e17c9767fca1984aade720d0c3aa8974df582722f"} Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.127751 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.224578 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-utilities\") pod \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.224660 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-catalog-content\") pod \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.224873 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw5j7\" (UniqueName: \"kubernetes.io/projected/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-kube-api-access-vw5j7\") pod \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\" (UID: \"f19a770b-26e8-4073-ac04-4cbfd91aa9a1\") " Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.226582 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-utilities" (OuterVolumeSpecName: "utilities") pod "f19a770b-26e8-4073-ac04-4cbfd91aa9a1" (UID: "f19a770b-26e8-4073-ac04-4cbfd91aa9a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.235728 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-kube-api-access-vw5j7" (OuterVolumeSpecName: "kube-api-access-vw5j7") pod "f19a770b-26e8-4073-ac04-4cbfd91aa9a1" (UID: "f19a770b-26e8-4073-ac04-4cbfd91aa9a1"). InnerVolumeSpecName "kube-api-access-vw5j7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.314740 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f19a770b-26e8-4073-ac04-4cbfd91aa9a1" (UID: "f19a770b-26e8-4073-ac04-4cbfd91aa9a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.327777 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw5j7\" (UniqueName: \"kubernetes.io/projected/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-kube-api-access-vw5j7\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.328529 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:08 crc kubenswrapper[4756]: I1205 17:47:08.328556 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f19a770b-26e8-4073-ac04-4cbfd91aa9a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:09 crc kubenswrapper[4756]: I1205 17:47:09.048772 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt2qs" event={"ID":"f19a770b-26e8-4073-ac04-4cbfd91aa9a1","Type":"ContainerDied","Data":"90c9b559550bae8fa05d67dca827fa7a1f54ef137de771e49be42a6aac27975b"} Dec 05 17:47:09 crc kubenswrapper[4756]: I1205 17:47:09.049164 4756 scope.go:117] "RemoveContainer" containerID="fa500bf70bff7d23c7493f7e17c9767fca1984aade720d0c3aa8974df582722f" Dec 05 17:47:09 crc kubenswrapper[4756]: I1205 17:47:09.048863 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lt2qs" Dec 05 17:47:09 crc kubenswrapper[4756]: I1205 17:47:09.091007 4756 scope.go:117] "RemoveContainer" containerID="b2ceaf3375bf95ee89e75f3462ebf109f401fbb97454e58f41df50c4d8f9a8d0" Dec 05 17:47:09 crc kubenswrapper[4756]: I1205 17:47:09.093857 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lt2qs"] Dec 05 17:47:09 crc kubenswrapper[4756]: I1205 17:47:09.104952 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lt2qs"] Dec 05 17:47:09 crc kubenswrapper[4756]: I1205 17:47:09.114799 4756 scope.go:117] "RemoveContainer" containerID="61db25834d47f849d8d0d128c95accee651df8e62a5d649aa42ad9d447edd1e6" Dec 05 17:47:10 crc kubenswrapper[4756]: I1205 17:47:10.067309 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" path="/var/lib/kubelet/pods/f19a770b-26e8-4073-ac04-4cbfd91aa9a1/volumes" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.361730 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.506095 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-9dmcz"] Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.506451 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" podUID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" containerName="dnsmasq-dns" containerID="cri-o://45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21" gracePeriod=10 Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.964209 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 17:47:11 crc kubenswrapper[4756]: E1205 17:47:11.964772 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerName="extract-content" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.964789 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerName="extract-content" Dec 05 17:47:11 crc kubenswrapper[4756]: E1205 17:47:11.964816 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerName="registry-server" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.964822 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerName="registry-server" Dec 05 17:47:11 crc kubenswrapper[4756]: E1205 17:47:11.964831 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerName="extract-utilities" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.964839 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerName="extract-utilities" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.964980 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f19a770b-26e8-4073-ac04-4cbfd91aa9a1" containerName="registry-server" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.969068 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.977951 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.978125 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wkgdr" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.978217 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.977971 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 17:47:11 crc kubenswrapper[4756]: I1205 17:47:11.996336 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.006824 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.094857 4756 generic.go:334] "Generic (PLEG): container finished" podID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" containerID="45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21" exitCode=0 Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.094932 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" event={"ID":"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4","Type":"ContainerDied","Data":"45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21"} Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.095358 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" event={"ID":"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4","Type":"ContainerDied","Data":"c73cd581e88cf8778370eb47b0e84ad05eeedbc2078685d1a78b2ec51cc2ca90"} Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.095384 4756 scope.go:117] "RemoveContainer" containerID="45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.095004 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-9dmcz" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.098826 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-dns-svc\") pod \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.098914 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-config\") pod \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.098938 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptj7h\" (UniqueName: \"kubernetes.io/projected/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-kube-api-access-ptj7h\") pod \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\" (UID: \"60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4\") " Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.099220 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgq2j\" (UniqueName: \"kubernetes.io/projected/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-kube-api-access-wgq2j\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.099247 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.099300 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-config\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.099344 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.099368 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.099385 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-scripts\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.100048 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.106645 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-kube-api-access-ptj7h" (OuterVolumeSpecName: "kube-api-access-ptj7h") pod "60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" (UID: "60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4"). InnerVolumeSpecName "kube-api-access-ptj7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.112142 4756 scope.go:117] "RemoveContainer" containerID="398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.140266 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" (UID: "60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.142731 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-config" (OuterVolumeSpecName: "config") pod "60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" (UID: "60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.183829 4756 scope.go:117] "RemoveContainer" containerID="45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21" Dec 05 17:47:12 crc kubenswrapper[4756]: E1205 17:47:12.184299 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21\": container with ID starting with 45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21 not found: ID does not exist" containerID="45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.184328 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21"} err="failed to get container status \"45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21\": rpc error: code = NotFound desc = could not find container \"45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21\": container with ID starting with 45f521eb2ce784eeaf52148c8e3002c86a7a6063a23cf679e8d86eaaaead7a21 not found: ID does not exist" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.184347 4756 scope.go:117] "RemoveContainer" containerID="398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856" Dec 05 17:47:12 crc kubenswrapper[4756]: E1205 17:47:12.184594 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856\": container with ID starting with 398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856 not found: ID does not exist" 
containerID="398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.184620 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856"} err="failed to get container status \"398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856\": rpc error: code = NotFound desc = could not find container \"398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856\": container with ID starting with 398d1e691d0b96463977ad2114c9159e74dc0b4c4516bff6f85f0cdb9f480856 not found: ID does not exist" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201159 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-config\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201214 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201263 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-scripts\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201310 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201337 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgq2j\" (UniqueName: \"kubernetes.io/projected/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-kube-api-access-wgq2j\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201356 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201487 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201632 4756 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201747 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptj7h\" (UniqueName: \"kubernetes.io/projected/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4-kube-api-access-ptj7h\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.201868 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.202398 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-config\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.202556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-scripts\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.205346 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.212162 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.213477 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.220599 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgq2j\" (UniqueName: \"kubernetes.io/projected/94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41-kube-api-access-wgq2j\") pod \"ovn-northd-0\" (UID: \"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41\") " pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.320022 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.440758 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-9dmcz"] Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.447110 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-9dmcz"] Dec 05 17:47:12 crc kubenswrapper[4756]: I1205 17:47:12.779911 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 17:47:13 crc kubenswrapper[4756]: I1205 17:47:13.112735 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41","Type":"ContainerStarted","Data":"2a77237b1a6abe8b10db27aaa6548815d142f1cd23239da16dcfd447086116a1"} Dec 05 17:47:13 crc kubenswrapper[4756]: I1205 17:47:13.113058 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41","Type":"ContainerStarted","Data":"d8b7e2655594d2dde9f40892f7b5c867073ca8fedd8ac94b9bc380d651e794d7"} Dec 05 17:47:14 crc kubenswrapper[4756]: I1205 17:47:14.073052 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" path="/var/lib/kubelet/pods/60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4/volumes" Dec 05 17:47:14 crc kubenswrapper[4756]: I1205 17:47:14.126667 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41","Type":"ContainerStarted","Data":"72ed4cd1c10d12749d67c305072a3edfea2f24b74af73c6d1ff716cd03f119dc"} Dec 05 17:47:14 crc kubenswrapper[4756]: I1205 17:47:14.126945 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 17:47:14 crc kubenswrapper[4756]: I1205 17:47:14.165234 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.16520138 podStartE2EDuration="3.16520138s" podCreationTimestamp="2025-12-05 17:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:47:14.156756766 +0000 UTC m=+5494.395430358" watchObservedRunningTime="2025-12-05 17:47:14.16520138 +0000 UTC m=+5494.403874962" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.319631 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hvkpt"] Dec 05 17:47:17 crc kubenswrapper[4756]: E1205 17:47:17.320698 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" containerName="init" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.320720 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" containerName="init" Dec 05 17:47:17 crc kubenswrapper[4756]: E1205 17:47:17.320759 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" containerName="dnsmasq-dns" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.320769 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" containerName="dnsmasq-dns" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.320990 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e1e0d9-c7bf-4aaf-bed0-8e2893e341c4" containerName="dnsmasq-dns" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 
17:47:17.321781 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.329042 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ab3f-account-create-update-7b27p"] Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.330430 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.336345 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.349475 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hvkpt"] Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.356183 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ab3f-account-create-update-7b27p"] Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.418627 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l6xk\" (UniqueName: \"kubernetes.io/projected/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-kube-api-access-6l6xk\") pod \"keystone-db-create-hvkpt\" (UID: \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\") " pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.418842 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-operator-scripts\") pod \"keystone-db-create-hvkpt\" (UID: \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\") " pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.419114 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9dp9\" (UniqueName: \"kubernetes.io/projected/3d1e4851-886b-4648-8022-d9fb79072e9b-kube-api-access-m9dp9\") pod \"keystone-ab3f-account-create-update-7b27p\" (UID: \"3d1e4851-886b-4648-8022-d9fb79072e9b\") " pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.419156 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1e4851-886b-4648-8022-d9fb79072e9b-operator-scripts\") pod \"keystone-ab3f-account-create-update-7b27p\" (UID: \"3d1e4851-886b-4648-8022-d9fb79072e9b\") " pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.522823 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-operator-scripts\") pod \"keystone-db-create-hvkpt\" (UID: \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\") " pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.523007 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9dp9\" (UniqueName: \"kubernetes.io/projected/3d1e4851-886b-4648-8022-d9fb79072e9b-kube-api-access-m9dp9\") pod \"keystone-ab3f-account-create-update-7b27p\" (UID: \"3d1e4851-886b-4648-8022-d9fb79072e9b\") " pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:17 crc 
kubenswrapper[4756]: I1205 17:47:17.523045 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1e4851-886b-4648-8022-d9fb79072e9b-operator-scripts\") pod \"keystone-ab3f-account-create-update-7b27p\" (UID: \"3d1e4851-886b-4648-8022-d9fb79072e9b\") " pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.523157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l6xk\" (UniqueName: \"kubernetes.io/projected/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-kube-api-access-6l6xk\") pod \"keystone-db-create-hvkpt\" (UID: \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\") " pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.523579 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-operator-scripts\") pod \"keystone-db-create-hvkpt\" (UID: \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\") " pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.524224 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1e4851-886b-4648-8022-d9fb79072e9b-operator-scripts\") pod \"keystone-ab3f-account-create-update-7b27p\" (UID: \"3d1e4851-886b-4648-8022-d9fb79072e9b\") " pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.549148 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l6xk\" (UniqueName: \"kubernetes.io/projected/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-kube-api-access-6l6xk\") pod \"keystone-db-create-hvkpt\" (UID: \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\") " pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.551282 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9dp9\" (UniqueName: \"kubernetes.io/projected/3d1e4851-886b-4648-8022-d9fb79072e9b-kube-api-access-m9dp9\") pod \"keystone-ab3f-account-create-update-7b27p\" (UID: \"3d1e4851-886b-4648-8022-d9fb79072e9b\") " pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.652163 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:17 crc kubenswrapper[4756]: I1205 17:47:17.661485 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:18 crc kubenswrapper[4756]: I1205 17:47:18.136672 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ab3f-account-create-update-7b27p"] Dec 05 17:47:18 crc kubenswrapper[4756]: W1205 17:47:18.143223 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d1e4851_886b_4648_8022_d9fb79072e9b.slice/crio-f2e5f536f3e2d063ce477c7f8a4d71779d453d613bc96226db9a3fa970e4ffcb WatchSource:0}: Error finding container f2e5f536f3e2d063ce477c7f8a4d71779d453d613bc96226db9a3fa970e4ffcb: Status 404 returned error can't find the container with id f2e5f536f3e2d063ce477c7f8a4d71779d453d613bc96226db9a3fa970e4ffcb Dec 05 17:47:18 crc kubenswrapper[4756]: I1205 17:47:18.167229 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab3f-account-create-update-7b27p" event={"ID":"3d1e4851-886b-4648-8022-d9fb79072e9b","Type":"ContainerStarted","Data":"f2e5f536f3e2d063ce477c7f8a4d71779d453d613bc96226db9a3fa970e4ffcb"} Dec 05 17:47:18 crc kubenswrapper[4756]: I1205 17:47:18.267904 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hvkpt"] Dec 05 17:47:18 crc kubenswrapper[4756]: W1205 17:47:18.275931 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c64432f_bdc1_4b21_a99f_4b0892bf16c4.slice/crio-5ba86e239f288782a543deb52786e0a4d819176e0a99024ed988a2e4e8b5520e WatchSource:0}: Error finding container 5ba86e239f288782a543deb52786e0a4d819176e0a99024ed988a2e4e8b5520e: Status 404 returned error can't find the container with id 5ba86e239f288782a543deb52786e0a4d819176e0a99024ed988a2e4e8b5520e Dec 05 17:47:19 crc kubenswrapper[4756]: I1205 17:47:19.180904 4756 generic.go:334] "Generic (PLEG): container finished" podID="7c64432f-bdc1-4b21-a99f-4b0892bf16c4" containerID="5794c48ac0e2702d5d1a40ca7fb5be48807def8eb8d332e8543afd872c8d2193" exitCode=0 Dec 05 17:47:19 crc kubenswrapper[4756]: I1205 17:47:19.181060 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hvkpt" event={"ID":"7c64432f-bdc1-4b21-a99f-4b0892bf16c4","Type":"ContainerDied","Data":"5794c48ac0e2702d5d1a40ca7fb5be48807def8eb8d332e8543afd872c8d2193"} Dec 05 17:47:19 crc kubenswrapper[4756]: I1205 17:47:19.181412 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hvkpt" event={"ID":"7c64432f-bdc1-4b21-a99f-4b0892bf16c4","Type":"ContainerStarted","Data":"5ba86e239f288782a543deb52786e0a4d819176e0a99024ed988a2e4e8b5520e"} Dec 05 17:47:19 crc kubenswrapper[4756]: I1205 17:47:19.184229 4756 generic.go:334] "Generic (PLEG): container finished" podID="3d1e4851-886b-4648-8022-d9fb79072e9b" containerID="de76082a3823a246cae35a684e13925d4e81561572c9edb6da76ca2b05e5211a" exitCode=0 Dec 05 17:47:19 crc kubenswrapper[4756]: I1205 17:47:19.184274 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab3f-account-create-update-7b27p" event={"ID":"3d1e4851-886b-4648-8022-d9fb79072e9b","Type":"ContainerDied","Data":"de76082a3823a246cae35a684e13925d4e81561572c9edb6da76ca2b05e5211a"} Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.715464 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.721704 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.780186 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9dp9\" (UniqueName: \"kubernetes.io/projected/3d1e4851-886b-4648-8022-d9fb79072e9b-kube-api-access-m9dp9\") pod \"3d1e4851-886b-4648-8022-d9fb79072e9b\" (UID: \"3d1e4851-886b-4648-8022-d9fb79072e9b\") " Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.780234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1e4851-886b-4648-8022-d9fb79072e9b-operator-scripts\") pod \"3d1e4851-886b-4648-8022-d9fb79072e9b\" (UID: \"3d1e4851-886b-4648-8022-d9fb79072e9b\") " Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.780350 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l6xk\" (UniqueName: \"kubernetes.io/projected/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-kube-api-access-6l6xk\") pod \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\" (UID: \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\") " Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.780374 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-operator-scripts\") pod \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\" (UID: \"7c64432f-bdc1-4b21-a99f-4b0892bf16c4\") " Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.781316 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c64432f-bdc1-4b21-a99f-4b0892bf16c4" (UID: "7c64432f-bdc1-4b21-a99f-4b0892bf16c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.781363 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d1e4851-886b-4648-8022-d9fb79072e9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d1e4851-886b-4648-8022-d9fb79072e9b" (UID: "3d1e4851-886b-4648-8022-d9fb79072e9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.786253 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1e4851-886b-4648-8022-d9fb79072e9b-kube-api-access-m9dp9" (OuterVolumeSpecName: "kube-api-access-m9dp9") pod "3d1e4851-886b-4648-8022-d9fb79072e9b" (UID: "3d1e4851-886b-4648-8022-d9fb79072e9b"). InnerVolumeSpecName "kube-api-access-m9dp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.787236 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-kube-api-access-6l6xk" (OuterVolumeSpecName: "kube-api-access-6l6xk") pod "7c64432f-bdc1-4b21-a99f-4b0892bf16c4" (UID: "7c64432f-bdc1-4b21-a99f-4b0892bf16c4"). InnerVolumeSpecName "kube-api-access-6l6xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.883156 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9dp9\" (UniqueName: \"kubernetes.io/projected/3d1e4851-886b-4648-8022-d9fb79072e9b-kube-api-access-m9dp9\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.883209 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1e4851-886b-4648-8022-d9fb79072e9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.883227 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l6xk\" (UniqueName: \"kubernetes.io/projected/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-kube-api-access-6l6xk\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:20 crc kubenswrapper[4756]: I1205 17:47:20.883244 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c64432f-bdc1-4b21-a99f-4b0892bf16c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:21 crc kubenswrapper[4756]: I1205 17:47:21.209166 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab3f-account-create-update-7b27p" event={"ID":"3d1e4851-886b-4648-8022-d9fb79072e9b","Type":"ContainerDied","Data":"f2e5f536f3e2d063ce477c7f8a4d71779d453d613bc96226db9a3fa970e4ffcb"} Dec 05 17:47:21 crc kubenswrapper[4756]: I1205 17:47:21.209599 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2e5f536f3e2d063ce477c7f8a4d71779d453d613bc96226db9a3fa970e4ffcb" Dec 05 17:47:21 crc kubenswrapper[4756]: I1205 17:47:21.209200 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab3f-account-create-update-7b27p" Dec 05 17:47:21 crc kubenswrapper[4756]: I1205 17:47:21.211574 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hvkpt" event={"ID":"7c64432f-bdc1-4b21-a99f-4b0892bf16c4","Type":"ContainerDied","Data":"5ba86e239f288782a543deb52786e0a4d819176e0a99024ed988a2e4e8b5520e"} Dec 05 17:47:21 crc kubenswrapper[4756]: I1205 17:47:21.211614 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ba86e239f288782a543deb52786e0a4d819176e0a99024ed988a2e4e8b5520e" Dec 05 17:47:21 crc kubenswrapper[4756]: I1205 17:47:21.211665 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hvkpt" Dec 05 17:47:22 crc kubenswrapper[4756]: I1205 17:47:22.412062 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.008227 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2wb2k"] Dec 05 17:47:23 crc kubenswrapper[4756]: E1205 17:47:23.009043 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c64432f-bdc1-4b21-a99f-4b0892bf16c4" containerName="mariadb-database-create" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.009074 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c64432f-bdc1-4b21-a99f-4b0892bf16c4" containerName="mariadb-database-create" Dec 05 17:47:23 crc kubenswrapper[4756]: E1205 17:47:23.009106 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1e4851-886b-4648-8022-d9fb79072e9b" containerName="mariadb-account-create-update" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.009119 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1e4851-886b-4648-8022-d9fb79072e9b" containerName="mariadb-account-create-update" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.009407 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c64432f-bdc1-4b21-a99f-4b0892bf16c4" containerName="mariadb-database-create" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.009482 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1e4851-886b-4648-8022-d9fb79072e9b" containerName="mariadb-account-create-update" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.010330 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.025144 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.025654 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d86lz" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.026031 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.027408 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.034486 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2wb2k"] Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.130893 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-combined-ca-bundle\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.131020 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-config-data\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.131058 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmfv\" (UniqueName: \"kubernetes.io/projected/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-kube-api-access-hhmfv\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.232459 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-config-data\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.232555 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmfv\" (UniqueName: \"kubernetes.io/projected/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-kube-api-access-hhmfv\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.232683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-combined-ca-bundle\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.241972 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-combined-ca-bundle\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.258288 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-config-data\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.262363 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmfv\" (UniqueName: \"kubernetes.io/projected/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-kube-api-access-hhmfv\") pod \"keystone-db-sync-2wb2k\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.376279 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:23 crc kubenswrapper[4756]: I1205 17:47:23.645370 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2wb2k"] Dec 05 17:47:24 crc kubenswrapper[4756]: I1205 17:47:24.246777 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2wb2k" event={"ID":"cea68c9d-a357-41d9-a5ce-c7b09cad33c2","Type":"ContainerStarted","Data":"82c4701c7ba6529ca2be2c911b39bdd1295f70c4c76d041fd53cca44edee2a57"} Dec 05 17:47:24 crc kubenswrapper[4756]: I1205 17:47:24.247306 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2wb2k" event={"ID":"cea68c9d-a357-41d9-a5ce-c7b09cad33c2","Type":"ContainerStarted","Data":"8d183cfb92a8eefa3457c6f12d0e8ca640ccdf462d2bd9357002d14aecc625d5"} Dec 05 17:47:24 crc kubenswrapper[4756]: I1205 17:47:24.280861 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2wb2k" podStartSLOduration=2.280836751 podStartE2EDuration="2.280836751s" podCreationTimestamp="2025-12-05 17:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:47:24.274225465 +0000 UTC m=+5504.512899007" watchObservedRunningTime="2025-12-05 17:47:24.280836751 +0000 UTC m=+5504.519510293" Dec 05 17:47:26 crc kubenswrapper[4756]: I1205 17:47:26.275363 4756 generic.go:334] "Generic (PLEG): container finished" podID="cea68c9d-a357-41d9-a5ce-c7b09cad33c2" containerID="82c4701c7ba6529ca2be2c911b39bdd1295f70c4c76d041fd53cca44edee2a57" exitCode=0 Dec 05 17:47:26 crc kubenswrapper[4756]: I1205 17:47:26.275448 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2wb2k" event={"ID":"cea68c9d-a357-41d9-a5ce-c7b09cad33c2","Type":"ContainerDied","Data":"82c4701c7ba6529ca2be2c911b39bdd1295f70c4c76d041fd53cca44edee2a57"} Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.707168 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.829196 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-combined-ca-bundle\") pod \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.829303 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-config-data\") pod \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.829356 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmfv\" (UniqueName: \"kubernetes.io/projected/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-kube-api-access-hhmfv\") pod \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\" (UID: \"cea68c9d-a357-41d9-a5ce-c7b09cad33c2\") " Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.834584 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-kube-api-access-hhmfv" (OuterVolumeSpecName: "kube-api-access-hhmfv") pod "cea68c9d-a357-41d9-a5ce-c7b09cad33c2" (UID: "cea68c9d-a357-41d9-a5ce-c7b09cad33c2"). InnerVolumeSpecName "kube-api-access-hhmfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.855232 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cea68c9d-a357-41d9-a5ce-c7b09cad33c2" (UID: "cea68c9d-a357-41d9-a5ce-c7b09cad33c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.887872 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-config-data" (OuterVolumeSpecName: "config-data") pod "cea68c9d-a357-41d9-a5ce-c7b09cad33c2" (UID: "cea68c9d-a357-41d9-a5ce-c7b09cad33c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.932110 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.932156 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:27 crc kubenswrapper[4756]: I1205 17:47:27.932179 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhmfv\" (UniqueName: \"kubernetes.io/projected/cea68c9d-a357-41d9-a5ce-c7b09cad33c2-kube-api-access-hhmfv\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.301299 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2wb2k" event={"ID":"cea68c9d-a357-41d9-a5ce-c7b09cad33c2","Type":"ContainerDied","Data":"8d183cfb92a8eefa3457c6f12d0e8ca640ccdf462d2bd9357002d14aecc625d5"} Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.301374 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d183cfb92a8eefa3457c6f12d0e8ca640ccdf462d2bd9357002d14aecc625d5" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.301570 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2wb2k" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.469009 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q892z"] Dec 05 17:47:28 crc kubenswrapper[4756]: E1205 17:47:28.469452 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea68c9d-a357-41d9-a5ce-c7b09cad33c2" containerName="keystone-db-sync" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.469469 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea68c9d-a357-41d9-a5ce-c7b09cad33c2" containerName="keystone-db-sync" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.469672 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea68c9d-a357-41d9-a5ce-c7b09cad33c2" containerName="keystone-db-sync" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.470407 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q892z" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.472462 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.472890 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d86lz" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.473600 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.477711 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f75cb8b55-mj5gh"] Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.480567 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.480715 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.480931 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.491937 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q892z"] Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.507286 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f75cb8b55-mj5gh"] Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.541560 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbnb7\" (UniqueName: \"kubernetes.io/projected/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-kube-api-access-kbnb7\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.541677 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-fernet-keys\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.541707 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-dns-svc\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.541753 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-sb\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.541798 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-config\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.541830 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-nb\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.541860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-config-data\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z" Dec 05 17:47:28 crc kubenswrapper[4756]: 
I1205 17:47:28.541889 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-combined-ca-bundle\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.542332 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-credential-keys\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.542388 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-scripts\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.542586 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdcv\" (UniqueName: \"kubernetes.io/projected/1356577d-3bc1-401b-a8e8-0d6b83a99d96-kube-api-access-2cdcv\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.644397 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-credential-keys\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.644802 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-scripts\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.644871 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdcv\" (UniqueName: \"kubernetes.io/projected/1356577d-3bc1-401b-a8e8-0d6b83a99d96-kube-api-access-2cdcv\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.644916 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbnb7\" (UniqueName: \"kubernetes.io/projected/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-kube-api-access-kbnb7\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.644977 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-fernet-keys\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.645000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-dns-svc\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.645038 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-sb\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.645079 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-config\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.645112 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-nb\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.645142 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-config-data\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.645168 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-combined-ca-bundle\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.646662 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-sb\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.647291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-dns-svc\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.647898 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-config\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.653917 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-nb\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.665131 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbnb7\" (UniqueName: \"kubernetes.io/projected/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-kube-api-access-kbnb7\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.669302 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdcv\" (UniqueName: \"kubernetes.io/projected/1356577d-3bc1-401b-a8e8-0d6b83a99d96-kube-api-access-2cdcv\") pod \"dnsmasq-dns-7f75cb8b55-mj5gh\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.670908 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-scripts\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.673026 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-fernet-keys\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.673223 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-credential-keys\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.675077 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-config-data\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.675764 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-combined-ca-bundle\") pod \"keystone-bootstrap-q892z\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.791659 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:28 crc kubenswrapper[4756]: I1205 17:47:28.799213 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:29 crc kubenswrapper[4756]: I1205 17:47:29.303857 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f75cb8b55-mj5gh"]
Dec 05 17:47:29 crc kubenswrapper[4756]: W1205 17:47:29.313432 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1356577d_3bc1_401b_a8e8_0d6b83a99d96.slice/crio-e00c5de34a0d682152662530ca585e958edc08a0272c9b4d4960d1a1c11b0b67 WatchSource:0}: Error finding container e00c5de34a0d682152662530ca585e958edc08a0272c9b4d4960d1a1c11b0b67: Status 404 returned error can't find the container with id e00c5de34a0d682152662530ca585e958edc08a0272c9b4d4960d1a1c11b0b67
Dec 05 17:47:29 crc kubenswrapper[4756]: W1205 17:47:29.393748 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod249a19ad_bbfc_4ba2_b2d7_ddb3113ea23f.slice/crio-2d0e5801b74c6cc077851d4e9f3ecebc25adb4a18f3dc8f3559922e66e2fee53 WatchSource:0}: Error finding container 2d0e5801b74c6cc077851d4e9f3ecebc25adb4a18f3dc8f3559922e66e2fee53: Status 404 returned error can't find the container with id 2d0e5801b74c6cc077851d4e9f3ecebc25adb4a18f3dc8f3559922e66e2fee53
Dec 05 17:47:29 crc kubenswrapper[4756]: I1205 17:47:29.396268 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q892z"]
Dec 05 17:47:30 crc kubenswrapper[4756]: I1205 17:47:30.331332 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q892z" event={"ID":"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f","Type":"ContainerStarted","Data":"cd79b4f9681e98882728d713fd1b6c509aaeff89234dd5682a5418bde4b746a5"}
Dec 05 17:47:30 crc kubenswrapper[4756]: I1205 17:47:30.331698 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q892z" event={"ID":"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f","Type":"ContainerStarted","Data":"2d0e5801b74c6cc077851d4e9f3ecebc25adb4a18f3dc8f3559922e66e2fee53"}
Dec 05 17:47:30 crc kubenswrapper[4756]: I1205 17:47:30.334493 4756 generic.go:334] "Generic (PLEG): container finished" podID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" containerID="45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8" exitCode=0
Dec 05 17:47:30 crc kubenswrapper[4756]: I1205 17:47:30.334531 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" event={"ID":"1356577d-3bc1-401b-a8e8-0d6b83a99d96","Type":"ContainerDied","Data":"45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8"}
Dec 05 17:47:30 crc kubenswrapper[4756]: I1205 17:47:30.334550 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" event={"ID":"1356577d-3bc1-401b-a8e8-0d6b83a99d96","Type":"ContainerStarted","Data":"e00c5de34a0d682152662530ca585e958edc08a0272c9b4d4960d1a1c11b0b67"}
Dec 05 17:47:30 crc kubenswrapper[4756]: I1205 17:47:30.374905 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q892z" podStartSLOduration=2.374879303 podStartE2EDuration="2.374879303s" podCreationTimestamp="2025-12-05 17:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:47:30.366695186 +0000 UTC m=+5510.605368728" watchObservedRunningTime="2025-12-05 17:47:30.374879303 +0000 UTC m=+5510.613552855"
Dec 05 17:47:31 crc kubenswrapper[4756]: I1205 17:47:31.350613 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" event={"ID":"1356577d-3bc1-401b-a8e8-0d6b83a99d96","Type":"ContainerStarted","Data":"c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627"}
Dec 05 17:47:31 crc kubenswrapper[4756]: I1205 17:47:31.353256 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh"
Dec 05 17:47:31 crc kubenswrapper[4756]: I1205 17:47:31.388129 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" podStartSLOduration=3.388099892 podStartE2EDuration="3.388099892s" podCreationTimestamp="2025-12-05 17:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:47:31.370531966 +0000 UTC m=+5511.609205518" watchObservedRunningTime="2025-12-05 17:47:31.388099892 +0000 UTC m=+5511.626773444"
Dec 05 17:47:33 crc kubenswrapper[4756]: I1205 17:47:33.376559 4756 generic.go:334] "Generic (PLEG): container finished" podID="249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" containerID="cd79b4f9681e98882728d713fd1b6c509aaeff89234dd5682a5418bde4b746a5" exitCode=0
Dec 05 17:47:33 crc kubenswrapper[4756]: I1205 17:47:33.376655 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q892z" event={"ID":"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f","Type":"ContainerDied","Data":"cd79b4f9681e98882728d713fd1b6c509aaeff89234dd5682a5418bde4b746a5"}
Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.842805 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q892z"
Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.989075 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-credential-keys\") pod \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") "
Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.989201 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-config-data\") pod \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") "
Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.989297 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-fernet-keys\") pod \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") "
Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.989398 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbnb7\" (UniqueName: \"kubernetes.io/projected/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-kube-api-access-kbnb7\") pod \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") "
Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.989480 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-scripts\") pod \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") "
Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.989523 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-combined-ca-bundle\") pod \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\" (UID: \"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f\") " Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.998472 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" (UID: "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.998518 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-kube-api-access-kbnb7" (OuterVolumeSpecName: "kube-api-access-kbnb7") pod "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" (UID: "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f"). InnerVolumeSpecName "kube-api-access-kbnb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:34 crc kubenswrapper[4756]: I1205 17:47:34.999342 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-scripts" (OuterVolumeSpecName: "scripts") pod "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" (UID: "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.004150 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" (UID: "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.038089 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" (UID: "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.041547 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-config-data" (OuterVolumeSpecName: "config-data") pod "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" (UID: "249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.093092 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.093638 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.093721 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbnb7\" (UniqueName: \"kubernetes.io/projected/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-kube-api-access-kbnb7\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.093851 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.093928 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.094017 4756 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.404502 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q892z" event={"ID":"249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f","Type":"ContainerDied","Data":"2d0e5801b74c6cc077851d4e9f3ecebc25adb4a18f3dc8f3559922e66e2fee53"} Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.404545 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d0e5801b74c6cc077851d4e9f3ecebc25adb4a18f3dc8f3559922e66e2fee53" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.404579 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q892z" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.609605 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q892z"] Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.624141 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q892z"] Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.705670 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t47bv"] Dec 05 17:47:35 crc kubenswrapper[4756]: E1205 17:47:35.706239 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" containerName="keystone-bootstrap" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.706268 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" containerName="keystone-bootstrap" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.706612 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" containerName="keystone-bootstrap" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.707734 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.711165 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.711393 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.712184 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.712257 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.712383 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d86lz" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.721562 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t47bv"] Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.807870 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-scripts\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.808002 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-fernet-keys\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.808044 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qplnd\" (UniqueName: \"kubernetes.io/projected/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-kube-api-access-qplnd\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.808187 
4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-combined-ca-bundle\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.808226 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-config-data\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.808314 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-credential-keys\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.910270 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-scripts\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.910354 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-fernet-keys\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.910384 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qplnd\" (UniqueName: \"kubernetes.io/projected/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-kube-api-access-qplnd\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.910498 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-combined-ca-bundle\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.910524 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-config-data\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.910551 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-credential-keys\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.916720 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-scripts\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.917728 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-config-data\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.920562 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-fernet-keys\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.922272 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-combined-ca-bundle\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.929217 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-credential-keys\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:35 crc kubenswrapper[4756]: I1205 17:47:35.935071 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qplnd\" (UniqueName: \"kubernetes.io/projected/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-kube-api-access-qplnd\") pod \"keystone-bootstrap-t47bv\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:36 crc kubenswrapper[4756]: I1205 17:47:36.034094 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:36 crc kubenswrapper[4756]: I1205 17:47:36.072648 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f" path="/var/lib/kubelet/pods/249a19ad-bbfc-4ba2-b2d7-ddb3113ea23f/volumes" Dec 05 17:47:36 crc kubenswrapper[4756]: I1205 17:47:36.522899 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t47bv"] Dec 05 17:47:36 crc kubenswrapper[4756]: W1205 17:47:36.524598 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1228931b_c0ef_49b3_8f30_c6ffcddca7f6.slice/crio-1250fc150fa7606f5d61455ed91f11477b17854feb82a10bea6b10e988bd0417 WatchSource:0}: Error finding container 1250fc150fa7606f5d61455ed91f11477b17854feb82a10bea6b10e988bd0417: Status 404 returned error can't find the container with id 1250fc150fa7606f5d61455ed91f11477b17854feb82a10bea6b10e988bd0417 Dec 05 17:47:37 crc kubenswrapper[4756]: I1205 17:47:37.423790 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t47bv" event={"ID":"1228931b-c0ef-49b3-8f30-c6ffcddca7f6","Type":"ContainerStarted","Data":"853eeb4f1011e1cefe738208cf579562c74e0c2ca576a85fe990c9a40b5830d3"} Dec 05 17:47:37 crc kubenswrapper[4756]: I1205 17:47:37.424188 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t47bv" event={"ID":"1228931b-c0ef-49b3-8f30-c6ffcddca7f6","Type":"ContainerStarted","Data":"1250fc150fa7606f5d61455ed91f11477b17854feb82a10bea6b10e988bd0417"} Dec 05 17:47:37 crc kubenswrapper[4756]: I1205 17:47:37.460191 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-t47bv" podStartSLOduration=2.460173381 podStartE2EDuration="2.460173381s" podCreationTimestamp="2025-12-05 17:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:47:37.454674975 +0000 UTC m=+5517.693348507" watchObservedRunningTime="2025-12-05 17:47:37.460173381 +0000 UTC m=+5517.698846903" Dec 05 17:47:38 crc kubenswrapper[4756]: I1205 17:47:38.800813 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" Dec 05 17:47:38 crc kubenswrapper[4756]: I1205 17:47:38.878838 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b455b6c4f-vhj69"] Dec 05 17:47:38 crc kubenswrapper[4756]: I1205 17:47:38.879248 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" podUID="8a87cfcd-63b6-4ef6-841e-a17de8443742" containerName="dnsmasq-dns" containerID="cri-o://1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111" gracePeriod=10 Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.438527 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.445391 4756 generic.go:334] "Generic (PLEG): container finished" podID="1228931b-c0ef-49b3-8f30-c6ffcddca7f6" containerID="853eeb4f1011e1cefe738208cf579562c74e0c2ca576a85fe990c9a40b5830d3" exitCode=0 Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.445449 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t47bv" event={"ID":"1228931b-c0ef-49b3-8f30-c6ffcddca7f6","Type":"ContainerDied","Data":"853eeb4f1011e1cefe738208cf579562c74e0c2ca576a85fe990c9a40b5830d3"} Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.447076 4756 generic.go:334] "Generic (PLEG): container finished" podID="8a87cfcd-63b6-4ef6-841e-a17de8443742" containerID="1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111" exitCode=0 Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.447137 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" event={"ID":"8a87cfcd-63b6-4ef6-841e-a17de8443742","Type":"ContainerDied","Data":"1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111"} Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.447139 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.447183 4756 scope.go:117] "RemoveContainer" containerID="1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.447169 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b455b6c4f-vhj69" event={"ID":"8a87cfcd-63b6-4ef6-841e-a17de8443742","Type":"ContainerDied","Data":"5af504504790d00a121659e3354712bf874ce069c8f979e6fe67896c3b2bc583"} Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.475221 4756 scope.go:117] "RemoveContainer" containerID="47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.514810 4756 scope.go:117] "RemoveContainer" containerID="1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111" Dec 05 17:47:39 crc kubenswrapper[4756]: E1205 17:47:39.515338 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111\": container with ID starting with 1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111 not found: ID does not exist" containerID="1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.515387 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111"} err="failed to get container status \"1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111\": rpc error: code = NotFound desc = could not find container \"1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111\": container with ID starting with 1bd1bf8edbd92480817b020d8359a13993dc69672ab613ce2c7ec94ef705c111 not found: ID does not exist" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.515444 4756 scope.go:117] "RemoveContainer" containerID="47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e" Dec 05 17:47:39 crc kubenswrapper[4756]: E1205 17:47:39.516052 4756 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e\": container with ID starting with 47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e not found: ID does not exist" containerID="47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.516111 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e"} err="failed to get container status \"47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e\": rpc error: code = NotFound desc = could not find container \"47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e\": container with ID starting with 47080795a7b80872cae5c517b00f5d4f61342942f94cf7b059fe85c782a3795e not found: ID does not exist" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.577978 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh5sm\" (UniqueName: \"kubernetes.io/projected/8a87cfcd-63b6-4ef6-841e-a17de8443742-kube-api-access-nh5sm\") pod \"8a87cfcd-63b6-4ef6-841e-a17de8443742\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.578094 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-dns-svc\") pod \"8a87cfcd-63b6-4ef6-841e-a17de8443742\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.578136 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-config\") pod \"8a87cfcd-63b6-4ef6-841e-a17de8443742\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.578153 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-nb\") pod \"8a87cfcd-63b6-4ef6-841e-a17de8443742\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.578282 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-sb\") pod \"8a87cfcd-63b6-4ef6-841e-a17de8443742\" (UID: \"8a87cfcd-63b6-4ef6-841e-a17de8443742\") " Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.584368 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a87cfcd-63b6-4ef6-841e-a17de8443742-kube-api-access-nh5sm" (OuterVolumeSpecName: "kube-api-access-nh5sm") pod "8a87cfcd-63b6-4ef6-841e-a17de8443742" (UID: "8a87cfcd-63b6-4ef6-841e-a17de8443742"). InnerVolumeSpecName "kube-api-access-nh5sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.617640 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a87cfcd-63b6-4ef6-841e-a17de8443742" (UID: "8a87cfcd-63b6-4ef6-841e-a17de8443742"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.629917 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a87cfcd-63b6-4ef6-841e-a17de8443742" (UID: "8a87cfcd-63b6-4ef6-841e-a17de8443742"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.630589 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a87cfcd-63b6-4ef6-841e-a17de8443742" (UID: "8a87cfcd-63b6-4ef6-841e-a17de8443742"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.635507 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-config" (OuterVolumeSpecName: "config") pod "8a87cfcd-63b6-4ef6-841e-a17de8443742" (UID: "8a87cfcd-63b6-4ef6-841e-a17de8443742"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.680542 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.680569 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh5sm\" (UniqueName: \"kubernetes.io/projected/8a87cfcd-63b6-4ef6-841e-a17de8443742-kube-api-access-nh5sm\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.680581 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.680590 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.680598 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a87cfcd-63b6-4ef6-841e-a17de8443742-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.817841 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b455b6c4f-vhj69"] Dec 05 17:47:39 crc kubenswrapper[4756]: I1205 17:47:39.826473 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b455b6c4f-vhj69"] Dec 05 17:47:40 crc kubenswrapper[4756]: I1205 17:47:40.064623 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a87cfcd-63b6-4ef6-841e-a17de8443742" path="/var/lib/kubelet/pods/8a87cfcd-63b6-4ef6-841e-a17de8443742/volumes" Dec 05 17:47:40 crc kubenswrapper[4756]: I1205 17:47:40.953680 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.039466 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-fernet-keys\") pod \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.039828 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qplnd\" (UniqueName: \"kubernetes.io/projected/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-kube-api-access-qplnd\") pod \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.040124 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-credential-keys\") pod \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.040384 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-combined-ca-bundle\") pod \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.040597 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-config-data\") pod \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.040767 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-scripts\") pod \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\" (UID: \"1228931b-c0ef-49b3-8f30-c6ffcddca7f6\") " Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.044030 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1228931b-c0ef-49b3-8f30-c6ffcddca7f6" (UID: "1228931b-c0ef-49b3-8f30-c6ffcddca7f6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.044399 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1228931b-c0ef-49b3-8f30-c6ffcddca7f6" (UID: "1228931b-c0ef-49b3-8f30-c6ffcddca7f6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.045808 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-scripts" (OuterVolumeSpecName: "scripts") pod "1228931b-c0ef-49b3-8f30-c6ffcddca7f6" (UID: "1228931b-c0ef-49b3-8f30-c6ffcddca7f6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.050374 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-kube-api-access-qplnd" (OuterVolumeSpecName: "kube-api-access-qplnd") pod "1228931b-c0ef-49b3-8f30-c6ffcddca7f6" (UID: "1228931b-c0ef-49b3-8f30-c6ffcddca7f6"). InnerVolumeSpecName "kube-api-access-qplnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.078405 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-config-data" (OuterVolumeSpecName: "config-data") pod "1228931b-c0ef-49b3-8f30-c6ffcddca7f6" (UID: "1228931b-c0ef-49b3-8f30-c6ffcddca7f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.080137 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1228931b-c0ef-49b3-8f30-c6ffcddca7f6" (UID: "1228931b-c0ef-49b3-8f30-c6ffcddca7f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.143601 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.143635 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qplnd\" (UniqueName: \"kubernetes.io/projected/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-kube-api-access-qplnd\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.143649 4756 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.143661 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.143675 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.143686 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1228931b-c0ef-49b3-8f30-c6ffcddca7f6-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.488370 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t47bv" event={"ID":"1228931b-c0ef-49b3-8f30-c6ffcddca7f6","Type":"ContainerDied","Data":"1250fc150fa7606f5d61455ed91f11477b17854feb82a10bea6b10e988bd0417"} Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.488805 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1250fc150fa7606f5d61455ed91f11477b17854feb82a10bea6b10e988bd0417" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 
17:47:41.488973 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t47bv" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.605070 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-667b884ddd-gsppx"] Dec 05 17:47:41 crc kubenswrapper[4756]: E1205 17:47:41.605408 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a87cfcd-63b6-4ef6-841e-a17de8443742" containerName="dnsmasq-dns" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.605451 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a87cfcd-63b6-4ef6-841e-a17de8443742" containerName="dnsmasq-dns" Dec 05 17:47:41 crc kubenswrapper[4756]: E1205 17:47:41.605468 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1228931b-c0ef-49b3-8f30-c6ffcddca7f6" containerName="keystone-bootstrap" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.605476 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1228931b-c0ef-49b3-8f30-c6ffcddca7f6" containerName="keystone-bootstrap" Dec 05 17:47:41 crc kubenswrapper[4756]: E1205 17:47:41.605498 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a87cfcd-63b6-4ef6-841e-a17de8443742" containerName="init" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.605508 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a87cfcd-63b6-4ef6-841e-a17de8443742" containerName="init" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.605697 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1228931b-c0ef-49b3-8f30-c6ffcddca7f6" containerName="keystone-bootstrap" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.605718 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a87cfcd-63b6-4ef6-841e-a17de8443742" containerName="dnsmasq-dns" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.606335 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.608413 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.609193 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.613376 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.613554 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.614540 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d86lz" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.624121 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.629513 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-667b884ddd-gsppx"] Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.654019 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-credential-keys\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.654146 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-scripts\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.654199 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-internal-tls-certs\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.654242 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-public-tls-certs\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.654283 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-combined-ca-bundle\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.654342 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88vc9\" (UniqueName: \"kubernetes.io/projected/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-kube-api-access-88vc9\") pod \"keystone-667b884ddd-gsppx\" 
(UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.654381 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-config-data\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.654409 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-fernet-keys\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.755986 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-credential-keys\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.756076 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-scripts\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.756115 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-internal-tls-certs\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.756145 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-public-tls-certs\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.756176 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-combined-ca-bundle\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.756220 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88vc9\" (UniqueName: \"kubernetes.io/projected/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-kube-api-access-88vc9\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.756249 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-config-data\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " 
pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.756269 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-fernet-keys\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.759664 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-fernet-keys\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.761039 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-combined-ca-bundle\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.761186 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-public-tls-certs\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.761533 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-credential-keys\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.767350 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-config-data\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.767959 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-internal-tls-certs\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.774237 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-scripts\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.774826 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88vc9\" (UniqueName: \"kubernetes.io/projected/14f3c83c-c658-4f94-8ec6-dfd4916be5c1-kube-api-access-88vc9\") pod \"keystone-667b884ddd-gsppx\" (UID: \"14f3c83c-c658-4f94-8ec6-dfd4916be5c1\") " pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:41 crc kubenswrapper[4756]: I1205 17:47:41.934783 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:42 crc kubenswrapper[4756]: I1205 17:47:42.224809 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-667b884ddd-gsppx"] Dec 05 17:47:42 crc kubenswrapper[4756]: W1205 17:47:42.231093 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14f3c83c_c658_4f94_8ec6_dfd4916be5c1.slice/crio-8873205f9408055bac8725b4938e21754d35a416a89ed045c47bac70c70fb193 WatchSource:0}: Error finding container 8873205f9408055bac8725b4938e21754d35a416a89ed045c47bac70c70fb193: Status 404 returned error can't find the container with id 8873205f9408055bac8725b4938e21754d35a416a89ed045c47bac70c70fb193 Dec 05 17:47:42 crc kubenswrapper[4756]: I1205 17:47:42.499900 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-667b884ddd-gsppx" event={"ID":"14f3c83c-c658-4f94-8ec6-dfd4916be5c1","Type":"ContainerStarted","Data":"9ef4628e283fd95fce0b3f2582dfe7e4708ff8f77429c542f10803cf105e2e36"} Dec 05 17:47:42 crc kubenswrapper[4756]: I1205 17:47:42.499958 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-667b884ddd-gsppx" event={"ID":"14f3c83c-c658-4f94-8ec6-dfd4916be5c1","Type":"ContainerStarted","Data":"8873205f9408055bac8725b4938e21754d35a416a89ed045c47bac70c70fb193"} Dec 05 17:47:42 crc kubenswrapper[4756]: I1205 17:47:42.500072 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:47:42 crc kubenswrapper[4756]: I1205 17:47:42.533866 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-667b884ddd-gsppx" podStartSLOduration=1.533843142 podStartE2EDuration="1.533843142s" podCreationTimestamp="2025-12-05 17:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:47:42.526861147 +0000 UTC m=+5522.765534699" watchObservedRunningTime="2025-12-05 17:47:42.533843142 +0000 UTC m=+5522.772516664" Dec 05 17:47:59 crc kubenswrapper[4756]: I1205 17:47:59.156655 4756 scope.go:117] "RemoveContainer" containerID="a5257ada9706b9ac3881b750ad36a18fbe7581f35160a25fe036a60436880e5c" Dec 05 17:47:59 crc kubenswrapper[4756]: I1205 17:47:59.213091 4756 scope.go:117] "RemoveContainer" containerID="43071d64150832e1f7257f592891f10ccaa44dc7605cf77953ae6f6e489e3c57" Dec 05 17:48:13 crc kubenswrapper[4756]: I1205 17:48:13.420156 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-667b884ddd-gsppx" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.274193 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.280747 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.285445 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.285854 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-q4sqp" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.288743 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.290744 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.375157 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config-secret\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.375225 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htb22\" (UniqueName: \"kubernetes.io/projected/b5d5300b-1878-4745-9d66-eb7bfab3c266-kube-api-access-htb22\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.375386 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.375448 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.388501 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 05 17:48:17 crc kubenswrapper[4756]: E1205 17:48:17.389235 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-htb22 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="b5d5300b-1878-4745-9d66-eb7bfab3c266" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.400177 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.413899 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.415075 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.422893 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.484390 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config-secret\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.484468 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htb22\" (UniqueName: \"kubernetes.io/projected/b5d5300b-1878-4745-9d66-eb7bfab3c266-kube-api-access-htb22\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.484531 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.484590 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config-secret\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.484640 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrgmp\" (UniqueName: \"kubernetes.io/projected/cc3702a4-9b2f-46b4-95d3-c0c291132199-kube-api-access-nrgmp\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.484686 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.484718 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.484832 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.485824 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config\") pod \"openstackclient\" (UID: 
\"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: E1205 17:48:17.486565 4756 projected.go:194] Error preparing data for projected volume kube-api-access-htb22 for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b5d5300b-1878-4745-9d66-eb7bfab3c266) does not match the UID in record. The object might have been deleted and then recreated Dec 05 17:48:17 crc kubenswrapper[4756]: E1205 17:48:17.486644 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d5300b-1878-4745-9d66-eb7bfab3c266-kube-api-access-htb22 podName:b5d5300b-1878-4745-9d66-eb7bfab3c266 nodeName:}" failed. No retries permitted until 2025-12-05 17:48:17.986612842 +0000 UTC m=+5558.225286364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-htb22" (UniqueName: "kubernetes.io/projected/b5d5300b-1878-4745-9d66-eb7bfab3c266-kube-api-access-htb22") pod "openstackclient" (UID: "b5d5300b-1878-4745-9d66-eb7bfab3c266") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b5d5300b-1878-4745-9d66-eb7bfab3c266) does not match the UID in record. The object might have been deleted and then recreated Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.493143 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config-secret\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.498137 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.585796 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.585897 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config-secret\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.585941 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrgmp\" (UniqueName: \"kubernetes.io/projected/cc3702a4-9b2f-46b4-95d3-c0c291132199-kube-api-access-nrgmp\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.585996 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config\") pod \"openstackclient\" (UID: 
\"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.588710 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.594105 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config-secret\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.594266 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.609013 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrgmp\" (UniqueName: \"kubernetes.io/projected/cc3702a4-9b2f-46b4-95d3-c0c291132199-kube-api-access-nrgmp\") pod \"openstackclient\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") " pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.735578 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.893291 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.900295 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b5d5300b-1878-4745-9d66-eb7bfab3c266" podUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.902876 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.997926 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config\") pod \"b5d5300b-1878-4745-9d66-eb7bfab3c266\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.998109 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config-secret\") pod \"b5d5300b-1878-4745-9d66-eb7bfab3c266\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.998174 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-combined-ca-bundle\") pod \"b5d5300b-1878-4745-9d66-eb7bfab3c266\" (UID: \"b5d5300b-1878-4745-9d66-eb7bfab3c266\") " Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.998366 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b5d5300b-1878-4745-9d66-eb7bfab3c266" (UID: "b5d5300b-1878-4745-9d66-eb7bfab3c266"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.998613 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:48:17 crc kubenswrapper[4756]: I1205 17:48:17.998626 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htb22\" (UniqueName: \"kubernetes.io/projected/b5d5300b-1878-4745-9d66-eb7bfab3c266-kube-api-access-htb22\") on node \"crc\" DevicePath \"\"" Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.003577 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5d5300b-1878-4745-9d66-eb7bfab3c266" (UID: "b5d5300b-1878-4745-9d66-eb7bfab3c266"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.003633 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b5d5300b-1878-4745-9d66-eb7bfab3c266" (UID: "b5d5300b-1878-4745-9d66-eb7bfab3c266"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.062226 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d5300b-1878-4745-9d66-eb7bfab3c266" path="/var/lib/kubelet/pods/b5d5300b-1878-4745-9d66-eb7bfab3c266/volumes" Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.100362 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.100390 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5d5300b-1878-4745-9d66-eb7bfab3c266-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.213827 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.907163 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cc3702a4-9b2f-46b4-95d3-c0c291132199","Type":"ContainerStarted","Data":"4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9"} Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.907712 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cc3702a4-9b2f-46b4-95d3-c0c291132199","Type":"ContainerStarted","Data":"598c9a3b8817ac86fddec2ff119d32334f28774e48080d80a5a384423e20df07"} Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.907208 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.941056 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b5d5300b-1878-4745-9d66-eb7bfab3c266" podUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" Dec 05 17:48:18 crc kubenswrapper[4756]: I1205 17:48:18.947588 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.947552839 podStartE2EDuration="1.947552839s" podCreationTimestamp="2025-12-05 17:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:48:18.935487768 +0000 UTC m=+5559.174161350" watchObservedRunningTime="2025-12-05 17:48:18.947552839 +0000 UTC m=+5559.186226431" Dec 05 17:48:52 crc kubenswrapper[4756]: I1205 17:48:52.295623 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:48:52 crc kubenswrapper[4756]: I1205 17:48:52.296777 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:48:57 crc kubenswrapper[4756]: E1205 17:48:57.872154 4756 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.189:46560->38.102.83.189:33157: 
read tcp 38.102.83.189:46560->38.102.83.189:33157: read: connection reset by peer
Dec 05 17:48:59 crc kubenswrapper[4756]: I1205 17:48:59.396872 4756 scope.go:117] "RemoveContainer" containerID="142a5da11e0eb0e54b9412c5e504f31d6cdc1511a6de841c4168bd18837e9f9b"
Dec 05 17:48:59 crc kubenswrapper[4756]: I1205 17:48:59.437793 4756 scope.go:117] "RemoveContainer" containerID="6ec4bbc0ea895b014c2cc25bc26cae3caad2fd43e852bec1c7a401ede64188b6"
Dec 05 17:48:59 crc kubenswrapper[4756]: I1205 17:48:59.498925 4756 scope.go:117] "RemoveContainer" containerID="9f4d2f47b98a59cb595396df6af312e4fdd7fe8112acc149c4f16bc9c161840d"
Dec 05 17:49:22 crc kubenswrapper[4756]: I1205 17:49:22.295835 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 17:49:22 crc kubenswrapper[4756]: I1205 17:49:22.296388 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 17:49:40 crc kubenswrapper[4756]: E1205 17:49:40.665911 4756 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.189:52686->38.102.83.189:33157: read tcp 38.102.83.189:52686->38.102.83.189:33157: read: connection reset by peer
Dec 05 17:49:52 crc kubenswrapper[4756]: I1205 17:49:52.295852 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 17:49:52 crc kubenswrapper[4756]: I1205 17:49:52.296509 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 17:49:52 crc kubenswrapper[4756]: I1205 17:49:52.296571 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 17:49:52 crc kubenswrapper[4756]: I1205 17:49:52.297341 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b5a9908bca7e41a64a674dbf6cbb63be6b207127fbfed3e63c4b2167cea9c0c"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 17:49:52 crc kubenswrapper[4756]: I1205 17:49:52.297431 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://6b5a9908bca7e41a64a674dbf6cbb63be6b207127fbfed3e63c4b2167cea9c0c" gracePeriod=600
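
The probe entries above show the kubelet polling http://127.0.0.1:8798/health on the machine-config-daemon container, getting connection refused at 17:48:52, 17:49:22, and 17:49:52, and only killing the container (gracePeriod=600) after the third consecutive failure. A minimal Go sketch of a probe definition consistent with that behavior; the host, port, and path come from the probe output, while PeriodSeconds and FailureThreshold are inferred from the 30-second failure cadence and are assumptions, not values read from the pod spec:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        probe := &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Host: "127.0.0.1",
                    Path: "/health",
                    Port: intstr.FromInt(8798),
                },
            },
            PeriodSeconds:    30, // assumed: failures in the log arrive 30s apart
            FailureThreshold: 3,  // assumed: the restart fires on the third failure
        }
        fmt.Printf("GET http://%s:%s%s\n",
            probe.HTTPGet.Host, probe.HTTPGet.Port.String(), probe.HTTPGet.Path)
    }

With FailureThreshold 3, a single refused connection is tolerated; the kill-and-restart at 17:49:52 is exactly the point where the threshold is exhausted.
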
podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="6b5a9908bca7e41a64a674dbf6cbb63be6b207127fbfed3e63c4b2167cea9c0c" exitCode=0 Dec 05 17:49:53 crc kubenswrapper[4756]: I1205 17:49:53.037332 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"6b5a9908bca7e41a64a674dbf6cbb63be6b207127fbfed3e63c4b2167cea9c0c"} Dec 05 17:49:53 crc kubenswrapper[4756]: I1205 17:49:53.037680 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a"} Dec 05 17:49:53 crc kubenswrapper[4756]: I1205 17:49:53.037704 4756 scope.go:117] "RemoveContainer" containerID="13145d7e051555462bf10e555fd5c9ca6a7f1db46b7c17462e37d0086f868bf2" Dec 05 17:49:54 crc kubenswrapper[4756]: I1205 17:49:54.917702 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h84pl"] Dec 05 17:49:54 crc kubenswrapper[4756]: I1205 17:49:54.919482 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:54 crc kubenswrapper[4756]: I1205 17:49:54.925322 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bc89-account-create-update-8z7jt"] Dec 05 17:49:54 crc kubenswrapper[4756]: I1205 17:49:54.926436 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:54 crc kubenswrapper[4756]: I1205 17:49:54.928437 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 17:49:54 crc kubenswrapper[4756]: I1205 17:49:54.933712 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h84pl"] Dec 05 17:49:54 crc kubenswrapper[4756]: I1205 17:49:54.985010 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bc89-account-create-update-8z7jt"] Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.027194 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea60fe5-2e89-4d87-966b-7be9872f939f-operator-scripts\") pod \"barbican-db-create-h84pl\" (UID: \"7ea60fe5-2e89-4d87-966b-7be9872f939f\") " pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.027249 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82pnw\" (UniqueName: \"kubernetes.io/projected/7ea60fe5-2e89-4d87-966b-7be9872f939f-kube-api-access-82pnw\") pod \"barbican-db-create-h84pl\" (UID: \"7ea60fe5-2e89-4d87-966b-7be9872f939f\") " pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.027393 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-operator-scripts\") pod \"barbican-bc89-account-create-update-8z7jt\" (UID: \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\") " pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.027446 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65x4\" (UniqueName: \"kubernetes.io/projected/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-kube-api-access-f65x4\") pod \"barbican-bc89-account-create-update-8z7jt\" (UID: \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\") " pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.129134 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea60fe5-2e89-4d87-966b-7be9872f939f-operator-scripts\") pod \"barbican-db-create-h84pl\" (UID: \"7ea60fe5-2e89-4d87-966b-7be9872f939f\") " pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.129496 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82pnw\" (UniqueName: \"kubernetes.io/projected/7ea60fe5-2e89-4d87-966b-7be9872f939f-kube-api-access-82pnw\") pod \"barbican-db-create-h84pl\" (UID: \"7ea60fe5-2e89-4d87-966b-7be9872f939f\") " pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.129645 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-operator-scripts\") pod \"barbican-bc89-account-create-update-8z7jt\" (UID: \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\") " pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.129777 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f65x4\" (UniqueName: \"kubernetes.io/projected/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-kube-api-access-f65x4\") pod \"barbican-bc89-account-create-update-8z7jt\" (UID: \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\") " pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.130010 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea60fe5-2e89-4d87-966b-7be9872f939f-operator-scripts\") pod \"barbican-db-create-h84pl\" (UID: \"7ea60fe5-2e89-4d87-966b-7be9872f939f\") " pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.130595 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-operator-scripts\") pod \"barbican-bc89-account-create-update-8z7jt\" (UID: \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\") " pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.157538 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82pnw\" (UniqueName: \"kubernetes.io/projected/7ea60fe5-2e89-4d87-966b-7be9872f939f-kube-api-access-82pnw\") pod \"barbican-db-create-h84pl\" (UID: \"7ea60fe5-2e89-4d87-966b-7be9872f939f\") " pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.159773 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65x4\" (UniqueName: \"kubernetes.io/projected/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-kube-api-access-f65x4\") pod \"barbican-bc89-account-create-update-8z7jt\" (UID: \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\") " 
pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.280236 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.301227 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.781119 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h84pl"] Dec 05 17:49:55 crc kubenswrapper[4756]: I1205 17:49:55.788523 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bc89-account-create-update-8z7jt"] Dec 05 17:49:55 crc kubenswrapper[4756]: W1205 17:49:55.794336 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ea60fe5_2e89_4d87_966b_7be9872f939f.slice/crio-b5e87340248234c6cc60abe1ff4a819bd840f074489369d65d423289d2f98f2e WatchSource:0}: Error finding container b5e87340248234c6cc60abe1ff4a819bd840f074489369d65d423289d2f98f2e: Status 404 returned error can't find the container with id b5e87340248234c6cc60abe1ff4a819bd840f074489369d65d423289d2f98f2e Dec 05 17:49:56 crc kubenswrapper[4756]: I1205 17:49:56.088545 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h84pl" event={"ID":"7ea60fe5-2e89-4d87-966b-7be9872f939f","Type":"ContainerStarted","Data":"6fbdb3100de011c82dc4e734e4b5e80fbdb5413f975d8e8f31cb81095c6bf752"} Dec 05 17:49:56 crc kubenswrapper[4756]: I1205 17:49:56.088818 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h84pl" event={"ID":"7ea60fe5-2e89-4d87-966b-7be9872f939f","Type":"ContainerStarted","Data":"b5e87340248234c6cc60abe1ff4a819bd840f074489369d65d423289d2f98f2e"} Dec 05 17:49:56 crc kubenswrapper[4756]: I1205 17:49:56.092443 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bc89-account-create-update-8z7jt" event={"ID":"5f212d8d-36fd-4499-9cd4-08c17b5e39e9","Type":"ContainerStarted","Data":"2b8a3d5580211892e58592ee0b5e5166aadb7b984aeacbda154d78deef1e08d0"} Dec 05 17:49:56 crc kubenswrapper[4756]: I1205 17:49:56.092481 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bc89-account-create-update-8z7jt" event={"ID":"5f212d8d-36fd-4499-9cd4-08c17b5e39e9","Type":"ContainerStarted","Data":"1eb6e112ef9bbcace567ac30fa10aa83c3940b6fdf6868d0516908f3de753ae1"} Dec 05 17:49:56 crc kubenswrapper[4756]: I1205 17:49:56.115951 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-h84pl" podStartSLOduration=2.115928531 podStartE2EDuration="2.115928531s" podCreationTimestamp="2025-12-05 17:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:49:56.114336989 +0000 UTC m=+5656.353010521" watchObservedRunningTime="2025-12-05 17:49:56.115928531 +0000 UTC m=+5656.354602053" Dec 05 17:49:56 crc kubenswrapper[4756]: I1205 17:49:56.134730 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-bc89-account-create-update-8z7jt" podStartSLOduration=2.134712152 podStartE2EDuration="2.134712152s" podCreationTimestamp="2025-12-05 17:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:49:56.132317378 +0000 UTC m=+5656.370990890" watchObservedRunningTime="2025-12-05 17:49:56.134712152 +0000 UTC m=+5656.373385654" Dec 05 17:49:57 crc kubenswrapper[4756]: I1205 17:49:57.103500 4756 generic.go:334] "Generic (PLEG): container finished" podID="7ea60fe5-2e89-4d87-966b-7be9872f939f" containerID="6fbdb3100de011c82dc4e734e4b5e80fbdb5413f975d8e8f31cb81095c6bf752" exitCode=0 Dec 05 17:49:57 crc kubenswrapper[4756]: I1205 17:49:57.103904 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h84pl" event={"ID":"7ea60fe5-2e89-4d87-966b-7be9872f939f","Type":"ContainerDied","Data":"6fbdb3100de011c82dc4e734e4b5e80fbdb5413f975d8e8f31cb81095c6bf752"} Dec 05 17:49:57 crc kubenswrapper[4756]: I1205 17:49:57.106898 4756 generic.go:334] "Generic (PLEG): container finished" podID="5f212d8d-36fd-4499-9cd4-08c17b5e39e9" containerID="2b8a3d5580211892e58592ee0b5e5166aadb7b984aeacbda154d78deef1e08d0" exitCode=0 Dec 05 17:49:57 crc kubenswrapper[4756]: I1205 17:49:57.107001 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bc89-account-create-update-8z7jt" event={"ID":"5f212d8d-36fd-4499-9cd4-08c17b5e39e9","Type":"ContainerDied","Data":"2b8a3d5580211892e58592ee0b5e5166aadb7b984aeacbda154d78deef1e08d0"} Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.406471 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pvnkb"] Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.408408 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.422910 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvnkb"] Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.491510 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-utilities\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.491654 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-catalog-content\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.491703 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ht5d\" (UniqueName: \"kubernetes.io/projected/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-kube-api-access-8ht5d\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.593257 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-utilities\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.593579 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-catalog-content\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.593678 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ht5d\" (UniqueName: \"kubernetes.io/projected/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-kube-api-access-8ht5d\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.593817 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-utilities\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.594524 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-catalog-content\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.612096 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ht5d\" (UniqueName: \"kubernetes.io/projected/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-kube-api-access-8ht5d\") pod \"redhat-operators-pvnkb\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") " pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.619844 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.652767 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.694730 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-operator-scripts\") pod \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\" (UID: \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\") " Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.694769 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f65x4\" (UniqueName: \"kubernetes.io/projected/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-kube-api-access-f65x4\") pod \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\" (UID: \"5f212d8d-36fd-4499-9cd4-08c17b5e39e9\") " Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.695165 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f212d8d-36fd-4499-9cd4-08c17b5e39e9" (UID: "5f212d8d-36fd-4499-9cd4-08c17b5e39e9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.697897 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-kube-api-access-f65x4" (OuterVolumeSpecName: "kube-api-access-f65x4") pod "5f212d8d-36fd-4499-9cd4-08c17b5e39e9" (UID: "5f212d8d-36fd-4499-9cd4-08c17b5e39e9"). InnerVolumeSpecName "kube-api-access-f65x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.760074 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.796651 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82pnw\" (UniqueName: \"kubernetes.io/projected/7ea60fe5-2e89-4d87-966b-7be9872f939f-kube-api-access-82pnw\") pod \"7ea60fe5-2e89-4d87-966b-7be9872f939f\" (UID: \"7ea60fe5-2e89-4d87-966b-7be9872f939f\") " Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.796811 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea60fe5-2e89-4d87-966b-7be9872f939f-operator-scripts\") pod \"7ea60fe5-2e89-4d87-966b-7be9872f939f\" (UID: \"7ea60fe5-2e89-4d87-966b-7be9872f939f\") " Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.797189 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.797213 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f65x4\" (UniqueName: \"kubernetes.io/projected/5f212d8d-36fd-4499-9cd4-08c17b5e39e9-kube-api-access-f65x4\") on node \"crc\" DevicePath \"\"" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.797262 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ea60fe5-2e89-4d87-966b-7be9872f939f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ea60fe5-2e89-4d87-966b-7be9872f939f" (UID: "7ea60fe5-2e89-4d87-966b-7be9872f939f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.800601 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea60fe5-2e89-4d87-966b-7be9872f939f-kube-api-access-82pnw" (OuterVolumeSpecName: "kube-api-access-82pnw") pod "7ea60fe5-2e89-4d87-966b-7be9872f939f" (UID: "7ea60fe5-2e89-4d87-966b-7be9872f939f"). InnerVolumeSpecName "kube-api-access-82pnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.898949 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82pnw\" (UniqueName: \"kubernetes.io/projected/7ea60fe5-2e89-4d87-966b-7be9872f939f-kube-api-access-82pnw\") on node \"crc\" DevicePath \"\"" Dec 05 17:49:58 crc kubenswrapper[4756]: I1205 17:49:58.898983 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ea60fe5-2e89-4d87-966b-7be9872f939f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:49:59 crc kubenswrapper[4756]: I1205 17:49:59.135317 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h84pl" Dec 05 17:49:59 crc kubenswrapper[4756]: I1205 17:49:59.135458 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h84pl" event={"ID":"7ea60fe5-2e89-4d87-966b-7be9872f939f","Type":"ContainerDied","Data":"b5e87340248234c6cc60abe1ff4a819bd840f074489369d65d423289d2f98f2e"} Dec 05 17:49:59 crc kubenswrapper[4756]: I1205 17:49:59.136185 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e87340248234c6cc60abe1ff4a819bd840f074489369d65d423289d2f98f2e" Dec 05 17:49:59 crc kubenswrapper[4756]: I1205 17:49:59.137668 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bc89-account-create-update-8z7jt" event={"ID":"5f212d8d-36fd-4499-9cd4-08c17b5e39e9","Type":"ContainerDied","Data":"1eb6e112ef9bbcace567ac30fa10aa83c3940b6fdf6868d0516908f3de753ae1"} Dec 05 17:49:59 crc kubenswrapper[4756]: I1205 17:49:59.137718 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eb6e112ef9bbcace567ac30fa10aa83c3940b6fdf6868d0516908f3de753ae1" Dec 05 17:49:59 crc kubenswrapper[4756]: I1205 17:49:59.137719 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bc89-account-create-update-8z7jt" Dec 05 17:49:59 crc kubenswrapper[4756]: I1205 17:49:59.193802 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvnkb"] Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.148552 4756 generic.go:334] "Generic (PLEG): container finished" podID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerID="f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68" exitCode=0 Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.148947 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvnkb" event={"ID":"723c11a6-c5d7-48d1-b97d-eb92ff0609b6","Type":"ContainerDied","Data":"f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68"} Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.149049 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvnkb" event={"ID":"723c11a6-c5d7-48d1-b97d-eb92ff0609b6","Type":"ContainerStarted","Data":"1a57380df259d3bb6cad54c177838d6bfbc15cb30d6d68d546b59421ab5ea61f"} Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.204213 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-x4nlc"] Dec 05 17:50:00 crc kubenswrapper[4756]: E1205 17:50:00.204625 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f212d8d-36fd-4499-9cd4-08c17b5e39e9" containerName="mariadb-account-create-update" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.204647 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f212d8d-36fd-4499-9cd4-08c17b5e39e9" containerName="mariadb-account-create-update" Dec 05 17:50:00 crc kubenswrapper[4756]: E1205 17:50:00.204680 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea60fe5-2e89-4d87-966b-7be9872f939f" containerName="mariadb-database-create" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.204691 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea60fe5-2e89-4d87-966b-7be9872f939f" containerName="mariadb-database-create" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.204964 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea60fe5-2e89-4d87-966b-7be9872f939f" containerName="mariadb-database-create" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.205010 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f212d8d-36fd-4499-9cd4-08c17b5e39e9" containerName="mariadb-account-create-update" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.205710 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.208734 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.209060 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-prxbl" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.213875 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x4nlc"] Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.328517 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdhp4\" (UniqueName: \"kubernetes.io/projected/aef51efd-9b84-4528-9f7e-4703c6bc31dd-kube-api-access-tdhp4\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.328609 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-db-sync-config-data\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.328985 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-combined-ca-bundle\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.431211 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-db-sync-config-data\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.431762 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-combined-ca-bundle\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.431988 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdhp4\" (UniqueName: \"kubernetes.io/projected/aef51efd-9b84-4528-9f7e-4703c6bc31dd-kube-api-access-tdhp4\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.458048 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdhp4\" (UniqueName: \"kubernetes.io/projected/aef51efd-9b84-4528-9f7e-4703c6bc31dd-kube-api-access-tdhp4\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.469400 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-combined-ca-bundle\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.470056 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-db-sync-config-data\") pod \"barbican-db-sync-x4nlc\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:00 crc kubenswrapper[4756]: I1205 17:50:00.543746 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:01 crc kubenswrapper[4756]: W1205 17:50:01.057679 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaef51efd_9b84_4528_9f7e_4703c6bc31dd.slice/crio-f7d1fd060e00191195ed50120745333ea3de438182114d239d532eeaf8362f71 WatchSource:0}: Error finding container f7d1fd060e00191195ed50120745333ea3de438182114d239d532eeaf8362f71: Status 404 returned error can't find the container with id f7d1fd060e00191195ed50120745333ea3de438182114d239d532eeaf8362f71 Dec 05 17:50:01 crc kubenswrapper[4756]: I1205 17:50:01.059070 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x4nlc"] Dec 05 17:50:01 crc kubenswrapper[4756]: I1205 17:50:01.159868 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvnkb" event={"ID":"723c11a6-c5d7-48d1-b97d-eb92ff0609b6","Type":"ContainerStarted","Data":"4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2"} Dec 05 17:50:01 crc kubenswrapper[4756]: I1205 17:50:01.161080 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x4nlc" event={"ID":"aef51efd-9b84-4528-9f7e-4703c6bc31dd","Type":"ContainerStarted","Data":"f7d1fd060e00191195ed50120745333ea3de438182114d239d532eeaf8362f71"} Dec 05 17:50:02 crc kubenswrapper[4756]: I1205 17:50:02.173486 4756 generic.go:334] "Generic (PLEG): container finished" podID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerID="4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2" exitCode=0 Dec 05 17:50:02 crc kubenswrapper[4756]: I1205 17:50:02.173759 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvnkb" event={"ID":"723c11a6-c5d7-48d1-b97d-eb92ff0609b6","Type":"ContainerDied","Data":"4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2"} Dec 05 17:50:02 crc kubenswrapper[4756]: I1205 17:50:02.175649 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x4nlc" event={"ID":"aef51efd-9b84-4528-9f7e-4703c6bc31dd","Type":"ContainerStarted","Data":"5452f98c7a9727302f3cc6894c9ba60c35f54cd1d76d818abb93644227e53b66"} Dec 05 17:50:02 crc kubenswrapper[4756]: I1205 17:50:02.235967 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-x4nlc" podStartSLOduration=2.235942348 podStartE2EDuration="2.235942348s" podCreationTimestamp="2025-12-05 17:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:50:02.228494429 +0000 UTC m=+5662.467167951" watchObservedRunningTime="2025-12-05 17:50:02.235942348 +0000 UTC m=+5662.474615880" Dec 05 17:50:03 crc 
kubenswrapper[4756]: I1205 17:50:03.184753 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvnkb" event={"ID":"723c11a6-c5d7-48d1-b97d-eb92ff0609b6","Type":"ContainerStarted","Data":"8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1"} Dec 05 17:50:04 crc kubenswrapper[4756]: I1205 17:50:04.192424 4756 generic.go:334] "Generic (PLEG): container finished" podID="aef51efd-9b84-4528-9f7e-4703c6bc31dd" containerID="5452f98c7a9727302f3cc6894c9ba60c35f54cd1d76d818abb93644227e53b66" exitCode=0 Dec 05 17:50:04 crc kubenswrapper[4756]: I1205 17:50:04.192455 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x4nlc" event={"ID":"aef51efd-9b84-4528-9f7e-4703c6bc31dd","Type":"ContainerDied","Data":"5452f98c7a9727302f3cc6894c9ba60c35f54cd1d76d818abb93644227e53b66"} Dec 05 17:50:04 crc kubenswrapper[4756]: I1205 17:50:04.208926 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pvnkb" podStartSLOduration=3.78213988 podStartE2EDuration="6.208905198s" podCreationTimestamp="2025-12-05 17:49:58 +0000 UTC" firstStartedPulling="2025-12-05 17:50:00.151082008 +0000 UTC m=+5660.389755520" lastFinishedPulling="2025-12-05 17:50:02.577847286 +0000 UTC m=+5662.816520838" observedRunningTime="2025-12-05 17:50:03.211881237 +0000 UTC m=+5663.450554759" watchObservedRunningTime="2025-12-05 17:50:04.208905198 +0000 UTC m=+5664.447578710" Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.584823 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.730552 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-db-sync-config-data\") pod \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.730672 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-combined-ca-bundle\") pod \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.730784 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdhp4\" (UniqueName: \"kubernetes.io/projected/aef51efd-9b84-4528-9f7e-4703c6bc31dd-kube-api-access-tdhp4\") pod \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\" (UID: \"aef51efd-9b84-4528-9f7e-4703c6bc31dd\") " Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.738245 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aef51efd-9b84-4528-9f7e-4703c6bc31dd" (UID: "aef51efd-9b84-4528-9f7e-4703c6bc31dd"). InnerVolumeSpecName "db-sync-config-data". 
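
barbican-db-sync-x4nlc is a one-shot database-migration pod: its container (5452f98c…) starts, runs to completion with exitCode=0, and the kubelet immediately begins tearing the pod down. The PLEG events here are kubelet-internal, but the same transitions surface in pod status; a minimal client-go watch, assuming a kubeconfig in $KUBECONFIG, with the namespace and pod name taken from the log:

package main

import (
	"context"
	"fmt"
	"os"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Watch the db-sync pod and print container state transitions — the
	// external analogue of the PLEG ContainerStarted/ContainerDied events.
	w, err := cs.CoreV1().Pods("openstack").Watch(context.Background(), metav1.ListOptions{
		FieldSelector: "metadata.name=barbican-db-sync-x4nlc",
	})
	if err != nil {
		panic(err)
	}
	for ev := range w.ResultChan() {
		pod, ok := ev.Object.(*corev1.Pod)
		if !ok {
			continue
		}
		for _, st := range pod.Status.ContainerStatuses {
			switch {
			case st.State.Running != nil:
				fmt.Printf("%s: running since %s\n", st.Name, st.State.Running.StartedAt)
			case st.State.Terminated != nil:
				fmt.Printf("%s: exited with code %d\n", st.Name, st.State.Terminated.ExitCode)
			}
		}
	}
}
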
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.739330 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef51efd-9b84-4528-9f7e-4703c6bc31dd-kube-api-access-tdhp4" (OuterVolumeSpecName: "kube-api-access-tdhp4") pod "aef51efd-9b84-4528-9f7e-4703c6bc31dd" (UID: "aef51efd-9b84-4528-9f7e-4703c6bc31dd"). InnerVolumeSpecName "kube-api-access-tdhp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.779383 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aef51efd-9b84-4528-9f7e-4703c6bc31dd" (UID: "aef51efd-9b84-4528-9f7e-4703c6bc31dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.832500 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.832534 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef51efd-9b84-4528-9f7e-4703c6bc31dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:05 crc kubenswrapper[4756]: I1205 17:50:05.832546 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdhp4\" (UniqueName: \"kubernetes.io/projected/aef51efd-9b84-4528-9f7e-4703c6bc31dd-kube-api-access-tdhp4\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.212289 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x4nlc" event={"ID":"aef51efd-9b84-4528-9f7e-4703c6bc31dd","Type":"ContainerDied","Data":"f7d1fd060e00191195ed50120745333ea3de438182114d239d532eeaf8362f71"} Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.212326 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d1fd060e00191195ed50120745333ea3de438182114d239d532eeaf8362f71" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.212338 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x4nlc" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.844300 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-746459ccc9-jrtlj"] Dec 05 17:50:06 crc kubenswrapper[4756]: E1205 17:50:06.844865 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef51efd-9b84-4528-9f7e-4703c6bc31dd" containerName="barbican-db-sync" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.844879 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef51efd-9b84-4528-9f7e-4703c6bc31dd" containerName="barbican-db-sync" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.845045 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef51efd-9b84-4528-9f7e-4703c6bc31dd" containerName="barbican-db-sync" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.862626 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.867228 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.867789 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-prxbl" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.870960 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.904515 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6c694485c4-5xm56"] Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.906857 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.913581 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.929594 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c694485c4-5xm56"] Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.957644 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-746459ccc9-jrtlj"] Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.975861 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-config-data-custom\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.975939 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2add0aa-4a0d-464f-8ce0-c51a983392bd-logs\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.975974 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmtpj\" (UniqueName: \"kubernetes.io/projected/93adb284-0238-4295-b819-3d51b171ee6a-kube-api-access-xmtpj\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.975995 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-config-data\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.976023 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-combined-ca-bundle\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: 
\"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.976039 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-config-data-custom\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.976055 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl8px\" (UniqueName: \"kubernetes.io/projected/c2add0aa-4a0d-464f-8ce0-c51a983392bd-kube-api-access-pl8px\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.976073 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-config-data\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.976112 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93adb284-0238-4295-b819-3d51b171ee6a-logs\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.976135 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-combined-ca-bundle\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.995067 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-576d7bc65c-rdc65"] Dec 05 17:50:06 crc kubenswrapper[4756]: I1205 17:50:06.997384 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.001472 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-576d7bc65c-rdc65"] Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.035164 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6bdb888948-v7g6x"] Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.036598 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.038973 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.056684 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bdb888948-v7g6x"] Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077099 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93adb284-0238-4295-b819-3d51b171ee6a-logs\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077152 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9kh\" (UniqueName: \"kubernetes.io/projected/096ccb27-1f9b-4301-8aeb-ae04e71503bb-kube-api-access-6c9kh\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077182 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-combined-ca-bundle\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077236 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-sb\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077264 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-config-data-custom\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077289 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-config\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077315 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2add0aa-4a0d-464f-8ce0-c51a983392bd-logs\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077338 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmtpj\" (UniqueName: \"kubernetes.io/projected/93adb284-0238-4295-b819-3d51b171ee6a-kube-api-access-xmtpj\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: 
\"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077355 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-config-data\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077375 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-nb\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077391 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-dns-svc\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077433 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-combined-ca-bundle\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077449 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-config-data-custom\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077466 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl8px\" (UniqueName: \"kubernetes.io/projected/c2add0aa-4a0d-464f-8ce0-c51a983392bd-kube-api-access-pl8px\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077486 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-config-data\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.077609 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93adb284-0238-4295-b819-3d51b171ee6a-logs\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.079098 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c2add0aa-4a0d-464f-8ce0-c51a983392bd-logs\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.084282 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-config-data\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.085838 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-combined-ca-bundle\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.088027 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-config-data-custom\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.088140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-combined-ca-bundle\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.100070 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmtpj\" (UniqueName: \"kubernetes.io/projected/93adb284-0238-4295-b819-3d51b171ee6a-kube-api-access-xmtpj\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.100329 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2add0aa-4a0d-464f-8ce0-c51a983392bd-config-data-custom\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.102566 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93adb284-0238-4295-b819-3d51b171ee6a-config-data\") pod \"barbican-worker-746459ccc9-jrtlj\" (UID: \"93adb284-0238-4295-b819-3d51b171ee6a\") " pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.111325 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl8px\" (UniqueName: \"kubernetes.io/projected/c2add0aa-4a0d-464f-8ce0-c51a983392bd-kube-api-access-pl8px\") pod \"barbican-keystone-listener-6c694485c4-5xm56\" (UID: \"c2add0aa-4a0d-464f-8ce0-c51a983392bd\") " pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189087 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c8accb5-840e-4d8f-9928-4e8fae845218-logs\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189135 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c9kh\" (UniqueName: \"kubernetes.io/projected/096ccb27-1f9b-4301-8aeb-ae04e71503bb-kube-api-access-6c9kh\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189151 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data-custom\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189226 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-sb\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189242 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whlx6\" (UniqueName: \"kubernetes.io/projected/0c8accb5-840e-4d8f-9928-4e8fae845218-kube-api-access-whlx6\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189276 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-config\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189313 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-nb\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189329 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-dns-svc\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.189372 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-combined-ca-bundle\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 
17:50:07.189389 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.190235 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-config\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.190339 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-sb\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.190860 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-nb\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.191363 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-dns-svc\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.206821 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-746459ccc9-jrtlj" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.217139 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c9kh\" (UniqueName: \"kubernetes.io/projected/096ccb27-1f9b-4301-8aeb-ae04e71503bb-kube-api-access-6c9kh\") pod \"dnsmasq-dns-576d7bc65c-rdc65\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.249847 4756 util.go:30] "No sandbox for pod can be found. 
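
The volume plugin types in this wave map directly onto corev1.VolumeSource fields: the barbican pods use Secret-backed config-data/config-data-custom/combined-ca-bundle plus an EmptyDir for logs, while dnsmasq-dns-576d7bc65c-rdc65 uses ConfigMap-backed config, dns-svc, ovsdbserver-nb and ovsdbserver-sb; the kube-api-access-* projected token volumes are injected by the API server rather than declared in the spec. A sketch of how such a spec is declared in Go — volume and secret names come from the log, but the image and mount paths are hypothetical:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	pod := &corev1.Pod{
		ObjectMeta: metav1.ObjectMeta{Name: "barbican-worker-example", Namespace: "openstack"},
		Spec: corev1.PodSpec{
			Volumes: []corev1.Volume{
				// Secret-backed, as for config-data / combined-ca-bundle above.
				{Name: "config-data", VolumeSource: corev1.VolumeSource{
					Secret: &corev1.SecretVolumeSource{SecretName: "barbican-config-data"}}},
				// Writable scratch space, as for the "logs" volumes.
				{Name: "logs", VolumeSource: corev1.VolumeSource{
					EmptyDir: &corev1.EmptyDirVolumeSource{}}},
				// ConfigMap-backed, as for dnsmasq's config / dns-svc / ovsdbserver-*.
				{Name: "dns-svc", VolumeSource: corev1.VolumeSource{
					ConfigMap: &corev1.ConfigMapVolumeSource{
						LocalObjectReference: corev1.LocalObjectReference{Name: "dns-svc"}}}},
			},
			Containers: []corev1.Container{{
				Name:  "barbican-worker",
				Image: "quay.io/example/barbican:latest", // hypothetical image reference
				VolumeMounts: []corev1.VolumeMount{
					{Name: "config-data", MountPath: "/etc/barbican", ReadOnly: true}, // assumed path
					{Name: "logs", MountPath: "/var/log/barbican"},                    // assumed path
				},
			}},
		},
	}
	fmt.Println(pod.Name, "declares", len(pod.Spec.Volumes), "volumes")
}
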
Need to start a new one" pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.291379 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-combined-ca-bundle\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.291450 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.291490 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c8accb5-840e-4d8f-9928-4e8fae845218-logs\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.291511 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data-custom\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.291565 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whlx6\" (UniqueName: \"kubernetes.io/projected/0c8accb5-840e-4d8f-9928-4e8fae845218-kube-api-access-whlx6\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.292441 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c8accb5-840e-4d8f-9928-4e8fae845218-logs\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.302308 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data-custom\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.303514 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.304558 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-combined-ca-bundle\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 
17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.314963 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whlx6\" (UniqueName: \"kubernetes.io/projected/0c8accb5-840e-4d8f-9928-4e8fae845218-kube-api-access-whlx6\") pod \"barbican-api-6bdb888948-v7g6x\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") " pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.336057 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.352999 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.830960 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c694485c4-5xm56"] Dec 05 17:50:07 crc kubenswrapper[4756]: W1205 17:50:07.839485 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93adb284_0238_4295_b819_3d51b171ee6a.slice/crio-0833ebd640a9cd45ad2757993875e8533bd0ac3f1b74f0c41377abeac42ce71a WatchSource:0}: Error finding container 0833ebd640a9cd45ad2757993875e8533bd0ac3f1b74f0c41377abeac42ce71a: Status 404 returned error can't find the container with id 0833ebd640a9cd45ad2757993875e8533bd0ac3f1b74f0c41377abeac42ce71a Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.843845 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-746459ccc9-jrtlj"] Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.945881 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-576d7bc65c-rdc65"] Dec 05 17:50:07 crc kubenswrapper[4756]: I1205 17:50:07.953467 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bdb888948-v7g6x"] Dec 05 17:50:08 crc kubenswrapper[4756]: I1205 17:50:08.294720 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" event={"ID":"c2add0aa-4a0d-464f-8ce0-c51a983392bd","Type":"ContainerStarted","Data":"d0534bb99a69e419c7538cd0b35cbb5d56d982da50623bbb24f52897aac2d50e"} Dec 05 17:50:08 crc kubenswrapper[4756]: I1205 17:50:08.303977 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-746459ccc9-jrtlj" event={"ID":"93adb284-0238-4295-b819-3d51b171ee6a","Type":"ContainerStarted","Data":"0833ebd640a9cd45ad2757993875e8533bd0ac3f1b74f0c41377abeac42ce71a"} Dec 05 17:50:08 crc kubenswrapper[4756]: I1205 17:50:08.307165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" event={"ID":"096ccb27-1f9b-4301-8aeb-ae04e71503bb","Type":"ContainerStarted","Data":"496756be744d661155d09fbff32eaf7d53721d1ede086818bbbdaa44d5398c0e"} Dec 05 17:50:08 crc kubenswrapper[4756]: I1205 17:50:08.313626 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bdb888948-v7g6x" event={"ID":"0c8accb5-840e-4d8f-9928-4e8fae845218","Type":"ContainerStarted","Data":"03ab827edcc47e77b146fceeeab628041c6544bd89eac0b7fdce493b5e011873"} Dec 05 17:50:08 crc kubenswrapper[4756]: I1205 17:50:08.760868 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:50:08 crc kubenswrapper[4756]: I1205 17:50:08.761272 4756 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pvnkb" Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.321183 4756 generic.go:334] "Generic (PLEG): container finished" podID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" containerID="ba082d4075d0e57613b55f161b5c2dc556c9a990561fde1769efcb7eafb91841" exitCode=0 Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.321920 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" event={"ID":"096ccb27-1f9b-4301-8aeb-ae04e71503bb","Type":"ContainerDied","Data":"ba082d4075d0e57613b55f161b5c2dc556c9a990561fde1769efcb7eafb91841"} Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.323162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bdb888948-v7g6x" event={"ID":"0c8accb5-840e-4d8f-9928-4e8fae845218","Type":"ContainerStarted","Data":"959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000"} Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.323185 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bdb888948-v7g6x" event={"ID":"0c8accb5-840e-4d8f-9928-4e8fae845218","Type":"ContainerStarted","Data":"59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86"} Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.323631 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.323654 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.325336 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" event={"ID":"c2add0aa-4a0d-464f-8ce0-c51a983392bd","Type":"ContainerStarted","Data":"2b683eb29f62e36634a5480f7f2095f5f935c73c0e9d72bf7f815ca026472fbc"} Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.325361 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" event={"ID":"c2add0aa-4a0d-464f-8ce0-c51a983392bd","Type":"ContainerStarted","Data":"8f2ee4f23db8cf3560373110b7d7286c65fdf3b32b6293942cb5d09388e30011"} Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.328182 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-746459ccc9-jrtlj" event={"ID":"93adb284-0238-4295-b819-3d51b171ee6a","Type":"ContainerStarted","Data":"704d9884d77816abc940d00ff962ab28a098e72690aa36849b162a53a1feef43"} Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.328225 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-746459ccc9-jrtlj" event={"ID":"93adb284-0238-4295-b819-3d51b171ee6a","Type":"ContainerStarted","Data":"b82623ebdd7d263a4ed44ce0950a54ba2514b6600b4070911deea43fdeb31c12"} Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.377045 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6c694485c4-5xm56" podStartSLOduration=3.377030416 podStartE2EDuration="3.377030416s" podCreationTimestamp="2025-12-05 17:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:50:09.372926566 +0000 UTC m=+5669.611600088" watchObservedRunningTime="2025-12-05 17:50:09.377030416 +0000 UTC m=+5669.615703928" Dec 05 17:50:09 crc 
kubenswrapper[4756]: I1205 17:50:09.410057 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6bdb888948-v7g6x" podStartSLOduration=2.410034425 podStartE2EDuration="2.410034425s" podCreationTimestamp="2025-12-05 17:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:50:09.393902185 +0000 UTC m=+5669.632575697" watchObservedRunningTime="2025-12-05 17:50:09.410034425 +0000 UTC m=+5669.648707937" Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.431373 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-746459ccc9-jrtlj" podStartSLOduration=3.431357253 podStartE2EDuration="3.431357253s" podCreationTimestamp="2025-12-05 17:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:50:09.426257617 +0000 UTC m=+5669.664931139" watchObservedRunningTime="2025-12-05 17:50:09.431357253 +0000 UTC m=+5669.670030755" Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.818166 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pvnkb" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="registry-server" probeResult="failure" output=< Dec 05 17:50:09 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 17:50:09 crc kubenswrapper[4756]: > Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.940039 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d79f4c8f4-xpgcm"] Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.942386 4756 util.go:30] "No sandbox for pod can be found. 
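
The startup-probe failure output for registry-server ("timeout: failed to connect service \":50051\" within 1s") matches the message format of grpc_health_probe, suggesting an exec probe against the catalog gRPC port with a one-second deadline. A rough equivalent as a plain TCP dial — an approximation only, since the real probe also completes a gRPC health-check RPC; the pod IP is not shown in this excerpt, so it is taken as an argument:

package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	// Mirror the 1s-deadline connection check against the registry-server
	// gRPC port; pass the pod IP as the first argument.
	addr := net.JoinHostPort(os.Args[1], "50051")
	conn, err := net.DialTimeout("tcp", addr, time.Second)
	if err != nil {
		fmt.Printf("failed to connect %q within 1s: %v\n", addr, err)
		os.Exit(1)
	}
	conn.Close()
	fmt.Println("connected:", addr)
}
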
Need to start a new one" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.945590 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.945989 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 17:50:09 crc kubenswrapper[4756]: I1205 17:50:09.955090 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d79f4c8f4-xpgcm"] Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.044928 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-public-tls-certs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.044977 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-internal-tls-certs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.044997 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-combined-ca-bundle\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.045182 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgz4t\" (UniqueName: \"kubernetes.io/projected/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-kube-api-access-bgz4t\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.045301 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-config-data\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.045443 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-logs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.045480 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-config-data-custom\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.147310 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-config-data-custom\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.147351 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-logs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.147470 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-public-tls-certs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.147498 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-internal-tls-certs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.147519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-combined-ca-bundle\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.147559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgz4t\" (UniqueName: \"kubernetes.io/projected/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-kube-api-access-bgz4t\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.147592 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-config-data\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.148369 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-logs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.153407 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-internal-tls-certs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.153578 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-config-data\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.154054 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-combined-ca-bundle\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.154556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-public-tls-certs\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.155596 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-config-data-custom\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.175224 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgz4t\" (UniqueName: \"kubernetes.io/projected/7fbcc7b8-bcb8-4bba-9302-d22bc187e559-kube-api-access-bgz4t\") pod \"barbican-api-7d79f4c8f4-xpgcm\" (UID: \"7fbcc7b8-bcb8-4bba-9302-d22bc187e559\") " pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.269588 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.353955 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" event={"ID":"096ccb27-1f9b-4301-8aeb-ae04e71503bb","Type":"ContainerStarted","Data":"e7c21b96e26a422bf8e08c13d16b128f33b8de66a1b624c9892481c06d483838"} Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.354644 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.381318 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" podStartSLOduration=4.3812915199999996 podStartE2EDuration="4.38129152s" podCreationTimestamp="2025-12-05 17:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:50:10.376947764 +0000 UTC m=+5670.615621266" watchObservedRunningTime="2025-12-05 17:50:10.38129152 +0000 UTC m=+5670.619965032" Dec 05 17:50:10 crc kubenswrapper[4756]: I1205 17:50:10.736368 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d79f4c8f4-xpgcm"] Dec 05 17:50:11 crc kubenswrapper[4756]: I1205 17:50:11.363328 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" event={"ID":"7fbcc7b8-bcb8-4bba-9302-d22bc187e559","Type":"ContainerStarted","Data":"0c2af238557262748988478e6713b2eb8364f3193aa6880cb4824dfcc9a59055"} Dec 05 17:50:11 crc kubenswrapper[4756]: I1205 17:50:11.363728 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" event={"ID":"7fbcc7b8-bcb8-4bba-9302-d22bc187e559","Type":"ContainerStarted","Data":"fd0ccabcbef3e1e4ef0386a2c6a505a319547f4e84497d08b8eccb5e7f5de9a4"} Dec 05 17:50:11 crc kubenswrapper[4756]: I1205 17:50:11.363742 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" event={"ID":"7fbcc7b8-bcb8-4bba-9302-d22bc187e559","Type":"ContainerStarted","Data":"ca88b534930e42df47afd368017d9bdc7fd0b7c0a1fff3033cad871dbdc2bcf8"} Dec 05 17:50:11 crc kubenswrapper[4756]: I1205 17:50:11.364227 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:11 crc kubenswrapper[4756]: I1205 17:50:11.364247 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:15 crc kubenswrapper[4756]: I1205 17:50:15.281151 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 17:50:16 crc kubenswrapper[4756]: I1205 17:50:16.748017 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:16 crc kubenswrapper[4756]: I1205 17:50:16.776223 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" podStartSLOduration=7.77620852 podStartE2EDuration="7.77620852s" podCreationTimestamp="2025-12-05 17:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:50:11.386787686 
+0000 UTC m=+5671.625461198" watchObservedRunningTime="2025-12-05 17:50:16.77620852 +0000 UTC m=+5677.014882032" Dec 05 17:50:17 crc kubenswrapper[4756]: I1205 17:50:17.339551 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:50:17 crc kubenswrapper[4756]: I1205 17:50:17.459715 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f75cb8b55-mj5gh"] Dec 05 17:50:17 crc kubenswrapper[4756]: I1205 17:50:17.459946 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" podUID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" containerName="dnsmasq-dns" containerID="cri-o://c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627" gracePeriod=10 Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.026562 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.133079 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-config\") pod \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.133164 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdcv\" (UniqueName: \"kubernetes.io/projected/1356577d-3bc1-401b-a8e8-0d6b83a99d96-kube-api-access-2cdcv\") pod \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.133234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-dns-svc\") pod \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.133334 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-sb\") pod \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.133357 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-nb\") pod \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\" (UID: \"1356577d-3bc1-401b-a8e8-0d6b83a99d96\") " Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.138560 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1356577d-3bc1-401b-a8e8-0d6b83a99d96-kube-api-access-2cdcv" (OuterVolumeSpecName: "kube-api-access-2cdcv") pod "1356577d-3bc1-401b-a8e8-0d6b83a99d96" (UID: "1356577d-3bc1-401b-a8e8-0d6b83a99d96"). InnerVolumeSpecName "kube-api-access-2cdcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.159977 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d79f4c8f4-xpgcm" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.181907 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-config" (OuterVolumeSpecName: "config") pod "1356577d-3bc1-401b-a8e8-0d6b83a99d96" (UID: "1356577d-3bc1-401b-a8e8-0d6b83a99d96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.186549 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1356577d-3bc1-401b-a8e8-0d6b83a99d96" (UID: "1356577d-3bc1-401b-a8e8-0d6b83a99d96"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.208439 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1356577d-3bc1-401b-a8e8-0d6b83a99d96" (UID: "1356577d-3bc1-401b-a8e8-0d6b83a99d96"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.234772 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1356577d-3bc1-401b-a8e8-0d6b83a99d96" (UID: "1356577d-3bc1-401b-a8e8-0d6b83a99d96"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.235952 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.235968 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.235977 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.235987 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdcv\" (UniqueName: \"kubernetes.io/projected/1356577d-3bc1-401b-a8e8-0d6b83a99d96-kube-api-access-2cdcv\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.235998 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1356577d-3bc1-401b-a8e8-0d6b83a99d96-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.242586 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6bdb888948-v7g6x"] Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.242939 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api-log" containerID="cri-o://59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86" gracePeriod=30 Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.243069 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api" containerID="cri-o://959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000" gracePeriod=30 Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.252580 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.34:9311/healthcheck\": EOF" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.256883 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.34:9311/healthcheck\": EOF" Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.441606 4756 generic.go:334] "Generic (PLEG): container finished" podID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" containerID="c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627" exitCode=0 Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.441644 4756 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.441681 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" event={"ID":"1356577d-3bc1-401b-a8e8-0d6b83a99d96","Type":"ContainerDied","Data":"c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627"}
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.441711 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f75cb8b55-mj5gh" event={"ID":"1356577d-3bc1-401b-a8e8-0d6b83a99d96","Type":"ContainerDied","Data":"e00c5de34a0d682152662530ca585e958edc08a0272c9b4d4960d1a1c11b0b67"}
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.441743 4756 scope.go:117] "RemoveContainer" containerID="c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627"
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.444532 4756 generic.go:334] "Generic (PLEG): container finished" podID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerID="59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86" exitCode=143
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.444565 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bdb888948-v7g6x" event={"ID":"0c8accb5-840e-4d8f-9928-4e8fae845218","Type":"ContainerDied","Data":"59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86"}
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.475699 4756 scope.go:117] "RemoveContainer" containerID="45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8"
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.481352 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f75cb8b55-mj5gh"]
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.496443 4756 scope.go:117] "RemoveContainer" containerID="c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627"
Dec 05 17:50:18 crc kubenswrapper[4756]: E1205 17:50:18.496976 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627\": container with ID starting with c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627 not found: ID does not exist" containerID="c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627"
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.497025 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627"} err="failed to get container status \"c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627\": rpc error: code = NotFound desc = could not find container \"c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627\": container with ID starting with c9df30652a40f86f3f12679e1d2beeedb89cc43f6a122ec18e09ba1559c0d627 not found: ID does not exist"
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.497054 4756 scope.go:117] "RemoveContainer" containerID="45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8"
Dec 05 17:50:18 crc kubenswrapper[4756]: E1205 17:50:18.497462 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8\": container with ID starting with 45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8 not found: ID does not exist" containerID="45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8"
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.497499 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8"} err="failed to get container status \"45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8\": rpc error: code = NotFound desc = could not find container \"45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8\": container with ID starting with 45bb28482deb65e210bd2da99213216508976252c6bb77e5d0c5405b99b8cfc8 not found: ID does not exist"
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.499322 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f75cb8b55-mj5gh"]
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.819170 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pvnkb"
Dec 05 17:50:18 crc kubenswrapper[4756]: I1205 17:50:18.874201 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pvnkb"
Dec 05 17:50:19 crc kubenswrapper[4756]: I1205 17:50:19.061771 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvnkb"]
Dec 05 17:50:20 crc kubenswrapper[4756]: I1205 17:50:20.060896 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" path="/var/lib/kubelet/pods/1356577d-3bc1-401b-a8e8-0d6b83a99d96/volumes"
Dec 05 17:50:20 crc kubenswrapper[4756]: I1205 17:50:20.468008 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pvnkb" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="registry-server" containerID="cri-o://8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1" gracePeriod=2
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.453366 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvnkb"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.498055 4756 generic.go:334] "Generic (PLEG): container finished" podID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerID="8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1" exitCode=0
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.498117 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvnkb"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.498117 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvnkb" event={"ID":"723c11a6-c5d7-48d1-b97d-eb92ff0609b6","Type":"ContainerDied","Data":"8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1"}
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.498822 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvnkb" event={"ID":"723c11a6-c5d7-48d1-b97d-eb92ff0609b6","Type":"ContainerDied","Data":"1a57380df259d3bb6cad54c177838d6bfbc15cb30d6d68d546b59421ab5ea61f"}
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.498846 4756 scope.go:117] "RemoveContainer" containerID="8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.506453 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ht5d\" (UniqueName: \"kubernetes.io/projected/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-kube-api-access-8ht5d\") pod \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") "
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.506943 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-utilities\") pod \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") "
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.506996 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-catalog-content\") pod \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\" (UID: \"723c11a6-c5d7-48d1-b97d-eb92ff0609b6\") "
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.507949 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-utilities" (OuterVolumeSpecName: "utilities") pod "723c11a6-c5d7-48d1-b97d-eb92ff0609b6" (UID: "723c11a6-c5d7-48d1-b97d-eb92ff0609b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.515697 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-kube-api-access-8ht5d" (OuterVolumeSpecName: "kube-api-access-8ht5d") pod "723c11a6-c5d7-48d1-b97d-eb92ff0609b6" (UID: "723c11a6-c5d7-48d1-b97d-eb92ff0609b6"). InnerVolumeSpecName "kube-api-access-8ht5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.532710 4756 scope.go:117] "RemoveContainer" containerID="4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.564591 4756 scope.go:117] "RemoveContainer" containerID="f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.597050 4756 scope.go:117] "RemoveContainer" containerID="8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1"
Dec 05 17:50:21 crc kubenswrapper[4756]: E1205 17:50:21.597548 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1\": container with ID starting with 8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1 not found: ID does not exist" containerID="8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.597602 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1"} err="failed to get container status \"8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1\": rpc error: code = NotFound desc = could not find container \"8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1\": container with ID starting with 8fcf6ebdc952f3a820227a5394680fbae4503a0c2c6da46d01cfd6de524123f1 not found: ID does not exist"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.597637 4756 scope.go:117] "RemoveContainer" containerID="4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2"
Dec 05 17:50:21 crc kubenswrapper[4756]: E1205 17:50:21.598039 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2\": container with ID starting with 4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2 not found: ID does not exist" containerID="4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.598066 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2"} err="failed to get container status \"4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2\": rpc error: code = NotFound desc = could not find container \"4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2\": container with ID starting with 4aebab63c58441bb889dc01105b6bdd1e036bf9c92692663bcf7016d0d8565e2 not found: ID does not exist"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.598085 4756 scope.go:117] "RemoveContainer" containerID="f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68"
Dec 05 17:50:21 crc kubenswrapper[4756]: E1205 17:50:21.598372 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68\": container with ID starting with f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68 not found: ID does not exist" containerID="f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.598389 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68"} err="failed to get container status \"f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68\": rpc error: code = NotFound desc = could not find container \"f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68\": container with ID starting with f5be81242db87d37b672e915541f69428786c036654f3b29051a74d02e861f68 not found: ID does not exist"
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.609567 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ht5d\" (UniqueName: \"kubernetes.io/projected/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-kube-api-access-8ht5d\") on node \"crc\" DevicePath \"\""
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.609594 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.644841 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "723c11a6-c5d7-48d1-b97d-eb92ff0609b6" (UID: "723c11a6-c5d7-48d1-b97d-eb92ff0609b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.711448 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/723c11a6-c5d7-48d1-b97d-eb92ff0609b6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.843334 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvnkb"]
Dec 05 17:50:21 crc kubenswrapper[4756]: I1205 17:50:21.856536 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pvnkb"]
Dec 05 17:50:22 crc kubenswrapper[4756]: I1205 17:50:22.069257 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" path="/var/lib/kubelet/pods/723c11a6-c5d7-48d1-b97d-eb92ff0609b6/volumes"
Dec 05 17:50:22 crc kubenswrapper[4756]: I1205 17:50:22.668620 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.34:9311/healthcheck\": read tcp 10.217.0.2:60298->10.217.1.34:9311: read: connection reset by peer"
Dec 05 17:50:22 crc kubenswrapper[4756]: I1205 17:50:22.668666 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.34:9311/healthcheck\": read tcp 10.217.0.2:60300->10.217.1.34:9311: read: connection reset by peer"
Dec 05 17:50:22 crc kubenswrapper[4756]: I1205 17:50:22.669114 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.34:9311/healthcheck\": dial tcp 10.217.1.34:9311: connect: connection refused"
Dec 05 17:50:22 crc kubenswrapper[4756]: I1205 17:50:22.669218 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6bdb888948-v7g6x" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.34:9311/healthcheck\": dial tcp 10.217.1.34:9311: connect: connection refused"
Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.123223 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6bdb888948-v7g6x"
Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.243345 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data\") pod \"0c8accb5-840e-4d8f-9928-4e8fae845218\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") "
Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.243486 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data-custom\") pod \"0c8accb5-840e-4d8f-9928-4e8fae845218\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") "
Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.243541 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whlx6\" (UniqueName: \"kubernetes.io/projected/0c8accb5-840e-4d8f-9928-4e8fae845218-kube-api-access-whlx6\") pod \"0c8accb5-840e-4d8f-9928-4e8fae845218\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") "
Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.243661 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-combined-ca-bundle\") pod \"0c8accb5-840e-4d8f-9928-4e8fae845218\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") "
Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.243766 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c8accb5-840e-4d8f-9928-4e8fae845218-logs\") pod \"0c8accb5-840e-4d8f-9928-4e8fae845218\" (UID: \"0c8accb5-840e-4d8f-9928-4e8fae845218\") "
Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.244241 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c8accb5-840e-4d8f-9928-4e8fae845218-logs" (OuterVolumeSpecName: "logs") pod "0c8accb5-840e-4d8f-9928-4e8fae845218" (UID: "0c8accb5-840e-4d8f-9928-4e8fae845218"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.250680 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0c8accb5-840e-4d8f-9928-4e8fae845218" (UID: "0c8accb5-840e-4d8f-9928-4e8fae845218"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.254338 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8accb5-840e-4d8f-9928-4e8fae845218-kube-api-access-whlx6" (OuterVolumeSpecName: "kube-api-access-whlx6") pod "0c8accb5-840e-4d8f-9928-4e8fae845218" (UID: "0c8accb5-840e-4d8f-9928-4e8fae845218"). InnerVolumeSpecName "kube-api-access-whlx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.277437 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c8accb5-840e-4d8f-9928-4e8fae845218" (UID: "0c8accb5-840e-4d8f-9928-4e8fae845218"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.293407 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data" (OuterVolumeSpecName: "config-data") pod "0c8accb5-840e-4d8f-9928-4e8fae845218" (UID: "0c8accb5-840e-4d8f-9928-4e8fae845218"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.345186 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c8accb5-840e-4d8f-9928-4e8fae845218-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.345218 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.345228 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.345239 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whlx6\" (UniqueName: \"kubernetes.io/projected/0c8accb5-840e-4d8f-9928-4e8fae845218-kube-api-access-whlx6\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.345248 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c8accb5-840e-4d8f-9928-4e8fae845218-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.517778 4756 generic.go:334] "Generic (PLEG): container finished" podID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerID="959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000" exitCode=0 Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.517842 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6bdb888948-v7g6x" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.517858 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bdb888948-v7g6x" event={"ID":"0c8accb5-840e-4d8f-9928-4e8fae845218","Type":"ContainerDied","Data":"959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000"} Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.517924 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bdb888948-v7g6x" event={"ID":"0c8accb5-840e-4d8f-9928-4e8fae845218","Type":"ContainerDied","Data":"03ab827edcc47e77b146fceeeab628041c6544bd89eac0b7fdce493b5e011873"} Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.517964 4756 scope.go:117] "RemoveContainer" containerID="959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.560218 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6bdb888948-v7g6x"] Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.564148 4756 scope.go:117] "RemoveContainer" containerID="59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.566330 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6bdb888948-v7g6x"] Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.584479 4756 scope.go:117] "RemoveContainer" containerID="959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000" Dec 05 17:50:23 crc kubenswrapper[4756]: E1205 17:50:23.584974 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000\": container with ID starting with 959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000 not found: ID does not exist" containerID="959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.585024 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000"} err="failed to get container status \"959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000\": rpc error: code = NotFound desc = could not find container \"959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000\": container with ID starting with 959c5b5a15132f6e216ebce92be802218f5c1aee76ae80136b15ec8e73e99000 not found: ID does not exist" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.585059 4756 scope.go:117] "RemoveContainer" containerID="59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86" Dec 05 17:50:23 crc kubenswrapper[4756]: E1205 17:50:23.586705 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86\": container with ID starting with 59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86 not found: ID does not exist" containerID="59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86" Dec 05 17:50:23 crc kubenswrapper[4756]: I1205 17:50:23.586743 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86"} err="failed to get container status 
\"59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86\": rpc error: code = NotFound desc = could not find container \"59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86\": container with ID starting with 59da21ba65ccddd5ecbed8b9c0eeb44fc4d4ee41573dfa3a3eda588399aa5d86 not found: ID does not exist" Dec 05 17:50:24 crc kubenswrapper[4756]: I1205 17:50:24.063393 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" path="/var/lib/kubelet/pods/0c8accb5-840e-4d8f-9928-4e8fae845218/volumes" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.251197 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-29w9v"] Dec 05 17:50:54 crc kubenswrapper[4756]: E1205 17:50:54.253689 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="extract-content" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.253812 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="extract-content" Dec 05 17:50:54 crc kubenswrapper[4756]: E1205 17:50:54.253901 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api-log" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.253983 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api-log" Dec 05 17:50:54 crc kubenswrapper[4756]: E1205 17:50:54.254071 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" containerName="init" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.254150 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" containerName="init" Dec 05 17:50:54 crc kubenswrapper[4756]: E1205 17:50:54.254238 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" containerName="dnsmasq-dns" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.254323 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" containerName="dnsmasq-dns" Dec 05 17:50:54 crc kubenswrapper[4756]: E1205 17:50:54.254408 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="registry-server" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.254512 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="registry-server" Dec 05 17:50:54 crc kubenswrapper[4756]: E1205 17:50:54.254586 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="extract-utilities" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.254669 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="extract-utilities" Dec 05 17:50:54 crc kubenswrapper[4756]: E1205 17:50:54.254762 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.254836 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.255120 4756 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1356577d-3bc1-401b-a8e8-0d6b83a99d96" containerName="dnsmasq-dns" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.255216 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.255307 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="723c11a6-c5d7-48d1-b97d-eb92ff0609b6" containerName="registry-server" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.255389 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8accb5-840e-4d8f-9928-4e8fae845218" containerName="barbican-api-log" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.256197 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-29w9v" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.266469 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-29w9v"] Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.351069 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4fd436-4abe-4ba5-b62e-4c107f564686-operator-scripts\") pod \"neutron-db-create-29w9v\" (UID: \"2a4fd436-4abe-4ba5-b62e-4c107f564686\") " pod="openstack/neutron-db-create-29w9v" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.351444 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckqq\" (UniqueName: \"kubernetes.io/projected/2a4fd436-4abe-4ba5-b62e-4c107f564686-kube-api-access-vckqq\") pod \"neutron-db-create-29w9v\" (UID: \"2a4fd436-4abe-4ba5-b62e-4c107f564686\") " pod="openstack/neutron-db-create-29w9v" Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.353709 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7466-account-create-update-xqvp2"] Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.354796 4756 util.go:30] "No sandbox for pod can be found. 
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.357672 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.368562 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7466-account-create-update-xqvp2"]
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.453382 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4fd436-4abe-4ba5-b62e-4c107f564686-operator-scripts\") pod \"neutron-db-create-29w9v\" (UID: \"2a4fd436-4abe-4ba5-b62e-4c107f564686\") " pod="openstack/neutron-db-create-29w9v"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.453492 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f8accb-ed03-468b-ba0d-7251eec5a484-operator-scripts\") pod \"neutron-7466-account-create-update-xqvp2\" (UID: \"e5f8accb-ed03-468b-ba0d-7251eec5a484\") " pod="openstack/neutron-7466-account-create-update-xqvp2"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.453540 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckqq\" (UniqueName: \"kubernetes.io/projected/2a4fd436-4abe-4ba5-b62e-4c107f564686-kube-api-access-vckqq\") pod \"neutron-db-create-29w9v\" (UID: \"2a4fd436-4abe-4ba5-b62e-4c107f564686\") " pod="openstack/neutron-db-create-29w9v"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.453589 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fhmp\" (UniqueName: \"kubernetes.io/projected/e5f8accb-ed03-468b-ba0d-7251eec5a484-kube-api-access-8fhmp\") pod \"neutron-7466-account-create-update-xqvp2\" (UID: \"e5f8accb-ed03-468b-ba0d-7251eec5a484\") " pod="openstack/neutron-7466-account-create-update-xqvp2"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.454364 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4fd436-4abe-4ba5-b62e-4c107f564686-operator-scripts\") pod \"neutron-db-create-29w9v\" (UID: \"2a4fd436-4abe-4ba5-b62e-4c107f564686\") " pod="openstack/neutron-db-create-29w9v"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.475018 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckqq\" (UniqueName: \"kubernetes.io/projected/2a4fd436-4abe-4ba5-b62e-4c107f564686-kube-api-access-vckqq\") pod \"neutron-db-create-29w9v\" (UID: \"2a4fd436-4abe-4ba5-b62e-4c107f564686\") " pod="openstack/neutron-db-create-29w9v"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.554970 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f8accb-ed03-468b-ba0d-7251eec5a484-operator-scripts\") pod \"neutron-7466-account-create-update-xqvp2\" (UID: \"e5f8accb-ed03-468b-ba0d-7251eec5a484\") " pod="openstack/neutron-7466-account-create-update-xqvp2"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.555093 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fhmp\" (UniqueName: \"kubernetes.io/projected/e5f8accb-ed03-468b-ba0d-7251eec5a484-kube-api-access-8fhmp\") pod \"neutron-7466-account-create-update-xqvp2\" (UID: \"e5f8accb-ed03-468b-ba0d-7251eec5a484\") " pod="openstack/neutron-7466-account-create-update-xqvp2"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.555653 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f8accb-ed03-468b-ba0d-7251eec5a484-operator-scripts\") pod \"neutron-7466-account-create-update-xqvp2\" (UID: \"e5f8accb-ed03-468b-ba0d-7251eec5a484\") " pod="openstack/neutron-7466-account-create-update-xqvp2"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.574485 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fhmp\" (UniqueName: \"kubernetes.io/projected/e5f8accb-ed03-468b-ba0d-7251eec5a484-kube-api-access-8fhmp\") pod \"neutron-7466-account-create-update-xqvp2\" (UID: \"e5f8accb-ed03-468b-ba0d-7251eec5a484\") " pod="openstack/neutron-7466-account-create-update-xqvp2"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.574616 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-29w9v"
Dec 05 17:50:54 crc kubenswrapper[4756]: I1205 17:50:54.678530 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7466-account-create-update-xqvp2"
Dec 05 17:50:55 crc kubenswrapper[4756]: I1205 17:50:55.068103 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-29w9v"]
Dec 05 17:50:55 crc kubenswrapper[4756]: I1205 17:50:55.169697 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7466-account-create-update-xqvp2"]
Dec 05 17:50:55 crc kubenswrapper[4756]: W1205 17:50:55.177001 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5f8accb_ed03_468b_ba0d_7251eec5a484.slice/crio-80402a75bdd0b2147cae432fd687a5794357f2f1fc41a53828578d34feedbff4 WatchSource:0}: Error finding container 80402a75bdd0b2147cae432fd687a5794357f2f1fc41a53828578d34feedbff4: Status 404 returned error can't find the container with id 80402a75bdd0b2147cae432fd687a5794357f2f1fc41a53828578d34feedbff4
Dec 05 17:50:55 crc kubenswrapper[4756]: I1205 17:50:55.871096 4756 generic.go:334] "Generic (PLEG): container finished" podID="2a4fd436-4abe-4ba5-b62e-4c107f564686" containerID="adb928f4fefb1da8a69a78e41afe558619d6cf18bcfb9113d8586ff8f77a2051" exitCode=0
Dec 05 17:50:55 crc kubenswrapper[4756]: I1205 17:50:55.871247 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-29w9v" event={"ID":"2a4fd436-4abe-4ba5-b62e-4c107f564686","Type":"ContainerDied","Data":"adb928f4fefb1da8a69a78e41afe558619d6cf18bcfb9113d8586ff8f77a2051"}
Dec 05 17:50:55 crc kubenswrapper[4756]: I1205 17:50:55.871624 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-29w9v" event={"ID":"2a4fd436-4abe-4ba5-b62e-4c107f564686","Type":"ContainerStarted","Data":"292a4a394bc6a9cb37834a20aaa82c5ea5e34ca6ced58c3982931e24cb788036"}
Dec 05 17:50:55 crc kubenswrapper[4756]: I1205 17:50:55.874748 4756 generic.go:334] "Generic (PLEG): container finished" podID="e5f8accb-ed03-468b-ba0d-7251eec5a484" containerID="49b25f6f134f5b8f594e82a24f36dec2f2d71abf99696f10b5ca8a0deb086007" exitCode=0
Dec 05 17:50:55 crc kubenswrapper[4756]: I1205 17:50:55.874821 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7466-account-create-update-xqvp2" event={"ID":"e5f8accb-ed03-468b-ba0d-7251eec5a484","Type":"ContainerDied","Data":"49b25f6f134f5b8f594e82a24f36dec2f2d71abf99696f10b5ca8a0deb086007"}
event={"ID":"e5f8accb-ed03-468b-ba0d-7251eec5a484","Type":"ContainerDied","Data":"49b25f6f134f5b8f594e82a24f36dec2f2d71abf99696f10b5ca8a0deb086007"} Dec 05 17:50:55 crc kubenswrapper[4756]: I1205 17:50:55.874869 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7466-account-create-update-xqvp2" event={"ID":"e5f8accb-ed03-468b-ba0d-7251eec5a484","Type":"ContainerStarted","Data":"80402a75bdd0b2147cae432fd687a5794357f2f1fc41a53828578d34feedbff4"} Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.346688 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7466-account-create-update-xqvp2" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.351467 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-29w9v" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.411014 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fhmp\" (UniqueName: \"kubernetes.io/projected/e5f8accb-ed03-468b-ba0d-7251eec5a484-kube-api-access-8fhmp\") pod \"e5f8accb-ed03-468b-ba0d-7251eec5a484\" (UID: \"e5f8accb-ed03-468b-ba0d-7251eec5a484\") " Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.411221 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f8accb-ed03-468b-ba0d-7251eec5a484-operator-scripts\") pod \"e5f8accb-ed03-468b-ba0d-7251eec5a484\" (UID: \"e5f8accb-ed03-468b-ba0d-7251eec5a484\") " Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.411283 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vckqq\" (UniqueName: \"kubernetes.io/projected/2a4fd436-4abe-4ba5-b62e-4c107f564686-kube-api-access-vckqq\") pod \"2a4fd436-4abe-4ba5-b62e-4c107f564686\" (UID: \"2a4fd436-4abe-4ba5-b62e-4c107f564686\") " Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.411357 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4fd436-4abe-4ba5-b62e-4c107f564686-operator-scripts\") pod \"2a4fd436-4abe-4ba5-b62e-4c107f564686\" (UID: \"2a4fd436-4abe-4ba5-b62e-4c107f564686\") " Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.411975 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4fd436-4abe-4ba5-b62e-4c107f564686-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a4fd436-4abe-4ba5-b62e-4c107f564686" (UID: "2a4fd436-4abe-4ba5-b62e-4c107f564686"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.412214 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f8accb-ed03-468b-ba0d-7251eec5a484-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5f8accb-ed03-468b-ba0d-7251eec5a484" (UID: "e5f8accb-ed03-468b-ba0d-7251eec5a484"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.418226 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4fd436-4abe-4ba5-b62e-4c107f564686-kube-api-access-vckqq" (OuterVolumeSpecName: "kube-api-access-vckqq") pod "2a4fd436-4abe-4ba5-b62e-4c107f564686" (UID: "2a4fd436-4abe-4ba5-b62e-4c107f564686"). InnerVolumeSpecName "kube-api-access-vckqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.419289 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f8accb-ed03-468b-ba0d-7251eec5a484-kube-api-access-8fhmp" (OuterVolumeSpecName: "kube-api-access-8fhmp") pod "e5f8accb-ed03-468b-ba0d-7251eec5a484" (UID: "e5f8accb-ed03-468b-ba0d-7251eec5a484"). InnerVolumeSpecName "kube-api-access-8fhmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.513041 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f8accb-ed03-468b-ba0d-7251eec5a484-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.513092 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vckqq\" (UniqueName: \"kubernetes.io/projected/2a4fd436-4abe-4ba5-b62e-4c107f564686-kube-api-access-vckqq\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.513114 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4fd436-4abe-4ba5-b62e-4c107f564686-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.513135 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fhmp\" (UniqueName: \"kubernetes.io/projected/e5f8accb-ed03-468b-ba0d-7251eec5a484-kube-api-access-8fhmp\") on node \"crc\" DevicePath \"\"" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.907709 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7466-account-create-update-xqvp2" event={"ID":"e5f8accb-ed03-468b-ba0d-7251eec5a484","Type":"ContainerDied","Data":"80402a75bdd0b2147cae432fd687a5794357f2f1fc41a53828578d34feedbff4"} Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.908404 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80402a75bdd0b2147cae432fd687a5794357f2f1fc41a53828578d34feedbff4" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.907807 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7466-account-create-update-xqvp2" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.911223 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-29w9v" event={"ID":"2a4fd436-4abe-4ba5-b62e-4c107f564686","Type":"ContainerDied","Data":"292a4a394bc6a9cb37834a20aaa82c5ea5e34ca6ced58c3982931e24cb788036"} Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.911279 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="292a4a394bc6a9cb37834a20aaa82c5ea5e34ca6ced58c3982931e24cb788036" Dec 05 17:50:57 crc kubenswrapper[4756]: I1205 17:50:57.911349 4756 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.600123 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-tsnnf"]
Dec 05 17:50:59 crc kubenswrapper[4756]: E1205 17:50:59.600909 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f8accb-ed03-468b-ba0d-7251eec5a484" containerName="mariadb-account-create-update"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.600928 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f8accb-ed03-468b-ba0d-7251eec5a484" containerName="mariadb-account-create-update"
Dec 05 17:50:59 crc kubenswrapper[4756]: E1205 17:50:59.600963 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4fd436-4abe-4ba5-b62e-4c107f564686" containerName="mariadb-database-create"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.600972 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4fd436-4abe-4ba5-b62e-4c107f564686" containerName="mariadb-database-create"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.601240 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4fd436-4abe-4ba5-b62e-4c107f564686" containerName="mariadb-database-create"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.601275 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f8accb-ed03-468b-ba0d-7251eec5a484" containerName="mariadb-account-create-update"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.602076 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.604658 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8qmt5"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.604866 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.605298 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.607320 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tsnnf"]
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.663674 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-config\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.663778 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrg8t\" (UniqueName: \"kubernetes.io/projected/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-kube-api-access-zrg8t\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.663820 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-combined-ca-bundle\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.765368 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrg8t\" (UniqueName: \"kubernetes.io/projected/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-kube-api-access-zrg8t\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.765463 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-combined-ca-bundle\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.765535 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-config\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.774406 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-config\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.779679 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-combined-ca-bundle\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.782246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrg8t\" (UniqueName: \"kubernetes.io/projected/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-kube-api-access-zrg8t\") pod \"neutron-db-sync-tsnnf\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " pod="openstack/neutron-db-sync-tsnnf"
Dec 05 17:50:59 crc kubenswrapper[4756]: I1205 17:50:59.928059 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tsnnf"
Need to start a new one" pod="openstack/neutron-db-sync-tsnnf" Dec 05 17:51:00 crc kubenswrapper[4756]: I1205 17:51:00.455880 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tsnnf"] Dec 05 17:51:00 crc kubenswrapper[4756]: I1205 17:51:00.942367 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tsnnf" event={"ID":"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e","Type":"ContainerStarted","Data":"5eddb4965271073be520495158566283f0528c821f2d8f8cc0c95ff1c70d69de"} Dec 05 17:51:00 crc kubenswrapper[4756]: I1205 17:51:00.942450 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tsnnf" event={"ID":"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e","Type":"ContainerStarted","Data":"3af89ca54a932be20955174bb109c999c82019b64317767933682bb222d60ada"} Dec 05 17:51:00 crc kubenswrapper[4756]: I1205 17:51:00.965431 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-tsnnf" podStartSLOduration=1.965397939 podStartE2EDuration="1.965397939s" podCreationTimestamp="2025-12-05 17:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:51:00.963737945 +0000 UTC m=+5721.202411497" watchObservedRunningTime="2025-12-05 17:51:00.965397939 +0000 UTC m=+5721.204071451" Dec 05 17:51:04 crc kubenswrapper[4756]: I1205 17:51:04.985836 4756 generic.go:334] "Generic (PLEG): container finished" podID="2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e" containerID="5eddb4965271073be520495158566283f0528c821f2d8f8cc0c95ff1c70d69de" exitCode=0 Dec 05 17:51:04 crc kubenswrapper[4756]: I1205 17:51:04.985881 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tsnnf" event={"ID":"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e","Type":"ContainerDied","Data":"5eddb4965271073be520495158566283f0528c821f2d8f8cc0c95ff1c70d69de"} Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.383090 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tsnnf" Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.406953 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrg8t\" (UniqueName: \"kubernetes.io/projected/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-kube-api-access-zrg8t\") pod \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.407092 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-config\") pod \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.407151 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-combined-ca-bundle\") pod \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\" (UID: \"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e\") " Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.414173 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-kube-api-access-zrg8t" (OuterVolumeSpecName: "kube-api-access-zrg8t") pod "2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e" (UID: "2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e"). InnerVolumeSpecName "kube-api-access-zrg8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.447849 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-config" (OuterVolumeSpecName: "config") pod "2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e" (UID: "2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.456002 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e" (UID: "2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.509052 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrg8t\" (UniqueName: \"kubernetes.io/projected/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-kube-api-access-zrg8t\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.509091 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:06 crc kubenswrapper[4756]: I1205 17:51:06.509107 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.007371 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tsnnf" event={"ID":"2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e","Type":"ContainerDied","Data":"3af89ca54a932be20955174bb109c999c82019b64317767933682bb222d60ada"} Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.007451 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3af89ca54a932be20955174bb109c999c82019b64317767933682bb222d60ada" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.007549 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tsnnf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.289860 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56bc9cf955-6rqlh"] Dec 05 17:51:07 crc kubenswrapper[4756]: E1205 17:51:07.290995 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e" containerName="neutron-db-sync" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.291023 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e" containerName="neutron-db-sync" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.291216 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e" containerName="neutron-db-sync" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.292934 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.310747 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56bc9cf955-6rqlh"] Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.344571 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-nb\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.344623 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-dns-svc\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.344641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-config\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.344790 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-sb\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.345019 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjf4s\" (UniqueName: \"kubernetes.io/projected/369c09d9-5631-4875-b833-7abfd6fb0c24-kube-api-access-wjf4s\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.396753 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bb87b744b-pw9vf"] Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.398220 4756 util.go:30] "No sandbox for pod can be found. 
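The ADD/UPDATE lines here come from the kubelet's main sync loop, which multiplexes pod updates from its config sources (source="api" is the API server watch) and hands them to per-pod workers. A toy version of that mux, again with hypothetical types:

```go
package main

import "fmt"

// podUpdate is a hypothetical stand-in for a kubelet config update.
type podUpdate struct{ op, source, pod string }

// syncLoop drains the merged update channel; the real kubelet would
// dispatch each update to a per-pod worker goroutine.
func syncLoop(updates <-chan podUpdate) {
	for u := range updates {
		fmt.Printf("SyncLoop %s source=%q pods=[%q]\n", u.op, u.source, u.pod)
	}
}

func main() {
	ch := make(chan podUpdate, 2)
	ch <- podUpdate{"ADD", "api", "openstack/dnsmasq-dns-56bc9cf955-6rqlh"}
	ch <- podUpdate{"UPDATE", "api", "openstack/dnsmasq-dns-56bc9cf955-6rqlh"}
	close(ch)
	syncLoop(ch)
}
```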
Need to start a new one" pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.406892 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bb87b744b-pw9vf"] Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.408006 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.408064 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.408256 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.408507 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8qmt5" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446303 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjf4s\" (UniqueName: \"kubernetes.io/projected/369c09d9-5631-4875-b833-7abfd6fb0c24-kube-api-access-wjf4s\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446353 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-ovndb-tls-certs\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446400 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-combined-ca-bundle\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446440 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9gx\" (UniqueName: \"kubernetes.io/projected/6dd10c41-923b-4044-bc25-e74b02895508-kube-api-access-7d9gx\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446471 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-nb\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446499 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-dns-svc\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446517 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-config\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: 
\"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-config\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446563 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-sb\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.446616 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-httpd-config\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.447650 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-nb\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.448189 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-sb\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.448724 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-dns-svc\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.448719 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-config\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.463287 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjf4s\" (UniqueName: \"kubernetes.io/projected/369c09d9-5631-4875-b833-7abfd6fb0c24-kube-api-access-wjf4s\") pod \"dnsmasq-dns-56bc9cf955-6rqlh\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.550051 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-httpd-config\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: 
I1205 17:51:07.550103 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-ovndb-tls-certs\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.550147 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-combined-ca-bundle\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.550175 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9gx\" (UniqueName: \"kubernetes.io/projected/6dd10c41-923b-4044-bc25-e74b02895508-kube-api-access-7d9gx\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.550225 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-config\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.554207 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-combined-ca-bundle\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.554390 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-config\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.556882 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-ovndb-tls-certs\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.561931 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-httpd-config\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.571297 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9gx\" (UniqueName: \"kubernetes.io/projected/6dd10c41-923b-4044-bc25-e74b02895508-kube-api-access-7d9gx\") pod \"neutron-bb87b744b-pw9vf\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.623544 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:07 crc kubenswrapper[4756]: I1205 17:51:07.732021 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:08 crc kubenswrapper[4756]: I1205 17:51:08.098955 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56bc9cf955-6rqlh"] Dec 05 17:51:08 crc kubenswrapper[4756]: I1205 17:51:08.347995 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bb87b744b-pw9vf"] Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.026937 4756 generic.go:334] "Generic (PLEG): container finished" podID="369c09d9-5631-4875-b833-7abfd6fb0c24" containerID="af1a6533c9d1b35d68cb6e84bc980014731dc0934b9c2560e2acf104288ad9f6" exitCode=0 Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.027068 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" event={"ID":"369c09d9-5631-4875-b833-7abfd6fb0c24","Type":"ContainerDied","Data":"af1a6533c9d1b35d68cb6e84bc980014731dc0934b9c2560e2acf104288ad9f6"} Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.027452 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" event={"ID":"369c09d9-5631-4875-b833-7abfd6fb0c24","Type":"ContainerStarted","Data":"73667b21207675c8a75e9377449f77f8e8e24f6be57d991298956e80428737f2"} Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.029725 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb87b744b-pw9vf" event={"ID":"6dd10c41-923b-4044-bc25-e74b02895508","Type":"ContainerStarted","Data":"ae0d4b401d501ed1bfe6f29345f453924f66c28f3b9fb69e7d7ab480bea18a71"} Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.029767 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb87b744b-pw9vf" event={"ID":"6dd10c41-923b-4044-bc25-e74b02895508","Type":"ContainerStarted","Data":"dd080a13d742dc81abd0f1a65ffec5c50302eabdcd27756c850dfb34d36346d8"} Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.029783 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb87b744b-pw9vf" event={"ID":"6dd10c41-923b-4044-bc25-e74b02895508","Type":"ContainerStarted","Data":"7e484843e896e0eaecec0370f03d30084ddb7087516b3da6bfcbcbf7c8702106"} Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.030027 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.087586 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bb87b744b-pw9vf" podStartSLOduration=2.087562063 podStartE2EDuration="2.087562063s" podCreationTimestamp="2025-12-05 17:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:51:09.078531422 +0000 UTC m=+5729.317204934" watchObservedRunningTime="2025-12-05 17:51:09.087562063 +0000 UTC m=+5729.326235585" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.780653 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db4968d85-bhsbf"] Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.782225 4756 util.go:30] "No sandbox for pod can be found. 
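The pod_startup_latency_tracker entry above is internally consistent: podStartSLOduration appears to be watchObservedRunningTime minus podCreationTimestamp, with image-pull time excluded (both pull timestamps are the zero time here, since the image was already present). Checking the arithmetic for neutron-bb87b744b-pw9vf with the timestamps exactly as logged:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the log's "2025-12-05 17:51:07 +0000 UTC" style;
	// fractional seconds are optional in Go layouts.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	created, err := time.Parse(layout, "2025-12-05 17:51:07 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-12-05 17:51:09.087562063 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 2.087562063, matching podStartSLOduration in the log.
	fmt.Println(observed.Sub(created).Seconds())
}
```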
Need to start a new one" pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.787644 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.796813 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.806969 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db4968d85-bhsbf"] Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.898060 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-config\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.898661 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-internal-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.898719 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-ovndb-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.898743 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-combined-ca-bundle\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.898831 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-httpd-config\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.898865 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-public-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:09 crc kubenswrapper[4756]: I1205 17:51:09.898897 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jvnx\" (UniqueName: \"kubernetes.io/projected/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-kube-api-access-8jvnx\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.000621 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-internal-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.000692 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-ovndb-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.000718 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-combined-ca-bundle\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.000810 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-httpd-config\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.000844 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-public-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.000866 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jvnx\" (UniqueName: \"kubernetes.io/projected/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-kube-api-access-8jvnx\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.000913 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-config\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.006464 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-combined-ca-bundle\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.006617 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-ovndb-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.007237 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-public-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " 
pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.007264 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-config\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.013346 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-internal-tls-certs\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.014297 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-httpd-config\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.023628 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jvnx\" (UniqueName: \"kubernetes.io/projected/f15acbfb-ac97-4d70-8a46-3d08cf7a53c2-kube-api-access-8jvnx\") pod \"neutron-db4968d85-bhsbf\" (UID: \"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2\") " pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.041298 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" event={"ID":"369c09d9-5631-4875-b833-7abfd6fb0c24","Type":"ContainerStarted","Data":"73e9d9daeadd72dceff5a5842c9a4d31ec6332ffe46da374897edea447caa0e6"} Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.076648 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" podStartSLOduration=3.076609271 podStartE2EDuration="3.076609271s" podCreationTimestamp="2025-12-05 17:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:51:10.063960154 +0000 UTC m=+5730.302633676" watchObservedRunningTime="2025-12-05 17:51:10.076609271 +0000 UTC m=+5730.315282823" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.133684 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:10 crc kubenswrapper[4756]: I1205 17:51:10.592017 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db4968d85-bhsbf"] Dec 05 17:51:10 crc kubenswrapper[4756]: W1205 17:51:10.594898 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf15acbfb_ac97_4d70_8a46_3d08cf7a53c2.slice/crio-d93446c2363aa84b37ec365acfccde64035fe3032e2f180f3961a2d0a5c22859 WatchSource:0}: Error finding container d93446c2363aa84b37ec365acfccde64035fe3032e2f180f3961a2d0a5c22859: Status 404 returned error can't find the container with id d93446c2363aa84b37ec365acfccde64035fe3032e2f180f3961a2d0a5c22859 Dec 05 17:51:11 crc kubenswrapper[4756]: I1205 17:51:11.051059 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db4968d85-bhsbf" event={"ID":"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2","Type":"ContainerStarted","Data":"f16eafb37609b596a0b592ae9262e082913f1f0f35ec4d58c8754bd73a72e1a7"} Dec 05 17:51:11 crc kubenswrapper[4756]: I1205 17:51:11.051465 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db4968d85-bhsbf" event={"ID":"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2","Type":"ContainerStarted","Data":"369bbe0996b9c2fb62619c2f6aa237ae4aaa9a39c6b10c958c7cc44c6f0e8439"} Dec 05 17:51:11 crc kubenswrapper[4756]: I1205 17:51:11.051482 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db4968d85-bhsbf" event={"ID":"f15acbfb-ac97-4d70-8a46-3d08cf7a53c2","Type":"ContainerStarted","Data":"d93446c2363aa84b37ec365acfccde64035fe3032e2f180f3961a2d0a5c22859"} Dec 05 17:51:11 crc kubenswrapper[4756]: I1205 17:51:11.051504 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:11 crc kubenswrapper[4756]: I1205 17:51:11.073635 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db4968d85-bhsbf" podStartSLOduration=2.073610541 podStartE2EDuration="2.073610541s" podCreationTimestamp="2025-12-05 17:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:51:11.069492982 +0000 UTC m=+5731.308166494" watchObservedRunningTime="2025-12-05 17:51:11.073610541 +0000 UTC m=+5731.312284063" Dec 05 17:51:12 crc kubenswrapper[4756]: I1205 17:51:12.066443 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:17 crc kubenswrapper[4756]: I1205 17:51:17.625695 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:51:17 crc kubenswrapper[4756]: I1205 17:51:17.702912 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-576d7bc65c-rdc65"] Dec 05 17:51:17 crc kubenswrapper[4756]: I1205 17:51:17.703298 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" podUID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" containerName="dnsmasq-dns" containerID="cri-o://e7c21b96e26a422bf8e08c13d16b128f33b8de66a1b624c9892481c06d483838" gracePeriod=10 Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.138699 4756 generic.go:334] "Generic (PLEG): container finished" podID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" 
containerID="e7c21b96e26a422bf8e08c13d16b128f33b8de66a1b624c9892481c06d483838" exitCode=0 Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.139021 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" event={"ID":"096ccb27-1f9b-4301-8aeb-ae04e71503bb","Type":"ContainerDied","Data":"e7c21b96e26a422bf8e08c13d16b128f33b8de66a1b624c9892481c06d483838"} Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.259566 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.354941 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-dns-svc\") pod \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.354986 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-sb\") pod \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.355025 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-config\") pod \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.355087 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-nb\") pod \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.355138 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c9kh\" (UniqueName: \"kubernetes.io/projected/096ccb27-1f9b-4301-8aeb-ae04e71503bb-kube-api-access-6c9kh\") pod \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\" (UID: \"096ccb27-1f9b-4301-8aeb-ae04e71503bb\") " Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.374872 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/096ccb27-1f9b-4301-8aeb-ae04e71503bb-kube-api-access-6c9kh" (OuterVolumeSpecName: "kube-api-access-6c9kh") pod "096ccb27-1f9b-4301-8aeb-ae04e71503bb" (UID: "096ccb27-1f9b-4301-8aeb-ae04e71503bb"). InnerVolumeSpecName "kube-api-access-6c9kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.398955 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "096ccb27-1f9b-4301-8aeb-ae04e71503bb" (UID: "096ccb27-1f9b-4301-8aeb-ae04e71503bb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.405179 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "096ccb27-1f9b-4301-8aeb-ae04e71503bb" (UID: "096ccb27-1f9b-4301-8aeb-ae04e71503bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.421628 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-config" (OuterVolumeSpecName: "config") pod "096ccb27-1f9b-4301-8aeb-ae04e71503bb" (UID: "096ccb27-1f9b-4301-8aeb-ae04e71503bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.424965 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "096ccb27-1f9b-4301-8aeb-ae04e71503bb" (UID: "096ccb27-1f9b-4301-8aeb-ae04e71503bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.457564 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.457612 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.457626 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.457635 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/096ccb27-1f9b-4301-8aeb-ae04e71503bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:18 crc kubenswrapper[4756]: I1205 17:51:18.457649 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c9kh\" (UniqueName: \"kubernetes.io/projected/096ccb27-1f9b-4301-8aeb-ae04e71503bb-kube-api-access-6c9kh\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:19 crc kubenswrapper[4756]: I1205 17:51:19.154151 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" event={"ID":"096ccb27-1f9b-4301-8aeb-ae04e71503bb","Type":"ContainerDied","Data":"496756be744d661155d09fbff32eaf7d53721d1ede086818bbbdaa44d5398c0e"} Dec 05 17:51:19 crc kubenswrapper[4756]: I1205 17:51:19.154212 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-576d7bc65c-rdc65" Dec 05 17:51:19 crc kubenswrapper[4756]: I1205 17:51:19.154255 4756 scope.go:117] "RemoveContainer" containerID="e7c21b96e26a422bf8e08c13d16b128f33b8de66a1b624c9892481c06d483838" Dec 05 17:51:19 crc kubenswrapper[4756]: I1205 17:51:19.204180 4756 scope.go:117] "RemoveContainer" containerID="ba082d4075d0e57613b55f161b5c2dc556c9a990561fde1769efcb7eafb91841" Dec 05 17:51:19 crc kubenswrapper[4756]: I1205 17:51:19.204520 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-576d7bc65c-rdc65"] Dec 05 17:51:19 crc kubenswrapper[4756]: I1205 17:51:19.217460 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-576d7bc65c-rdc65"] Dec 05 17:51:20 crc kubenswrapper[4756]: I1205 17:51:20.062392 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" path="/var/lib/kubelet/pods/096ccb27-1f9b-4301-8aeb-ae04e71503bb/volumes" Dec 05 17:51:37 crc kubenswrapper[4756]: I1205 17:51:37.749177 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:40 crc kubenswrapper[4756]: I1205 17:51:40.147639 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-db4968d85-bhsbf" Dec 05 17:51:40 crc kubenswrapper[4756]: I1205 17:51:40.211270 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bb87b744b-pw9vf"] Dec 05 17:51:40 crc kubenswrapper[4756]: I1205 17:51:40.211583 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bb87b744b-pw9vf" podUID="6dd10c41-923b-4044-bc25-e74b02895508" containerName="neutron-api" containerID="cri-o://dd080a13d742dc81abd0f1a65ffec5c50302eabdcd27756c850dfb34d36346d8" gracePeriod=30 Dec 05 17:51:40 crc kubenswrapper[4756]: I1205 17:51:40.211739 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bb87b744b-pw9vf" podUID="6dd10c41-923b-4044-bc25-e74b02895508" containerName="neutron-httpd" containerID="cri-o://ae0d4b401d501ed1bfe6f29345f453924f66c28f3b9fb69e7d7ab480bea18a71" gracePeriod=30 Dec 05 17:51:41 crc kubenswrapper[4756]: I1205 17:51:41.423648 4756 generic.go:334] "Generic (PLEG): container finished" podID="6dd10c41-923b-4044-bc25-e74b02895508" containerID="ae0d4b401d501ed1bfe6f29345f453924f66c28f3b9fb69e7d7ab480bea18a71" exitCode=0 Dec 05 17:51:41 crc kubenswrapper[4756]: I1205 17:51:41.423968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb87b744b-pw9vf" event={"ID":"6dd10c41-923b-4044-bc25-e74b02895508","Type":"ContainerDied","Data":"ae0d4b401d501ed1bfe6f29345f453924f66c28f3b9fb69e7d7ab480bea18a71"} Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.450209 4756 generic.go:334] "Generic (PLEG): container finished" podID="6dd10c41-923b-4044-bc25-e74b02895508" containerID="dd080a13d742dc81abd0f1a65ffec5c50302eabdcd27756c850dfb34d36346d8" exitCode=0 Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.450284 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb87b744b-pw9vf" event={"ID":"6dd10c41-923b-4044-bc25-e74b02895508","Type":"ContainerDied","Data":"dd080a13d742dc81abd0f1a65ffec5c50302eabdcd27756c850dfb34d36346d8"} Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.832283 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.909732 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-httpd-config\") pod \"6dd10c41-923b-4044-bc25-e74b02895508\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.909779 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-combined-ca-bundle\") pod \"6dd10c41-923b-4044-bc25-e74b02895508\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.909900 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d9gx\" (UniqueName: \"kubernetes.io/projected/6dd10c41-923b-4044-bc25-e74b02895508-kube-api-access-7d9gx\") pod \"6dd10c41-923b-4044-bc25-e74b02895508\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.909954 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-config\") pod \"6dd10c41-923b-4044-bc25-e74b02895508\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.909980 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-ovndb-tls-certs\") pod \"6dd10c41-923b-4044-bc25-e74b02895508\" (UID: \"6dd10c41-923b-4044-bc25-e74b02895508\") " Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.916647 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6dd10c41-923b-4044-bc25-e74b02895508" (UID: "6dd10c41-923b-4044-bc25-e74b02895508"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.917675 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd10c41-923b-4044-bc25-e74b02895508-kube-api-access-7d9gx" (OuterVolumeSpecName: "kube-api-access-7d9gx") pod "6dd10c41-923b-4044-bc25-e74b02895508" (UID: "6dd10c41-923b-4044-bc25-e74b02895508"). InnerVolumeSpecName "kube-api-access-7d9gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.956307 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-config" (OuterVolumeSpecName: "config") pod "6dd10c41-923b-4044-bc25-e74b02895508" (UID: "6dd10c41-923b-4044-bc25-e74b02895508"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.965775 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dd10c41-923b-4044-bc25-e74b02895508" (UID: "6dd10c41-923b-4044-bc25-e74b02895508"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:51:43 crc kubenswrapper[4756]: I1205 17:51:43.981966 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6dd10c41-923b-4044-bc25-e74b02895508" (UID: "6dd10c41-923b-4044-bc25-e74b02895508"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.012208 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d9gx\" (UniqueName: \"kubernetes.io/projected/6dd10c41-923b-4044-bc25-e74b02895508-kube-api-access-7d9gx\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.012237 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.012248 4756 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.012256 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.012264 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd10c41-923b-4044-bc25-e74b02895508-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.467324 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb87b744b-pw9vf" event={"ID":"6dd10c41-923b-4044-bc25-e74b02895508","Type":"ContainerDied","Data":"7e484843e896e0eaecec0370f03d30084ddb7087516b3da6bfcbcbf7c8702106"} Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.467397 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bb87b744b-pw9vf" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.467544 4756 scope.go:117] "RemoveContainer" containerID="ae0d4b401d501ed1bfe6f29345f453924f66c28f3b9fb69e7d7ab480bea18a71" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.513643 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bb87b744b-pw9vf"] Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.514066 4756 scope.go:117] "RemoveContainer" containerID="dd080a13d742dc81abd0f1a65ffec5c50302eabdcd27756c850dfb34d36346d8" Dec 05 17:51:44 crc kubenswrapper[4756]: I1205 17:51:44.523271 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bb87b744b-pw9vf"] Dec 05 17:51:46 crc kubenswrapper[4756]: I1205 17:51:46.068395 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd10c41-923b-4044-bc25-e74b02895508" path="/var/lib/kubelet/pods/6dd10c41-923b-4044-bc25-e74b02895508/volumes" Dec 05 17:51:52 crc kubenswrapper[4756]: I1205 17:51:52.295918 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:51:52 crc kubenswrapper[4756]: I1205 17:51:52.296601 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.068222 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2b7k7"] Dec 05 17:52:09 crc kubenswrapper[4756]: E1205 17:52:09.069147 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd10c41-923b-4044-bc25-e74b02895508" containerName="neutron-api" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.069164 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd10c41-923b-4044-bc25-e74b02895508" containerName="neutron-api" Dec 05 17:52:09 crc kubenswrapper[4756]: E1205 17:52:09.069180 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd10c41-923b-4044-bc25-e74b02895508" containerName="neutron-httpd" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.069189 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd10c41-923b-4044-bc25-e74b02895508" containerName="neutron-httpd" Dec 05 17:52:09 crc kubenswrapper[4756]: E1205 17:52:09.069200 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" containerName="dnsmasq-dns" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.069208 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" containerName="dnsmasq-dns" Dec 05 17:52:09 crc kubenswrapper[4756]: E1205 17:52:09.069242 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" containerName="init" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.069250 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" containerName="init" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.069474 4756 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6dd10c41-923b-4044-bc25-e74b02895508" containerName="neutron-httpd" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.069495 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="096ccb27-1f9b-4301-8aeb-ae04e71503bb" containerName="dnsmasq-dns" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.069513 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd10c41-923b-4044-bc25-e74b02895508" containerName="neutron-api" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.070230 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.072559 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.072916 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.073009 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.086785 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-9s6rc" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.091723 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.116911 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2b7k7"] Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.184315 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744b879ff-nmmkl"] Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.186398 4756 util.go:30] "No sandbox for pod can be found. 
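A few entries back (17:51:52) the machine-config-daemon liveness probe failed with "connect: connection refused" against http://127.0.0.1:8798/health. An HTTP liveness probe is essentially a GET with a short timeout, where a dial error or a status outside the 200-399 range counts as failure. A minimal self-contained version of such a check:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs one HTTP health check; dial errors (like the
// "connection refused" in the log) and bad statuses both fail.
func probe(url string) error {
	client := http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	fmt.Println(probe("http://127.0.0.1:8798/health"))
}
```

A single failure like this one does not restart the container; the kubelet acts only after failureThreshold consecutive failures (3 by default).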
Need to start a new one" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.194319 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744b879ff-nmmkl"] Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.243495 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-scripts\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.243844 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-dispersionconf\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.243946 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcph4\" (UniqueName: \"kubernetes.io/projected/7e02404d-5bef-4477-a4c9-b2d3992f0136-kube-api-access-mcph4\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.244035 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-swiftconf\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.244123 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-ring-data-devices\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.244229 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7e02404d-5bef-4477-a4c9-b2d3992f0136-etc-swift\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.244372 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-combined-ca-bundle\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346284 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-config\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346563 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-sb\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346622 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-scripts\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346666 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-dispersionconf\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346694 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcph4\" (UniqueName: \"kubernetes.io/projected/7e02404d-5bef-4477-a4c9-b2d3992f0136-kube-api-access-mcph4\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346729 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-swiftconf\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346751 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-ring-data-devices\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346775 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-dns-svc\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7e02404d-5bef-4477-a4c9-b2d3992f0136-etc-swift\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346830 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txj4\" (UniqueName: \"kubernetes.io/projected/888ad42e-520e-40d0-a7b7-2d0edd7667a6-kube-api-access-5txj4\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346853 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-combined-ca-bundle\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.346876 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-nb\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.347214 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7e02404d-5bef-4477-a4c9-b2d3992f0136-etc-swift\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.347282 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-scripts\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.347495 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-ring-data-devices\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.353235 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-swiftconf\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.354987 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-combined-ca-bundle\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.360858 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-dispersionconf\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.375988 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcph4\" (UniqueName: \"kubernetes.io/projected/7e02404d-5bef-4477-a4c9-b2d3992f0136-kube-api-access-mcph4\") pod \"swift-ring-rebalance-2b7k7\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.408229 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.448628 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-dns-svc\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.448685 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5txj4\" (UniqueName: \"kubernetes.io/projected/888ad42e-520e-40d0-a7b7-2d0edd7667a6-kube-api-access-5txj4\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.448718 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-nb\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.448747 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-config\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.448791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-sb\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.449680 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-sb\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.450201 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-nb\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.450624 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-dns-svc\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.450705 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-config\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.468722 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txj4\" (UniqueName: \"kubernetes.io/projected/888ad42e-520e-40d0-a7b7-2d0edd7667a6-kube-api-access-5txj4\") pod \"dnsmasq-dns-744b879ff-nmmkl\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.515949 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:09 crc kubenswrapper[4756]: I1205 17:52:09.915278 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2b7k7"] Dec 05 17:52:10 crc kubenswrapper[4756]: I1205 17:52:10.014235 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744b879ff-nmmkl"] Dec 05 17:52:10 crc kubenswrapper[4756]: I1205 17:52:10.734617 4756 generic.go:334] "Generic (PLEG): container finished" podID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" containerID="8561e60ecbbebbe2c6b626af86ae800288c6bb96ca12c901dab9558f1f4ab046" exitCode=0 Dec 05 17:52:10 crc kubenswrapper[4756]: I1205 17:52:10.734828 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" event={"ID":"888ad42e-520e-40d0-a7b7-2d0edd7667a6","Type":"ContainerDied","Data":"8561e60ecbbebbe2c6b626af86ae800288c6bb96ca12c901dab9558f1f4ab046"} Dec 05 17:52:10 crc kubenswrapper[4756]: I1205 17:52:10.734998 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" event={"ID":"888ad42e-520e-40d0-a7b7-2d0edd7667a6","Type":"ContainerStarted","Data":"6242339b18e03aa6118c3f2d8aadb47afd1214112c3cc53273324d186155b93c"} Dec 05 17:52:10 crc kubenswrapper[4756]: I1205 17:52:10.736469 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2b7k7" event={"ID":"7e02404d-5bef-4477-a4c9-b2d3992f0136","Type":"ContainerStarted","Data":"002ce86ec59bb2ab3a1d3adef6dff666fb8f34c3c730c33b35d651072d446dbc"} Dec 05 17:52:10 crc kubenswrapper[4756]: I1205 17:52:10.736497 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2b7k7" event={"ID":"7e02404d-5bef-4477-a4c9-b2d3992f0136","Type":"ContainerStarted","Data":"3ff729ca088a3fc1a12232b4a92ced9ce7df34b5afaaef0cca1f358dcc5e0c3d"} Dec 05 17:52:10 crc kubenswrapper[4756]: I1205 17:52:10.776129 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2b7k7" podStartSLOduration=1.7761127920000002 podStartE2EDuration="1.776112792s" podCreationTimestamp="2025-12-05 17:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:10.772386793 +0000 UTC m=+5791.011060305" watchObservedRunningTime="2025-12-05 17:52:10.776112792 +0000 UTC m=+5791.014786304" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.074985 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c9b6c5b74-7zzt9"] Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.076529 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.078088 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.095266 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c9b6c5b74-7zzt9"] Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.227622 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-log-httpd\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.227909 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-config-data\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.227977 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-combined-ca-bundle\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.228221 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-run-httpd\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.228317 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjrsc\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-kube-api-access-fjrsc\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.228468 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-etc-swift\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.330750 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-config-data\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.330826 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-combined-ca-bundle\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " 
pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.330887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-run-httpd\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.330918 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjrsc\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-kube-api-access-fjrsc\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.330949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-etc-swift\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.330981 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-log-httpd\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.331651 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-log-httpd\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.331710 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-run-httpd\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.336459 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-config-data\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.337116 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-combined-ca-bundle\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.346398 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-etc-swift\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.377323 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fjrsc\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-kube-api-access-fjrsc\") pod \"swift-proxy-5c9b6c5b74-7zzt9\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.401781 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.746499 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" event={"ID":"888ad42e-520e-40d0-a7b7-2d0edd7667a6","Type":"ContainerStarted","Data":"2deb0331d9a295fd4f8c2853789700364d81f0bc908885e5e64c6d88a37854ca"} Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.746941 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:11 crc kubenswrapper[4756]: I1205 17:52:11.777846 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" podStartSLOduration=2.777823668 podStartE2EDuration="2.777823668s" podCreationTimestamp="2025-12-05 17:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:11.774168811 +0000 UTC m=+5792.012842383" watchObservedRunningTime="2025-12-05 17:52:11.777823668 +0000 UTC m=+5792.016497220" Dec 05 17:52:12 crc kubenswrapper[4756]: I1205 17:52:12.021300 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c9b6c5b74-7zzt9"] Dec 05 17:52:12 crc kubenswrapper[4756]: I1205 17:52:12.759503 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" event={"ID":"a175a12a-be0f-4a3c-9d74-86b07ad3b53a","Type":"ContainerStarted","Data":"b39dd7885a2691791124976a3763b3f20eda18c8b5531b306b98e19206861f4f"} Dec 05 17:52:12 crc kubenswrapper[4756]: I1205 17:52:12.759860 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" event={"ID":"a175a12a-be0f-4a3c-9d74-86b07ad3b53a","Type":"ContainerStarted","Data":"b6de424c1888fea411c3947a48ae5f7a971c4056ef17d26b8682a6f51b5be62c"} Dec 05 17:52:12 crc kubenswrapper[4756]: I1205 17:52:12.759880 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:12 crc kubenswrapper[4756]: I1205 17:52:12.759893 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:12 crc kubenswrapper[4756]: I1205 17:52:12.759901 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" event={"ID":"a175a12a-be0f-4a3c-9d74-86b07ad3b53a","Type":"ContainerStarted","Data":"8210d76eeca04f7ad0f121e77ae78f3fa86bb7918cb6cb686c58c7f463042bfd"} Dec 05 17:52:12 crc kubenswrapper[4756]: I1205 17:52:12.780981 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" podStartSLOduration=1.780953111 podStartE2EDuration="1.780953111s" podCreationTimestamp="2025-12-05 17:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:12.777385726 +0000 UTC m=+5793.016059238" watchObservedRunningTime="2025-12-05 17:52:12.780953111 +0000 
UTC m=+5793.019626643" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.288039 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-99cc989cd-9j2nn"] Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.289615 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.291555 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.291900 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.317748 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-99cc989cd-9j2nn"] Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.471075 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg4s7\" (UniqueName: \"kubernetes.io/projected/37480e17-7f37-4df4-9587-3933a1fc902b-kube-api-access-tg4s7\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.471144 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37480e17-7f37-4df4-9587-3933a1fc902b-etc-swift\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.471192 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37480e17-7f37-4df4-9587-3933a1fc902b-run-httpd\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.471218 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37480e17-7f37-4df4-9587-3933a1fc902b-log-httpd\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.471262 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-config-data\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.471328 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-combined-ca-bundle\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.471364 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-internal-tls-certs\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.471388 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-public-tls-certs\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.572782 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-combined-ca-bundle\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.572836 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-internal-tls-certs\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.572859 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-public-tls-certs\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.572915 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg4s7\" (UniqueName: \"kubernetes.io/projected/37480e17-7f37-4df4-9587-3933a1fc902b-kube-api-access-tg4s7\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.572948 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37480e17-7f37-4df4-9587-3933a1fc902b-etc-swift\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.572981 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37480e17-7f37-4df4-9587-3933a1fc902b-run-httpd\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.573000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37480e17-7f37-4df4-9587-3933a1fc902b-log-httpd\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.573029 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-config-data\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.573843 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37480e17-7f37-4df4-9587-3933a1fc902b-run-httpd\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.575742 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37480e17-7f37-4df4-9587-3933a1fc902b-log-httpd\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.587278 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-internal-tls-certs\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.590436 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-config-data\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.594879 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-public-tls-certs\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.597265 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg4s7\" (UniqueName: \"kubernetes.io/projected/37480e17-7f37-4df4-9587-3933a1fc902b-kube-api-access-tg4s7\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.598540 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37480e17-7f37-4df4-9587-3933a1fc902b-etc-swift\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.599107 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37480e17-7f37-4df4-9587-3933a1fc902b-combined-ca-bundle\") pod \"swift-proxy-99cc989cd-9j2nn\" (UID: \"37480e17-7f37-4df4-9587-3933a1fc902b\") " pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:13 crc kubenswrapper[4756]: I1205 17:52:13.664926 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:14 crc kubenswrapper[4756]: I1205 17:52:14.310641 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-99cc989cd-9j2nn"] Dec 05 17:52:14 crc kubenswrapper[4756]: I1205 17:52:14.795383 4756 generic.go:334] "Generic (PLEG): container finished" podID="7e02404d-5bef-4477-a4c9-b2d3992f0136" containerID="002ce86ec59bb2ab3a1d3adef6dff666fb8f34c3c730c33b35d651072d446dbc" exitCode=0 Dec 05 17:52:14 crc kubenswrapper[4756]: I1205 17:52:14.795839 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2b7k7" event={"ID":"7e02404d-5bef-4477-a4c9-b2d3992f0136","Type":"ContainerDied","Data":"002ce86ec59bb2ab3a1d3adef6dff666fb8f34c3c730c33b35d651072d446dbc"} Dec 05 17:52:14 crc kubenswrapper[4756]: I1205 17:52:14.800016 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-99cc989cd-9j2nn" event={"ID":"37480e17-7f37-4df4-9587-3933a1fc902b","Type":"ContainerStarted","Data":"5e30645476c965bf3db87f2b747ae89edd86454ba8698753b9e5ea891ca5a10b"} Dec 05 17:52:14 crc kubenswrapper[4756]: I1205 17:52:14.800059 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-99cc989cd-9j2nn" event={"ID":"37480e17-7f37-4df4-9587-3933a1fc902b","Type":"ContainerStarted","Data":"44e70c72f9cf173a591125c70afca3e20f053958754f68f37f7082c1cfc3e5dd"} Dec 05 17:52:15 crc kubenswrapper[4756]: I1205 17:52:15.847941 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-99cc989cd-9j2nn" event={"ID":"37480e17-7f37-4df4-9587-3933a1fc902b","Type":"ContainerStarted","Data":"85fe0d37c7569c63e56ba38427a7243556d6741a242c221b6ab4c6a3c493b246"} Dec 05 17:52:15 crc kubenswrapper[4756]: I1205 17:52:15.848324 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:15 crc kubenswrapper[4756]: I1205 17:52:15.848352 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:15 crc kubenswrapper[4756]: I1205 17:52:15.889250 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-99cc989cd-9j2nn" podStartSLOduration=2.889144883 podStartE2EDuration="2.889144883s" podCreationTimestamp="2025-12-05 17:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:15.887557071 +0000 UTC m=+5796.126230593" watchObservedRunningTime="2025-12-05 17:52:15.889144883 +0000 UTC m=+5796.127818395" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.211879 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.330096 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-swiftconf\") pod \"7e02404d-5bef-4477-a4c9-b2d3992f0136\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.330231 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-dispersionconf\") pod \"7e02404d-5bef-4477-a4c9-b2d3992f0136\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.330273 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7e02404d-5bef-4477-a4c9-b2d3992f0136-etc-swift\") pod \"7e02404d-5bef-4477-a4c9-b2d3992f0136\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.330311 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-ring-data-devices\") pod \"7e02404d-5bef-4477-a4c9-b2d3992f0136\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.330365 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcph4\" (UniqueName: \"kubernetes.io/projected/7e02404d-5bef-4477-a4c9-b2d3992f0136-kube-api-access-mcph4\") pod \"7e02404d-5bef-4477-a4c9-b2d3992f0136\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.330389 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-combined-ca-bundle\") pod \"7e02404d-5bef-4477-a4c9-b2d3992f0136\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.330474 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-scripts\") pod \"7e02404d-5bef-4477-a4c9-b2d3992f0136\" (UID: \"7e02404d-5bef-4477-a4c9-b2d3992f0136\") " Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.332507 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7e02404d-5bef-4477-a4c9-b2d3992f0136" (UID: "7e02404d-5bef-4477-a4c9-b2d3992f0136"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.333185 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e02404d-5bef-4477-a4c9-b2d3992f0136-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7e02404d-5bef-4477-a4c9-b2d3992f0136" (UID: "7e02404d-5bef-4477-a4c9-b2d3992f0136"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.337203 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e02404d-5bef-4477-a4c9-b2d3992f0136-kube-api-access-mcph4" (OuterVolumeSpecName: "kube-api-access-mcph4") pod "7e02404d-5bef-4477-a4c9-b2d3992f0136" (UID: "7e02404d-5bef-4477-a4c9-b2d3992f0136"). InnerVolumeSpecName "kube-api-access-mcph4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.343606 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7e02404d-5bef-4477-a4c9-b2d3992f0136" (UID: "7e02404d-5bef-4477-a4c9-b2d3992f0136"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.360989 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-scripts" (OuterVolumeSpecName: "scripts") pod "7e02404d-5bef-4477-a4c9-b2d3992f0136" (UID: "7e02404d-5bef-4477-a4c9-b2d3992f0136"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.365594 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e02404d-5bef-4477-a4c9-b2d3992f0136" (UID: "7e02404d-5bef-4477-a4c9-b2d3992f0136"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.376271 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7e02404d-5bef-4477-a4c9-b2d3992f0136" (UID: "7e02404d-5bef-4477-a4c9-b2d3992f0136"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.432386 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.432711 4756 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.432723 4756 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.432732 4756 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7e02404d-5bef-4477-a4c9-b2d3992f0136-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.432742 4756 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7e02404d-5bef-4477-a4c9-b2d3992f0136-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.432752 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcph4\" (UniqueName: \"kubernetes.io/projected/7e02404d-5bef-4477-a4c9-b2d3992f0136-kube-api-access-mcph4\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.432765 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e02404d-5bef-4477-a4c9-b2d3992f0136-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.862100 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2b7k7" Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.862123 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2b7k7" event={"ID":"7e02404d-5bef-4477-a4c9-b2d3992f0136","Type":"ContainerDied","Data":"3ff729ca088a3fc1a12232b4a92ced9ce7df34b5afaaef0cca1f358dcc5e0c3d"} Dec 05 17:52:16 crc kubenswrapper[4756]: I1205 17:52:16.862220 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ff729ca088a3fc1a12232b4a92ced9ce7df34b5afaaef0cca1f358dcc5e0c3d" Dec 05 17:52:19 crc kubenswrapper[4756]: I1205 17:52:19.518686 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:19 crc kubenswrapper[4756]: I1205 17:52:19.615899 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56bc9cf955-6rqlh"] Dec 05 17:52:19 crc kubenswrapper[4756]: I1205 17:52:19.616324 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" podUID="369c09d9-5631-4875-b833-7abfd6fb0c24" containerName="dnsmasq-dns" containerID="cri-o://73e9d9daeadd72dceff5a5842c9a4d31ec6332ffe46da374897edea447caa0e6" gracePeriod=10 Dec 05 17:52:19 crc kubenswrapper[4756]: I1205 17:52:19.898269 4756 generic.go:334] "Generic (PLEG): container finished" podID="369c09d9-5631-4875-b833-7abfd6fb0c24" containerID="73e9d9daeadd72dceff5a5842c9a4d31ec6332ffe46da374897edea447caa0e6" exitCode=0 Dec 05 17:52:19 crc kubenswrapper[4756]: I1205 17:52:19.898646 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" event={"ID":"369c09d9-5631-4875-b833-7abfd6fb0c24","Type":"ContainerDied","Data":"73e9d9daeadd72dceff5a5842c9a4d31ec6332ffe46da374897edea447caa0e6"} Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.108163 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.212865 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-nb\") pod \"369c09d9-5631-4875-b833-7abfd6fb0c24\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.212974 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-config\") pod \"369c09d9-5631-4875-b833-7abfd6fb0c24\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.212995 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-sb\") pod \"369c09d9-5631-4875-b833-7abfd6fb0c24\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.213110 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-dns-svc\") pod \"369c09d9-5631-4875-b833-7abfd6fb0c24\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.213147 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjf4s\" (UniqueName: \"kubernetes.io/projected/369c09d9-5631-4875-b833-7abfd6fb0c24-kube-api-access-wjf4s\") pod \"369c09d9-5631-4875-b833-7abfd6fb0c24\" (UID: \"369c09d9-5631-4875-b833-7abfd6fb0c24\") " Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.221659 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369c09d9-5631-4875-b833-7abfd6fb0c24-kube-api-access-wjf4s" (OuterVolumeSpecName: "kube-api-access-wjf4s") pod "369c09d9-5631-4875-b833-7abfd6fb0c24" (UID: "369c09d9-5631-4875-b833-7abfd6fb0c24"). InnerVolumeSpecName "kube-api-access-wjf4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.262346 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "369c09d9-5631-4875-b833-7abfd6fb0c24" (UID: "369c09d9-5631-4875-b833-7abfd6fb0c24"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.262349 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "369c09d9-5631-4875-b833-7abfd6fb0c24" (UID: "369c09d9-5631-4875-b833-7abfd6fb0c24"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.280777 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "369c09d9-5631-4875-b833-7abfd6fb0c24" (UID: "369c09d9-5631-4875-b833-7abfd6fb0c24"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.289769 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-config" (OuterVolumeSpecName: "config") pod "369c09d9-5631-4875-b833-7abfd6fb0c24" (UID: "369c09d9-5631-4875-b833-7abfd6fb0c24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.315552 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.315593 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjf4s\" (UniqueName: \"kubernetes.io/projected/369c09d9-5631-4875-b833-7abfd6fb0c24-kube-api-access-wjf4s\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.315610 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.315643 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.315660 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369c09d9-5631-4875-b833-7abfd6fb0c24-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.913297 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" event={"ID":"369c09d9-5631-4875-b833-7abfd6fb0c24","Type":"ContainerDied","Data":"73667b21207675c8a75e9377449f77f8e8e24f6be57d991298956e80428737f2"} Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.913737 4756 scope.go:117] "RemoveContainer" containerID="73e9d9daeadd72dceff5a5842c9a4d31ec6332ffe46da374897edea447caa0e6" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.913488 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56bc9cf955-6rqlh" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.963202 4756 scope.go:117] "RemoveContainer" containerID="af1a6533c9d1b35d68cb6e84bc980014731dc0934b9c2560e2acf104288ad9f6" Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.978198 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56bc9cf955-6rqlh"] Dec 05 17:52:20 crc kubenswrapper[4756]: I1205 17:52:20.997098 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56bc9cf955-6rqlh"] Dec 05 17:52:21 crc kubenswrapper[4756]: I1205 17:52:21.406376 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:21 crc kubenswrapper[4756]: I1205 17:52:21.408055 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:22 crc kubenswrapper[4756]: I1205 17:52:22.065528 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369c09d9-5631-4875-b833-7abfd6fb0c24" path="/var/lib/kubelet/pods/369c09d9-5631-4875-b833-7abfd6fb0c24/volumes" Dec 05 17:52:22 crc kubenswrapper[4756]: I1205 17:52:22.296927 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:52:22 crc kubenswrapper[4756]: I1205 17:52:22.297008 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:52:23 crc kubenswrapper[4756]: I1205 17:52:23.671685 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:23 crc kubenswrapper[4756]: I1205 17:52:23.678244 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-99cc989cd-9j2nn" Dec 05 17:52:23 crc kubenswrapper[4756]: I1205 17:52:23.799684 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5c9b6c5b74-7zzt9"] Dec 05 17:52:23 crc kubenswrapper[4756]: I1205 17:52:23.799981 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerName="proxy-httpd" containerID="cri-o://b6de424c1888fea411c3947a48ae5f7a971c4056ef17d26b8682a6f51b5be62c" gracePeriod=30 Dec 05 17:52:23 crc kubenswrapper[4756]: I1205 17:52:23.800075 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerName="proxy-server" containerID="cri-o://b39dd7885a2691791124976a3763b3f20eda18c8b5531b306b98e19206861f4f" gracePeriod=30 Dec 05 17:52:23 crc kubenswrapper[4756]: E1205 17:52:23.928895 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda175a12a_be0f_4a3c_9d74_86b07ad3b53a.slice/crio-b6de424c1888fea411c3947a48ae5f7a971c4056ef17d26b8682a6f51b5be62c.scope\": 
RecentStats: unable to find data in memory cache]" Dec 05 17:52:24 crc kubenswrapper[4756]: I1205 17:52:24.963751 4756 generic.go:334] "Generic (PLEG): container finished" podID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerID="b39dd7885a2691791124976a3763b3f20eda18c8b5531b306b98e19206861f4f" exitCode=0 Dec 05 17:52:24 crc kubenswrapper[4756]: I1205 17:52:24.963789 4756 generic.go:334] "Generic (PLEG): container finished" podID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerID="b6de424c1888fea411c3947a48ae5f7a971c4056ef17d26b8682a6f51b5be62c" exitCode=0 Dec 05 17:52:24 crc kubenswrapper[4756]: I1205 17:52:24.963812 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" event={"ID":"a175a12a-be0f-4a3c-9d74-86b07ad3b53a","Type":"ContainerDied","Data":"b39dd7885a2691791124976a3763b3f20eda18c8b5531b306b98e19206861f4f"} Dec 05 17:52:24 crc kubenswrapper[4756]: I1205 17:52:24.963844 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" event={"ID":"a175a12a-be0f-4a3c-9d74-86b07ad3b53a","Type":"ContainerDied","Data":"b6de424c1888fea411c3947a48ae5f7a971c4056ef17d26b8682a6f51b5be62c"} Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.529285 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.616932 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-combined-ca-bundle\") pod \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.616976 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-run-httpd\") pod \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.617031 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjrsc\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-kube-api-access-fjrsc\") pod \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.617060 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-config-data\") pod \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.617150 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-log-httpd\") pod \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.617211 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-etc-swift\") pod \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\" (UID: \"a175a12a-be0f-4a3c-9d74-86b07ad3b53a\") " Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.617801 4756 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a175a12a-be0f-4a3c-9d74-86b07ad3b53a" (UID: "a175a12a-be0f-4a3c-9d74-86b07ad3b53a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.617791 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a175a12a-be0f-4a3c-9d74-86b07ad3b53a" (UID: "a175a12a-be0f-4a3c-9d74-86b07ad3b53a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.622106 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-kube-api-access-fjrsc" (OuterVolumeSpecName: "kube-api-access-fjrsc") pod "a175a12a-be0f-4a3c-9d74-86b07ad3b53a" (UID: "a175a12a-be0f-4a3c-9d74-86b07ad3b53a"). InnerVolumeSpecName "kube-api-access-fjrsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.636042 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a175a12a-be0f-4a3c-9d74-86b07ad3b53a" (UID: "a175a12a-be0f-4a3c-9d74-86b07ad3b53a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.685201 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-config-data" (OuterVolumeSpecName: "config-data") pod "a175a12a-be0f-4a3c-9d74-86b07ad3b53a" (UID: "a175a12a-be0f-4a3c-9d74-86b07ad3b53a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.693495 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a175a12a-be0f-4a3c-9d74-86b07ad3b53a" (UID: "a175a12a-be0f-4a3c-9d74-86b07ad3b53a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.719901 4756 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.719943 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.719963 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.719979 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjrsc\" (UniqueName: \"kubernetes.io/projected/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-kube-api-access-fjrsc\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.719995 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.720010 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a175a12a-be0f-4a3c-9d74-86b07ad3b53a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.977704 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" event={"ID":"a175a12a-be0f-4a3c-9d74-86b07ad3b53a","Type":"ContainerDied","Data":"8210d76eeca04f7ad0f121e77ae78f3fa86bb7918cb6cb686c58c7f463042bfd"} Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.978090 4756 scope.go:117] "RemoveContainer" containerID="b39dd7885a2691791124976a3763b3f20eda18c8b5531b306b98e19206861f4f" Dec 05 17:52:25 crc kubenswrapper[4756]: I1205 17:52:25.977866 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c9b6c5b74-7zzt9" Dec 05 17:52:26 crc kubenswrapper[4756]: I1205 17:52:26.021814 4756 scope.go:117] "RemoveContainer" containerID="b6de424c1888fea411c3947a48ae5f7a971c4056ef17d26b8682a6f51b5be62c" Dec 05 17:52:26 crc kubenswrapper[4756]: I1205 17:52:26.030406 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5c9b6c5b74-7zzt9"] Dec 05 17:52:26 crc kubenswrapper[4756]: I1205 17:52:26.039913 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5c9b6c5b74-7zzt9"] Dec 05 17:52:26 crc kubenswrapper[4756]: I1205 17:52:26.069551 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" path="/var/lib/kubelet/pods/a175a12a-be0f-4a3c-9d74-86b07ad3b53a/volumes" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.937119 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sb2xm"] Dec 05 17:52:29 crc kubenswrapper[4756]: E1205 17:52:29.937762 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerName="proxy-server" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.937778 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerName="proxy-server" Dec 05 17:52:29 crc kubenswrapper[4756]: E1205 17:52:29.937816 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e02404d-5bef-4477-a4c9-b2d3992f0136" containerName="swift-ring-rebalance" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.937824 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e02404d-5bef-4477-a4c9-b2d3992f0136" containerName="swift-ring-rebalance" Dec 05 17:52:29 crc kubenswrapper[4756]: E1205 17:52:29.937844 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369c09d9-5631-4875-b833-7abfd6fb0c24" containerName="init" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.937853 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="369c09d9-5631-4875-b833-7abfd6fb0c24" containerName="init" Dec 05 17:52:29 crc kubenswrapper[4756]: E1205 17:52:29.937864 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerName="proxy-httpd" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.937871 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerName="proxy-httpd" Dec 05 17:52:29 crc kubenswrapper[4756]: E1205 17:52:29.937891 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369c09d9-5631-4875-b833-7abfd6fb0c24" containerName="dnsmasq-dns" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.937900 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="369c09d9-5631-4875-b833-7abfd6fb0c24" containerName="dnsmasq-dns" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.938111 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e02404d-5bef-4477-a4c9-b2d3992f0136" containerName="swift-ring-rebalance" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.938136 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="369c09d9-5631-4875-b833-7abfd6fb0c24" containerName="dnsmasq-dns" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.938148 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerName="proxy-server" Dec 05 
17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.938160 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a175a12a-be0f-4a3c-9d74-86b07ad3b53a" containerName="proxy-httpd" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.939154 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.950823 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8311-account-create-update-f22zw"] Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.953362 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.956360 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.962141 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sb2xm"] Dec 05 17:52:29 crc kubenswrapper[4756]: I1205 17:52:29.969089 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8311-account-create-update-f22zw"] Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.023143 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b48d5aeb-da03-4791-bc18-6cc3ee644863-operator-scripts\") pod \"cinder-db-create-sb2xm\" (UID: \"b48d5aeb-da03-4791-bc18-6cc3ee644863\") " pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.023228 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9t9r\" (UniqueName: \"kubernetes.io/projected/7269b014-e537-4908-b6f1-26b1968cf770-kube-api-access-p9t9r\") pod \"cinder-8311-account-create-update-f22zw\" (UID: \"7269b014-e537-4908-b6f1-26b1968cf770\") " pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.023265 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7269b014-e537-4908-b6f1-26b1968cf770-operator-scripts\") pod \"cinder-8311-account-create-update-f22zw\" (UID: \"7269b014-e537-4908-b6f1-26b1968cf770\") " pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.023383 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l487c\" (UniqueName: \"kubernetes.io/projected/b48d5aeb-da03-4791-bc18-6cc3ee644863-kube-api-access-l487c\") pod \"cinder-db-create-sb2xm\" (UID: \"b48d5aeb-da03-4791-bc18-6cc3ee644863\") " pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.125796 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b48d5aeb-da03-4791-bc18-6cc3ee644863-operator-scripts\") pod \"cinder-db-create-sb2xm\" (UID: \"b48d5aeb-da03-4791-bc18-6cc3ee644863\") " pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.125890 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9t9r\" (UniqueName: 
\"kubernetes.io/projected/7269b014-e537-4908-b6f1-26b1968cf770-kube-api-access-p9t9r\") pod \"cinder-8311-account-create-update-f22zw\" (UID: \"7269b014-e537-4908-b6f1-26b1968cf770\") " pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.125927 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7269b014-e537-4908-b6f1-26b1968cf770-operator-scripts\") pod \"cinder-8311-account-create-update-f22zw\" (UID: \"7269b014-e537-4908-b6f1-26b1968cf770\") " pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.126016 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l487c\" (UniqueName: \"kubernetes.io/projected/b48d5aeb-da03-4791-bc18-6cc3ee644863-kube-api-access-l487c\") pod \"cinder-db-create-sb2xm\" (UID: \"b48d5aeb-da03-4791-bc18-6cc3ee644863\") " pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.127267 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7269b014-e537-4908-b6f1-26b1968cf770-operator-scripts\") pod \"cinder-8311-account-create-update-f22zw\" (UID: \"7269b014-e537-4908-b6f1-26b1968cf770\") " pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.127369 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b48d5aeb-da03-4791-bc18-6cc3ee644863-operator-scripts\") pod \"cinder-db-create-sb2xm\" (UID: \"b48d5aeb-da03-4791-bc18-6cc3ee644863\") " pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.147940 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9t9r\" (UniqueName: \"kubernetes.io/projected/7269b014-e537-4908-b6f1-26b1968cf770-kube-api-access-p9t9r\") pod \"cinder-8311-account-create-update-f22zw\" (UID: \"7269b014-e537-4908-b6f1-26b1968cf770\") " pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.156829 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l487c\" (UniqueName: \"kubernetes.io/projected/b48d5aeb-da03-4791-bc18-6cc3ee644863-kube-api-access-l487c\") pod \"cinder-db-create-sb2xm\" (UID: \"b48d5aeb-da03-4791-bc18-6cc3ee644863\") " pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.257390 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.277181 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.577433 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sb2xm"] Dec 05 17:52:30 crc kubenswrapper[4756]: W1205 17:52:30.580631 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb48d5aeb_da03_4791_bc18_6cc3ee644863.slice/crio-53307bc3d0f43b49a26aca0b8484b6d75b121fab1fb5b3f67f764a123e237cdc WatchSource:0}: Error finding container 53307bc3d0f43b49a26aca0b8484b6d75b121fab1fb5b3f67f764a123e237cdc: Status 404 returned error can't find the container with id 53307bc3d0f43b49a26aca0b8484b6d75b121fab1fb5b3f67f764a123e237cdc Dec 05 17:52:30 crc kubenswrapper[4756]: W1205 17:52:30.857516 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7269b014_e537_4908_b6f1_26b1968cf770.slice/crio-ec5098ca614b0f809d8cd2cfa32460f269e76265fa40c5a2fff1dc0e1110ae52 WatchSource:0}: Error finding container ec5098ca614b0f809d8cd2cfa32460f269e76265fa40c5a2fff1dc0e1110ae52: Status 404 returned error can't find the container with id ec5098ca614b0f809d8cd2cfa32460f269e76265fa40c5a2fff1dc0e1110ae52 Dec 05 17:52:30 crc kubenswrapper[4756]: I1205 17:52:30.867346 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8311-account-create-update-f22zw"] Dec 05 17:52:31 crc kubenswrapper[4756]: I1205 17:52:31.064527 4756 generic.go:334] "Generic (PLEG): container finished" podID="b48d5aeb-da03-4791-bc18-6cc3ee644863" containerID="2256769bb8998f12b83ae149b8da26e39fe2cb55868940676535b29b2f768878" exitCode=0 Dec 05 17:52:31 crc kubenswrapper[4756]: I1205 17:52:31.065180 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb2xm" event={"ID":"b48d5aeb-da03-4791-bc18-6cc3ee644863","Type":"ContainerDied","Data":"2256769bb8998f12b83ae149b8da26e39fe2cb55868940676535b29b2f768878"} Dec 05 17:52:31 crc kubenswrapper[4756]: I1205 17:52:31.065228 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb2xm" event={"ID":"b48d5aeb-da03-4791-bc18-6cc3ee644863","Type":"ContainerStarted","Data":"53307bc3d0f43b49a26aca0b8484b6d75b121fab1fb5b3f67f764a123e237cdc"} Dec 05 17:52:31 crc kubenswrapper[4756]: I1205 17:52:31.068833 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8311-account-create-update-f22zw" event={"ID":"7269b014-e537-4908-b6f1-26b1968cf770","Type":"ContainerStarted","Data":"8776235cf14a56f635b39849b4b508b14b9236b26611f067af0b84b7096fa20c"} Dec 05 17:52:31 crc kubenswrapper[4756]: I1205 17:52:31.068902 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8311-account-create-update-f22zw" event={"ID":"7269b014-e537-4908-b6f1-26b1968cf770","Type":"ContainerStarted","Data":"ec5098ca614b0f809d8cd2cfa32460f269e76265fa40c5a2fff1dc0e1110ae52"} Dec 05 17:52:31 crc kubenswrapper[4756]: I1205 17:52:31.118670 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8311-account-create-update-f22zw" podStartSLOduration=2.118639396 podStartE2EDuration="2.118639396s" podCreationTimestamp="2025-12-05 17:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:31.103835192 +0000 UTC m=+5811.342508734" watchObservedRunningTime="2025-12-05 
17:52:31.118639396 +0000 UTC m=+5811.357312918" Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.084031 4756 generic.go:334] "Generic (PLEG): container finished" podID="7269b014-e537-4908-b6f1-26b1968cf770" containerID="8776235cf14a56f635b39849b4b508b14b9236b26611f067af0b84b7096fa20c" exitCode=0 Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.084106 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8311-account-create-update-f22zw" event={"ID":"7269b014-e537-4908-b6f1-26b1968cf770","Type":"ContainerDied","Data":"8776235cf14a56f635b39849b4b508b14b9236b26611f067af0b84b7096fa20c"} Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.524083 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.692384 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b48d5aeb-da03-4791-bc18-6cc3ee644863-operator-scripts\") pod \"b48d5aeb-da03-4791-bc18-6cc3ee644863\" (UID: \"b48d5aeb-da03-4791-bc18-6cc3ee644863\") " Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.692528 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l487c\" (UniqueName: \"kubernetes.io/projected/b48d5aeb-da03-4791-bc18-6cc3ee644863-kube-api-access-l487c\") pod \"b48d5aeb-da03-4791-bc18-6cc3ee644863\" (UID: \"b48d5aeb-da03-4791-bc18-6cc3ee644863\") " Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.693383 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48d5aeb-da03-4791-bc18-6cc3ee644863-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b48d5aeb-da03-4791-bc18-6cc3ee644863" (UID: "b48d5aeb-da03-4791-bc18-6cc3ee644863"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.701823 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48d5aeb-da03-4791-bc18-6cc3ee644863-kube-api-access-l487c" (OuterVolumeSpecName: "kube-api-access-l487c") pod "b48d5aeb-da03-4791-bc18-6cc3ee644863" (UID: "b48d5aeb-da03-4791-bc18-6cc3ee644863"). InnerVolumeSpecName "kube-api-access-l487c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.795653 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l487c\" (UniqueName: \"kubernetes.io/projected/b48d5aeb-da03-4791-bc18-6cc3ee644863-kube-api-access-l487c\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:32 crc kubenswrapper[4756]: I1205 17:52:32.795711 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b48d5aeb-da03-4791-bc18-6cc3ee644863-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.109389 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sb2xm" Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.110335 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb2xm" event={"ID":"b48d5aeb-da03-4791-bc18-6cc3ee644863","Type":"ContainerDied","Data":"53307bc3d0f43b49a26aca0b8484b6d75b121fab1fb5b3f67f764a123e237cdc"} Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.110394 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53307bc3d0f43b49a26aca0b8484b6d75b121fab1fb5b3f67f764a123e237cdc" Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.536693 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.613292 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7269b014-e537-4908-b6f1-26b1968cf770-operator-scripts\") pod \"7269b014-e537-4908-b6f1-26b1968cf770\" (UID: \"7269b014-e537-4908-b6f1-26b1968cf770\") " Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.613503 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9t9r\" (UniqueName: \"kubernetes.io/projected/7269b014-e537-4908-b6f1-26b1968cf770-kube-api-access-p9t9r\") pod \"7269b014-e537-4908-b6f1-26b1968cf770\" (UID: \"7269b014-e537-4908-b6f1-26b1968cf770\") " Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.614584 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7269b014-e537-4908-b6f1-26b1968cf770-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7269b014-e537-4908-b6f1-26b1968cf770" (UID: "7269b014-e537-4908-b6f1-26b1968cf770"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.617621 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7269b014-e537-4908-b6f1-26b1968cf770-kube-api-access-p9t9r" (OuterVolumeSpecName: "kube-api-access-p9t9r") pod "7269b014-e537-4908-b6f1-26b1968cf770" (UID: "7269b014-e537-4908-b6f1-26b1968cf770"). InnerVolumeSpecName "kube-api-access-p9t9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.715805 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7269b014-e537-4908-b6f1-26b1968cf770-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:33 crc kubenswrapper[4756]: I1205 17:52:33.715865 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9t9r\" (UniqueName: \"kubernetes.io/projected/7269b014-e537-4908-b6f1-26b1968cf770-kube-api-access-p9t9r\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:34 crc kubenswrapper[4756]: I1205 17:52:34.118464 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8311-account-create-update-f22zw" event={"ID":"7269b014-e537-4908-b6f1-26b1968cf770","Type":"ContainerDied","Data":"ec5098ca614b0f809d8cd2cfa32460f269e76265fa40c5a2fff1dc0e1110ae52"} Dec 05 17:52:34 crc kubenswrapper[4756]: I1205 17:52:34.118518 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec5098ca614b0f809d8cd2cfa32460f269e76265fa40c5a2fff1dc0e1110ae52" Dec 05 17:52:34 crc kubenswrapper[4756]: I1205 17:52:34.118596 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8311-account-create-update-f22zw" Dec 05 17:52:34 crc kubenswrapper[4756]: E1205 17:52:34.263821 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7269b014_e537_4908_b6f1_26b1968cf770.slice\": RecentStats: unable to find data in memory cache]" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.280651 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-smhw8"] Dec 05 17:52:35 crc kubenswrapper[4756]: E1205 17:52:35.281387 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7269b014-e537-4908-b6f1-26b1968cf770" containerName="mariadb-account-create-update" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.281404 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7269b014-e537-4908-b6f1-26b1968cf770" containerName="mariadb-account-create-update" Dec 05 17:52:35 crc kubenswrapper[4756]: E1205 17:52:35.283951 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48d5aeb-da03-4791-bc18-6cc3ee644863" containerName="mariadb-database-create" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.283982 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48d5aeb-da03-4791-bc18-6cc3ee644863" containerName="mariadb-database-create" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.284240 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7269b014-e537-4908-b6f1-26b1968cf770" containerName="mariadb-account-create-update" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.284270 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48d5aeb-da03-4791-bc18-6cc3ee644863" containerName="mariadb-database-create" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.285057 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.289798 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8rsb7" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.290313 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.290730 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.306917 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-smhw8"] Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.450828 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-db-sync-config-data\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.451029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-scripts\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.451158 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff1e1cbf-0283-495e-a34e-eb538ef04665-etc-machine-id\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.451224 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgxq9\" (UniqueName: \"kubernetes.io/projected/ff1e1cbf-0283-495e-a34e-eb538ef04665-kube-api-access-pgxq9\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.451290 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-config-data\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.451502 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-combined-ca-bundle\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.553366 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-db-sync-config-data\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.553530 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-scripts\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.553625 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff1e1cbf-0283-495e-a34e-eb538ef04665-etc-machine-id\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.553663 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgxq9\" (UniqueName: \"kubernetes.io/projected/ff1e1cbf-0283-495e-a34e-eb538ef04665-kube-api-access-pgxq9\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.553705 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-config-data\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.553818 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff1e1cbf-0283-495e-a34e-eb538ef04665-etc-machine-id\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.554878 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-combined-ca-bundle\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.562734 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-scripts\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.563610 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-combined-ca-bundle\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.565724 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-config-data\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.566627 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-db-sync-config-data\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " 
pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.583613 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgxq9\" (UniqueName: \"kubernetes.io/projected/ff1e1cbf-0283-495e-a34e-eb538ef04665-kube-api-access-pgxq9\") pod \"cinder-db-sync-smhw8\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.607922 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:35 crc kubenswrapper[4756]: I1205 17:52:35.966816 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-smhw8"] Dec 05 17:52:36 crc kubenswrapper[4756]: I1205 17:52:36.139295 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-smhw8" event={"ID":"ff1e1cbf-0283-495e-a34e-eb538ef04665","Type":"ContainerStarted","Data":"6f5364fcbfc4bb8340fe1b348059d96026c1c78e612f5b14f57880e1834a1f66"} Dec 05 17:52:37 crc kubenswrapper[4756]: I1205 17:52:37.156206 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-smhw8" event={"ID":"ff1e1cbf-0283-495e-a34e-eb538ef04665","Type":"ContainerStarted","Data":"c8d264d298711680548d2f39dc0f632d08f914d6d7147865ef75e46c4809ac0a"} Dec 05 17:52:37 crc kubenswrapper[4756]: I1205 17:52:37.187160 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-smhw8" podStartSLOduration=2.18713536 podStartE2EDuration="2.18713536s" podCreationTimestamp="2025-12-05 17:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:37.185185108 +0000 UTC m=+5817.423858650" watchObservedRunningTime="2025-12-05 17:52:37.18713536 +0000 UTC m=+5817.425808902" Dec 05 17:52:39 crc kubenswrapper[4756]: I1205 17:52:39.183689 4756 generic.go:334] "Generic (PLEG): container finished" podID="ff1e1cbf-0283-495e-a34e-eb538ef04665" containerID="c8d264d298711680548d2f39dc0f632d08f914d6d7147865ef75e46c4809ac0a" exitCode=0 Dec 05 17:52:39 crc kubenswrapper[4756]: I1205 17:52:39.183982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-smhw8" event={"ID":"ff1e1cbf-0283-495e-a34e-eb538ef04665","Type":"ContainerDied","Data":"c8d264d298711680548d2f39dc0f632d08f914d6d7147865ef75e46c4809ac0a"} Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.553730 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.657633 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff1e1cbf-0283-495e-a34e-eb538ef04665-etc-machine-id\") pod \"ff1e1cbf-0283-495e-a34e-eb538ef04665\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.657747 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff1e1cbf-0283-495e-a34e-eb538ef04665-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ff1e1cbf-0283-495e-a34e-eb538ef04665" (UID: "ff1e1cbf-0283-495e-a34e-eb538ef04665"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.657781 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-config-data\") pod \"ff1e1cbf-0283-495e-a34e-eb538ef04665\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.657885 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-scripts\") pod \"ff1e1cbf-0283-495e-a34e-eb538ef04665\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.657959 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgxq9\" (UniqueName: \"kubernetes.io/projected/ff1e1cbf-0283-495e-a34e-eb538ef04665-kube-api-access-pgxq9\") pod \"ff1e1cbf-0283-495e-a34e-eb538ef04665\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.657996 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-db-sync-config-data\") pod \"ff1e1cbf-0283-495e-a34e-eb538ef04665\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.658062 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-combined-ca-bundle\") pod \"ff1e1cbf-0283-495e-a34e-eb538ef04665\" (UID: \"ff1e1cbf-0283-495e-a34e-eb538ef04665\") " Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.658708 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff1e1cbf-0283-495e-a34e-eb538ef04665-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.665238 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1e1cbf-0283-495e-a34e-eb538ef04665-kube-api-access-pgxq9" (OuterVolumeSpecName: "kube-api-access-pgxq9") pod "ff1e1cbf-0283-495e-a34e-eb538ef04665" (UID: "ff1e1cbf-0283-495e-a34e-eb538ef04665"). InnerVolumeSpecName "kube-api-access-pgxq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.665600 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-scripts" (OuterVolumeSpecName: "scripts") pod "ff1e1cbf-0283-495e-a34e-eb538ef04665" (UID: "ff1e1cbf-0283-495e-a34e-eb538ef04665"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.665797 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ff1e1cbf-0283-495e-a34e-eb538ef04665" (UID: "ff1e1cbf-0283-495e-a34e-eb538ef04665"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.691394 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff1e1cbf-0283-495e-a34e-eb538ef04665" (UID: "ff1e1cbf-0283-495e-a34e-eb538ef04665"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.718143 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-config-data" (OuterVolumeSpecName: "config-data") pod "ff1e1cbf-0283-495e-a34e-eb538ef04665" (UID: "ff1e1cbf-0283-495e-a34e-eb538ef04665"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.760577 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.760615 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.760629 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgxq9\" (UniqueName: \"kubernetes.io/projected/ff1e1cbf-0283-495e-a34e-eb538ef04665-kube-api-access-pgxq9\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.760642 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:40 crc kubenswrapper[4756]: I1205 17:52:40.760656 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1e1cbf-0283-495e-a34e-eb538ef04665-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.218889 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-smhw8" event={"ID":"ff1e1cbf-0283-495e-a34e-eb538ef04665","Type":"ContainerDied","Data":"6f5364fcbfc4bb8340fe1b348059d96026c1c78e612f5b14f57880e1834a1f66"} Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.220202 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f5364fcbfc4bb8340fe1b348059d96026c1c78e612f5b14f57880e1834a1f66" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.218970 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-smhw8" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.545498 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7498bd9d99-g52v7"] Dec 05 17:52:41 crc kubenswrapper[4756]: E1205 17:52:41.545910 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1e1cbf-0283-495e-a34e-eb538ef04665" containerName="cinder-db-sync" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.545936 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1e1cbf-0283-495e-a34e-eb538ef04665" containerName="cinder-db-sync" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.546164 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1e1cbf-0283-495e-a34e-eb538ef04665" containerName="cinder-db-sync" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.547315 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.578730 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7498bd9d99-g52v7"] Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.679394 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-sb\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.679484 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdsrw\" (UniqueName: \"kubernetes.io/projected/b600aa7c-bad9-4599-b875-aeaa05f56aa2-kube-api-access-rdsrw\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.679519 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-config\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.679544 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-dns-svc\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.679605 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-nb\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.696072 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.697624 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.707774 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.708007 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.708059 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.709858 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8rsb7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.722608 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780722 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-nb\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780821 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780844 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05fd1482-5388-490c-a3e0-e6e40e6aed64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780869 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-scripts\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780889 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-sb\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780917 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b85zk\" (UniqueName: \"kubernetes.io/projected/05fd1482-5388-490c-a3e0-e6e40e6aed64-kube-api-access-b85zk\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc 
kubenswrapper[4756]: I1205 17:52:41.780936 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data-custom\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780972 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdsrw\" (UniqueName: \"kubernetes.io/projected/b600aa7c-bad9-4599-b875-aeaa05f56aa2-kube-api-access-rdsrw\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.780999 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-config\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.781024 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-dns-svc\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.781049 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05fd1482-5388-490c-a3e0-e6e40e6aed64-logs\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.781866 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-nb\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.782192 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-sb\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.782246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-config\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.782821 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-dns-svc\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.808690 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdsrw\" (UniqueName: 
\"kubernetes.io/projected/b600aa7c-bad9-4599-b875-aeaa05f56aa2-kube-api-access-rdsrw\") pod \"dnsmasq-dns-7498bd9d99-g52v7\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.875502 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.882081 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.882119 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05fd1482-5388-490c-a3e0-e6e40e6aed64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.882143 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-scripts\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.882174 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b85zk\" (UniqueName: \"kubernetes.io/projected/05fd1482-5388-490c-a3e0-e6e40e6aed64-kube-api-access-b85zk\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.882192 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data-custom\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.882252 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05fd1482-5388-490c-a3e0-e6e40e6aed64-logs\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.882284 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.882893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05fd1482-5388-490c-a3e0-e6e40e6aed64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.883262 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05fd1482-5388-490c-a3e0-e6e40e6aed64-logs\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " 
pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.887068 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-scripts\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.894233 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.898059 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.905695 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b85zk\" (UniqueName: \"kubernetes.io/projected/05fd1482-5388-490c-a3e0-e6e40e6aed64-kube-api-access-b85zk\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:41 crc kubenswrapper[4756]: I1205 17:52:41.947115 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data-custom\") pod \"cinder-api-0\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " pod="openstack/cinder-api-0" Dec 05 17:52:42 crc kubenswrapper[4756]: I1205 17:52:42.011066 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 17:52:42 crc kubenswrapper[4756]: I1205 17:52:42.397511 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7498bd9d99-g52v7"] Dec 05 17:52:42 crc kubenswrapper[4756]: I1205 17:52:42.505841 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:43 crc kubenswrapper[4756]: I1205 17:52:43.247759 4756 generic.go:334] "Generic (PLEG): container finished" podID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" containerID="cb910ea1a1bf91dfc270a15cd5f56e8d86acaf9a494277eae33584814685e3bc" exitCode=0 Dec 05 17:52:43 crc kubenswrapper[4756]: I1205 17:52:43.248839 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" event={"ID":"b600aa7c-bad9-4599-b875-aeaa05f56aa2","Type":"ContainerDied","Data":"cb910ea1a1bf91dfc270a15cd5f56e8d86acaf9a494277eae33584814685e3bc"} Dec 05 17:52:43 crc kubenswrapper[4756]: I1205 17:52:43.254900 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" event={"ID":"b600aa7c-bad9-4599-b875-aeaa05f56aa2","Type":"ContainerStarted","Data":"6bae3f63a33b5432042e070e51b3b169de14d3c1cacc94d591154c05fea67434"} Dec 05 17:52:43 crc kubenswrapper[4756]: I1205 17:52:43.254948 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05fd1482-5388-490c-a3e0-e6e40e6aed64","Type":"ContainerStarted","Data":"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5"} Dec 05 17:52:43 crc kubenswrapper[4756]: I1205 17:52:43.254963 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05fd1482-5388-490c-a3e0-e6e40e6aed64","Type":"ContainerStarted","Data":"a1397f387939fe6a88380bd74594ab069b288366b9af911ea757ab225d80e55d"} Dec 05 17:52:44 crc kubenswrapper[4756]: I1205 17:52:44.182152 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:44 crc kubenswrapper[4756]: I1205 17:52:44.262131 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" event={"ID":"b600aa7c-bad9-4599-b875-aeaa05f56aa2","Type":"ContainerStarted","Data":"67951d347a590fac48578a9eb05497c4547a373150f59afb5208b7de12207c22"} Dec 05 17:52:44 crc kubenswrapper[4756]: I1205 17:52:44.263498 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:44 crc kubenswrapper[4756]: I1205 17:52:44.265778 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05fd1482-5388-490c-a3e0-e6e40e6aed64","Type":"ContainerStarted","Data":"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1"} Dec 05 17:52:44 crc kubenswrapper[4756]: I1205 17:52:44.265960 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 17:52:44 crc kubenswrapper[4756]: I1205 17:52:44.281875 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" podStartSLOduration=3.281855733 podStartE2EDuration="3.281855733s" podCreationTimestamp="2025-12-05 17:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:44.278904215 +0000 UTC m=+5824.517577737" watchObservedRunningTime="2025-12-05 17:52:44.281855733 +0000 UTC m=+5824.520529275" Dec 05 17:52:44 crc 
kubenswrapper[4756]: I1205 17:52:44.308908 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.308886943 podStartE2EDuration="3.308886943s" podCreationTimestamp="2025-12-05 17:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:44.306971852 +0000 UTC m=+5824.545645364" watchObservedRunningTime="2025-12-05 17:52:44.308886943 +0000 UTC m=+5824.547560455" Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.274926 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerName="cinder-api-log" containerID="cri-o://e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5" gracePeriod=30 Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.275004 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerName="cinder-api" containerID="cri-o://d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1" gracePeriod=30 Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.863509 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.970439 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data\") pod \"05fd1482-5388-490c-a3e0-e6e40e6aed64\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.970566 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b85zk\" (UniqueName: \"kubernetes.io/projected/05fd1482-5388-490c-a3e0-e6e40e6aed64-kube-api-access-b85zk\") pod \"05fd1482-5388-490c-a3e0-e6e40e6aed64\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.970627 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05fd1482-5388-490c-a3e0-e6e40e6aed64-logs\") pod \"05fd1482-5388-490c-a3e0-e6e40e6aed64\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.970646 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-combined-ca-bundle\") pod \"05fd1482-5388-490c-a3e0-e6e40e6aed64\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.970715 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-scripts\") pod \"05fd1482-5388-490c-a3e0-e6e40e6aed64\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.970800 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05fd1482-5388-490c-a3e0-e6e40e6aed64-etc-machine-id\") pod \"05fd1482-5388-490c-a3e0-e6e40e6aed64\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") " Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 
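17:52:45.970823 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data-custom\") pod \"05fd1482-5388-490c-a3e0-e6e40e6aed64\" (UID: \"05fd1482-5388-490c-a3e0-e6e40e6aed64\") "

At 17:52:45.274926 the kubelet starts killing both cinder-api-0 containers with gracePeriod=30, and volume teardown begins while they are still draining. The stop semantics behind "Killing container with a grace period" are SIGTERM first, SIGKILL if the grace period lapses. A sketch of that on a local process (assumes a Unix host with a sleep binary; stopWithGrace is illustrative, not the real CRI StopContainer API):

    package main

    import (
        "fmt"
        "os"
        "os/exec"
        "syscall"
        "time"
    )

    // stopWithGrace mimics the gracePeriod semantics from the log:
    // SIGTERM first, SIGKILL if the process outlives the grace period.
    func stopWithGrace(p *os.Process, grace time.Duration) {
        _ = p.Signal(syscall.SIGTERM) // polite phase; exit code becomes 128+15
        done := make(chan struct{})
        go func() { _, _ = p.Wait(); close(done) }()
        select {
        case <-done:
            fmt.Println("exited within grace period")
        case <-time.After(grace):
            fmt.Println("grace period over, sending SIGKILL")
            _ = p.Kill()
        }
    }

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        stopWithGrace(cmd.Process, 2*time.Second)
    }
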
Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.977820 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "05fd1482-5388-490c-a3e0-e6e40e6aed64" (UID: "05fd1482-5388-490c-a3e0-e6e40e6aed64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.978463 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05fd1482-5388-490c-a3e0-e6e40e6aed64-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "05fd1482-5388-490c-a3e0-e6e40e6aed64" (UID: "05fd1482-5388-490c-a3e0-e6e40e6aed64"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.978766 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05fd1482-5388-490c-a3e0-e6e40e6aed64-logs" (OuterVolumeSpecName: "logs") pod "05fd1482-5388-490c-a3e0-e6e40e6aed64" (UID: "05fd1482-5388-490c-a3e0-e6e40e6aed64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.981005 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-scripts" (OuterVolumeSpecName: "scripts") pod "05fd1482-5388-490c-a3e0-e6e40e6aed64" (UID: "05fd1482-5388-490c-a3e0-e6e40e6aed64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:45 crc kubenswrapper[4756]: I1205 17:52:45.994841 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05fd1482-5388-490c-a3e0-e6e40e6aed64-kube-api-access-b85zk" (OuterVolumeSpecName: "kube-api-access-b85zk") pod "05fd1482-5388-490c-a3e0-e6e40e6aed64" (UID: "05fd1482-5388-490c-a3e0-e6e40e6aed64"). InnerVolumeSpecName "kube-api-access-b85zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.010227 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05fd1482-5388-490c-a3e0-e6e40e6aed64" (UID: "05fd1482-5388-490c-a3e0-e6e40e6aed64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.035400 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data" (OuterVolumeSpecName: "config-data") pod "05fd1482-5388-490c-a3e0-e6e40e6aed64" (UID: "05fd1482-5388-490c-a3e0-e6e40e6aed64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.073705 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.073763 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05fd1482-5388-490c-a3e0-e6e40e6aed64-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.073788 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.073808 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b85zk\" (UniqueName: \"kubernetes.io/projected/05fd1482-5388-490c-a3e0-e6e40e6aed64-kube-api-access-b85zk\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.073831 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05fd1482-5388-490c-a3e0-e6e40e6aed64-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.073850 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.073868 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fd1482-5388-490c-a3e0-e6e40e6aed64-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.290467 4756 generic.go:334] "Generic (PLEG): container finished" podID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerID="d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1" exitCode=0 Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.290505 4756 generic.go:334] "Generic (PLEG): container finished" podID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerID="e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5" exitCode=143 Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.290522 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.290560 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05fd1482-5388-490c-a3e0-e6e40e6aed64","Type":"ContainerDied","Data":"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1"} Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.290624 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05fd1482-5388-490c-a3e0-e6e40e6aed64","Type":"ContainerDied","Data":"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5"} Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.290652 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05fd1482-5388-490c-a3e0-e6e40e6aed64","Type":"ContainerDied","Data":"a1397f387939fe6a88380bd74594ab069b288366b9af911ea757ab225d80e55d"} Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.290684 4756 scope.go:117] "RemoveContainer" containerID="d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.314387 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.322102 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.341961 4756 scope.go:117] "RemoveContainer" containerID="e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.365391 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:46 crc kubenswrapper[4756]: E1205 17:52:46.365888 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerName="cinder-api" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.365911 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerName="cinder-api" Dec 05 17:52:46 crc kubenswrapper[4756]: E1205 17:52:46.365957 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerName="cinder-api-log" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.365966 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerName="cinder-api-log" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.366158 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerName="cinder-api-log" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.366179 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" containerName="cinder-api" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.371179 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.374653 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.374873 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.375167 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8rsb7" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.375328 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.375740 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.379046 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.390281 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.423522 4756 scope.go:117] "RemoveContainer" containerID="d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1" Dec 05 17:52:46 crc kubenswrapper[4756]: E1205 17:52:46.424040 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1\": container with ID starting with d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1 not found: ID does not exist" containerID="d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.424077 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1"} err="failed to get container status \"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1\": rpc error: code = NotFound desc = could not find container \"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1\": container with ID starting with d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1 not found: ID does not exist" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.424102 4756 scope.go:117] "RemoveContainer" containerID="e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5" Dec 05 17:52:46 crc kubenswrapper[4756]: E1205 17:52:46.424332 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5\": container with ID starting with e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5 not found: ID does not exist" containerID="e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.424361 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5"} err="failed to get container status \"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5\": rpc error: code = NotFound desc = could not find container \"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5\": 
container with ID starting with e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5 not found: ID does not exist" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.424377 4756 scope.go:117] "RemoveContainer" containerID="d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.424710 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1"} err="failed to get container status \"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1\": rpc error: code = NotFound desc = could not find container \"d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1\": container with ID starting with d770348981f94dbf1698657cd7b400c84f5ee4365514af46cc18cd502cf20cf1 not found: ID does not exist" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.424752 4756 scope.go:117] "RemoveContainer" containerID="e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.425059 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5"} err="failed to get container status \"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5\": rpc error: code = NotFound desc = could not find container \"e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5\": container with ID starting with e65ac0d4cc962137f3c345c692cf16c327db789812f6478a2d316e3e6b193cb5 not found: ID does not exist" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.482752 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data-custom\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.482837 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.482897 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7qr9\" (UniqueName: \"kubernetes.io/projected/43f1c275-9ce5-4965-8b25-099a1ceed149-kube-api-access-s7qr9\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.482998 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-public-tls-certs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.483117 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43f1c275-9ce5-4965-8b25-099a1ceed149-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " 
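pod="openstack/cinder-api-0"

The NotFound errors above are expected noise: by the time the explicit RemoveContainer calls run, the containers are evidently already gone (the runtime appears to have cleaned them up along with the sandbox), so the kubelet logs the failure and moves on. Deletion is treated as idempotent, roughly like this sketch, where errNotFound and removeContainer are stand-ins for the gRPC NotFound status and the CRI call:

    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound stands in for the gRPC NotFound status in the log.
    var errNotFound = errors.New("NotFound: could not find container")

    // removeContainer is a stand-in for the CRI call; here it always
    // reports the container as already gone.
    func removeContainer(id string) error {
        return fmt.Errorf("remove %s: %w", id, errNotFound)
    }

    func main() {
        err := removeContainer("d770348981f9")
        if errors.Is(err, errNotFound) {
            // Already gone: removal is idempotent, so swallow the error.
            fmt.Println("already removed, nothing to do")
            return
        }
        if err != nil {
            fmt.Println("real failure:", err)
        }
    }
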
pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.483152 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.483178 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-scripts\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.483227 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f1c275-9ce5-4965-8b25-099a1ceed149-logs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.483266 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585527 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43f1c275-9ce5-4965-8b25-099a1ceed149-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585601 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585637 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-scripts\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585662 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43f1c275-9ce5-4965-8b25-099a1ceed149-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585686 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f1c275-9ce5-4965-8b25-099a1ceed149-logs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585734 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") 
" pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585822 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data-custom\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585852 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7qr9\" (UniqueName: \"kubernetes.io/projected/43f1c275-9ce5-4965-8b25-099a1ceed149-kube-api-access-s7qr9\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.585935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-public-tls-certs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.586262 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f1c275-9ce5-4965-8b25-099a1ceed149-logs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.590837 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.590965 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data-custom\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.591111 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.592816 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-scripts\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.593937 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-public-tls-certs\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " 
pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.599673 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.614463 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7qr9\" (UniqueName: \"kubernetes.io/projected/43f1c275-9ce5-4965-8b25-099a1ceed149-kube-api-access-s7qr9\") pod \"cinder-api-0\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " pod="openstack/cinder-api-0" Dec 05 17:52:46 crc kubenswrapper[4756]: I1205 17:52:46.744837 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 17:52:47 crc kubenswrapper[4756]: I1205 17:52:47.095502 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:52:47 crc kubenswrapper[4756]: I1205 17:52:47.309347 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43f1c275-9ce5-4965-8b25-099a1ceed149","Type":"ContainerStarted","Data":"d7c9a0a8dd9621997de2035913f4f90979f8bcc91f019982963835fa58ed27d1"} Dec 05 17:52:48 crc kubenswrapper[4756]: I1205 17:52:48.071117 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05fd1482-5388-490c-a3e0-e6e40e6aed64" path="/var/lib/kubelet/pods/05fd1482-5388-490c-a3e0-e6e40e6aed64/volumes" Dec 05 17:52:48 crc kubenswrapper[4756]: I1205 17:52:48.330878 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43f1c275-9ce5-4965-8b25-099a1ceed149","Type":"ContainerStarted","Data":"3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd"} Dec 05 17:52:49 crc kubenswrapper[4756]: I1205 17:52:49.345101 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43f1c275-9ce5-4965-8b25-099a1ceed149","Type":"ContainerStarted","Data":"9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe"} Dec 05 17:52:49 crc kubenswrapper[4756]: I1205 17:52:49.346688 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 17:52:49 crc kubenswrapper[4756]: I1205 17:52:49.377671 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.377652975 podStartE2EDuration="3.377652975s" podCreationTimestamp="2025-12-05 17:52:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:52:49.369374764 +0000 UTC m=+5829.608048276" watchObservedRunningTime="2025-12-05 17:52:49.377652975 +0000 UTC m=+5829.616326477" Dec 05 17:52:51 crc kubenswrapper[4756]: I1205 17:52:51.877735 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:52:51 crc kubenswrapper[4756]: I1205 17:52:51.955458 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744b879ff-nmmkl"] Dec 05 17:52:51 crc kubenswrapper[4756]: I1205 17:52:51.955754 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" podUID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" containerName="dnsmasq-dns" 
containerID="cri-o://2deb0331d9a295fd4f8c2853789700364d81f0bc908885e5e64c6d88a37854ca" gracePeriod=10 Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.296207 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.296257 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.296295 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.297055 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.297111 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" gracePeriod=600 Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.387557 4756 generic.go:334] "Generic (PLEG): container finished" podID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" containerID="2deb0331d9a295fd4f8c2853789700364d81f0bc908885e5e64c6d88a37854ca" exitCode=0 Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.387724 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" event={"ID":"888ad42e-520e-40d0-a7b7-2d0edd7667a6","Type":"ContainerDied","Data":"2deb0331d9a295fd4f8c2853789700364d81f0bc908885e5e64c6d88a37854ca"} Dec 05 17:52:52 crc kubenswrapper[4756]: E1205 17:52:52.447925 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.505142 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.680847 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-sb\") pod \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.680914 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-nb\") pod \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.680972 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5txj4\" (UniqueName: \"kubernetes.io/projected/888ad42e-520e-40d0-a7b7-2d0edd7667a6-kube-api-access-5txj4\") pod \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.681033 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-config\") pod \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.681100 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-dns-svc\") pod \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\" (UID: \"888ad42e-520e-40d0-a7b7-2d0edd7667a6\") " Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.688628 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888ad42e-520e-40d0-a7b7-2d0edd7667a6-kube-api-access-5txj4" (OuterVolumeSpecName: "kube-api-access-5txj4") pod "888ad42e-520e-40d0-a7b7-2d0edd7667a6" (UID: "888ad42e-520e-40d0-a7b7-2d0edd7667a6"). InnerVolumeSpecName "kube-api-access-5txj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.730019 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "888ad42e-520e-40d0-a7b7-2d0edd7667a6" (UID: "888ad42e-520e-40d0-a7b7-2d0edd7667a6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.734952 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "888ad42e-520e-40d0-a7b7-2d0edd7667a6" (UID: "888ad42e-520e-40d0-a7b7-2d0edd7667a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.735733 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "888ad42e-520e-40d0-a7b7-2d0edd7667a6" (UID: "888ad42e-520e-40d0-a7b7-2d0edd7667a6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.739359 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-config" (OuterVolumeSpecName: "config") pod "888ad42e-520e-40d0-a7b7-2d0edd7667a6" (UID: "888ad42e-520e-40d0-a7b7-2d0edd7667a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.783265 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.783302 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.783340 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.783353 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/888ad42e-520e-40d0-a7b7-2d0edd7667a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:52 crc kubenswrapper[4756]: I1205 17:52:52.783366 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5txj4\" (UniqueName: \"kubernetes.io/projected/888ad42e-520e-40d0-a7b7-2d0edd7667a6-kube-api-access-5txj4\") on node \"crc\" DevicePath \"\"" Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.406785 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" event={"ID":"888ad42e-520e-40d0-a7b7-2d0edd7667a6","Type":"ContainerDied","Data":"6242339b18e03aa6118c3f2d8aadb47afd1214112c3cc53273324d186155b93c"} Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.406808 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744b879ff-nmmkl" Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.406869 4756 scope.go:117] "RemoveContainer" containerID="2deb0331d9a295fd4f8c2853789700364d81f0bc908885e5e64c6d88a37854ca" Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.411754 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" exitCode=0 Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.411823 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a"} Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.412526 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:52:53 crc kubenswrapper[4756]: E1205 17:52:53.413048 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.443601 4756 scope.go:117] "RemoveContainer" containerID="8561e60ecbbebbe2c6b626af86ae800288c6bb96ca12c901dab9558f1f4ab046" Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.479582 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744b879ff-nmmkl"] Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.490771 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744b879ff-nmmkl"] Dec 05 17:52:53 crc kubenswrapper[4756]: I1205 17:52:53.494473 4756 scope.go:117] "RemoveContainer" containerID="6b5a9908bca7e41a64a674dbf6cbb63be6b207127fbfed3e63c4b2167cea9c0c" Dec 05 17:52:54 crc kubenswrapper[4756]: I1205 17:52:54.068600 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" path="/var/lib/kubelet/pods/888ad42e-520e-40d0-a7b7-2d0edd7667a6/volumes" Dec 05 17:52:58 crc kubenswrapper[4756]: I1205 17:52:58.503150 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 17:52:59 crc kubenswrapper[4756]: I1205 17:52:59.755179 4756 scope.go:117] "RemoveContainer" containerID="432927b238449feb6c4705064a876afad9a5c0f8dcb13ad2a84b5a633b257239" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.049552 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:53:06 crc kubenswrapper[4756]: E1205 17:53:06.050533 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.313596 4756 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m47ch"] Dec 05 17:53:06 crc kubenswrapper[4756]: E1205 17:53:06.314142 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" containerName="dnsmasq-dns" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.314172 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" containerName="dnsmasq-dns" Dec 05 17:53:06 crc kubenswrapper[4756]: E1205 17:53:06.314197 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" containerName="init" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.314212 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" containerName="init" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.314595 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="888ad42e-520e-40d0-a7b7-2d0edd7667a6" containerName="dnsmasq-dns" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.317235 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.332187 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m47ch"] Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.489789 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-catalog-content\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.489980 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t28hn\" (UniqueName: \"kubernetes.io/projected/fbe99a86-217c-4292-a640-e0f67d74d96b-kube-api-access-t28hn\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.490009 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-utilities\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.591568 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t28hn\" (UniqueName: \"kubernetes.io/projected/fbe99a86-217c-4292-a640-e0f67d74d96b-kube-api-access-t28hn\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.591603 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-utilities\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.591643 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-catalog-content\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.592023 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-catalog-content\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.592113 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-utilities\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.610264 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t28hn\" (UniqueName: \"kubernetes.io/projected/fbe99a86-217c-4292-a640-e0f67d74d96b-kube-api-access-t28hn\") pod \"community-operators-m47ch\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:06 crc kubenswrapper[4756]: I1205 17:53:06.656245 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.186391 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m47ch"] Dec 05 17:53:07 crc kubenswrapper[4756]: W1205 17:53:07.196027 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe99a86_217c_4292_a640_e0f67d74d96b.slice/crio-a0dafb32f3343d47e0fe0c3c2003dadbc1a450e449f439703cee67fcf12a032c WatchSource:0}: Error finding container a0dafb32f3343d47e0fe0c3c2003dadbc1a450e449f439703cee67fcf12a032c: Status 404 returned error can't find the container with id a0dafb32f3343d47e0fe0c3c2003dadbc1a450e449f439703cee67fcf12a032c Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.570689 4756 generic.go:334] "Generic (PLEG): container finished" podID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerID="23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd" exitCode=0 Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.571026 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m47ch" event={"ID":"fbe99a86-217c-4292-a640-e0f67d74d96b","Type":"ContainerDied","Data":"23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd"} Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.571079 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m47ch" event={"ID":"fbe99a86-217c-4292-a640-e0f67d74d96b","Type":"ContainerStarted","Data":"a0dafb32f3343d47e0fe0c3c2003dadbc1a450e449f439703cee67fcf12a032c"} Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.573628 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.726810 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8knx9"] Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.731179 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.739910 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8knx9"] Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.920187 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kss9r\" (UniqueName: \"kubernetes.io/projected/7d87358b-53b1-4bb4-9554-b0e801d7c029-kube-api-access-kss9r\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.920337 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-utilities\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:07 crc kubenswrapper[4756]: I1205 17:53:07.920385 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-catalog-content\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.022842 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kss9r\" (UniqueName: \"kubernetes.io/projected/7d87358b-53b1-4bb4-9554-b0e801d7c029-kube-api-access-kss9r\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.023015 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-utilities\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.023060 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-catalog-content\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.023745 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-catalog-content\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.023792 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-utilities\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") 
" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.041843 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kss9r\" (UniqueName: \"kubernetes.io/projected/7d87358b-53b1-4bb4-9554-b0e801d7c029-kube-api-access-kss9r\") pod \"redhat-marketplace-8knx9\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.078602 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.554916 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8knx9"] Dec 05 17:53:08 crc kubenswrapper[4756]: W1205 17:53:08.562353 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d87358b_53b1_4bb4_9554_b0e801d7c029.slice/crio-86bd9f79f6c23f4a766d6e18c0168dfb3780a53a9626648fce65568cca8b6834 WatchSource:0}: Error finding container 86bd9f79f6c23f4a766d6e18c0168dfb3780a53a9626648fce65568cca8b6834: Status 404 returned error can't find the container with id 86bd9f79f6c23f4a766d6e18c0168dfb3780a53a9626648fce65568cca8b6834 Dec 05 17:53:08 crc kubenswrapper[4756]: I1205 17:53:08.585339 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8knx9" event={"ID":"7d87358b-53b1-4bb4-9554-b0e801d7c029","Type":"ContainerStarted","Data":"86bd9f79f6c23f4a766d6e18c0168dfb3780a53a9626648fce65568cca8b6834"} Dec 05 17:53:09 crc kubenswrapper[4756]: I1205 17:53:09.599462 4756 generic.go:334] "Generic (PLEG): container finished" podID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerID="2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae" exitCode=0 Dec 05 17:53:09 crc kubenswrapper[4756]: I1205 17:53:09.599590 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8knx9" event={"ID":"7d87358b-53b1-4bb4-9554-b0e801d7c029","Type":"ContainerDied","Data":"2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae"} Dec 05 17:53:09 crc kubenswrapper[4756]: I1205 17:53:09.617840 4756 generic.go:334] "Generic (PLEG): container finished" podID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerID="b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2" exitCode=0 Dec 05 17:53:09 crc kubenswrapper[4756]: I1205 17:53:09.617876 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m47ch" event={"ID":"fbe99a86-217c-4292-a640-e0f67d74d96b","Type":"ContainerDied","Data":"b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2"} Dec 05 17:53:10 crc kubenswrapper[4756]: I1205 17:53:10.628348 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8knx9" event={"ID":"7d87358b-53b1-4bb4-9554-b0e801d7c029","Type":"ContainerStarted","Data":"ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd"} Dec 05 17:53:10 crc kubenswrapper[4756]: I1205 17:53:10.633169 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m47ch" event={"ID":"fbe99a86-217c-4292-a640-e0f67d74d96b","Type":"ContainerStarted","Data":"0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e"} Dec 05 17:53:10 crc kubenswrapper[4756]: I1205 17:53:10.670191 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m47ch" podStartSLOduration=2.171865341 podStartE2EDuration="4.670170426s" podCreationTimestamp="2025-12-05 17:53:06 +0000 UTC" firstStartedPulling="2025-12-05 17:53:07.573263695 +0000 UTC m=+5847.811937217" lastFinishedPulling="2025-12-05 17:53:10.07156876 +0000 UTC m=+5850.310242302" observedRunningTime="2025-12-05 17:53:10.658198687 +0000 UTC m=+5850.896872209" watchObservedRunningTime="2025-12-05 17:53:10.670170426 +0000 UTC m=+5850.908843948"
Dec 05 17:53:11 crc kubenswrapper[4756]: I1205 17:53:11.644296 4756 generic.go:334] "Generic (PLEG): container finished" podID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerID="ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd" exitCode=0
Dec 05 17:53:11 crc kubenswrapper[4756]: I1205 17:53:11.644621 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8knx9" event={"ID":"7d87358b-53b1-4bb4-9554-b0e801d7c029","Type":"ContainerDied","Data":"ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd"}
Dec 05 17:53:12 crc kubenswrapper[4756]: I1205 17:53:12.656141 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8knx9" event={"ID":"7d87358b-53b1-4bb4-9554-b0e801d7c029","Type":"ContainerStarted","Data":"77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582"}
Dec 05 17:53:12 crc kubenswrapper[4756]: I1205 17:53:12.689378 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8knx9" podStartSLOduration=3.135995806 podStartE2EDuration="5.689346487s" podCreationTimestamp="2025-12-05 17:53:07 +0000 UTC" firstStartedPulling="2025-12-05 17:53:09.606212323 +0000 UTC m=+5849.844885845" lastFinishedPulling="2025-12-05 17:53:12.159563004 +0000 UTC m=+5852.398236526" observedRunningTime="2025-12-05 17:53:12.683942643 +0000 UTC m=+5852.922616165" watchObservedRunningTime="2025-12-05 17:53:12.689346487 +0000 UTC m=+5852.928020009"
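
Note on the two "Observed pod startup duration" records above: podStartE2EDuration is the wall-clock time from podCreationTimestamp to the observed running time, while podStartSLOduration additionally excludes the image-pull window (lastFinishedPulling minus firstStartedPulling), which is why it is the smaller figure. The redhat-marketplace-8knx9 record can be checked by hand; the Python sketch below redoes that arithmetic with the second counts copied from the record (a reconstruction for verifying the log, not kubelet source):

    # Seconds within the 17:53 minute, copied from the record above;
    # plain subtraction is safe because everything falls inside one minute.
    pod_created        = 7.0            # podCreationTimestamp     17:53:07
    first_started_pull = 9.606212323    # firstStartedPulling      17:53:09.606212323
    last_finished_pull = 12.159563004   # lastFinishedPulling      17:53:12.159563004
    observed_running   = 12.689346487   # watchObservedRunningTime 17:53:12.689346487

    e2e = observed_running - pod_created                   # podStartE2EDuration
    slo = e2e - (last_finished_pull - first_started_pull)  # podStartSLOduration
    print(f"e2e={e2e:.9f}s slo={slo:.9f}s")  # e2e=5.689346487s slo=3.135995806s

The cinder-scheduler-0 record further down shows the degenerate case: its pull timestamps are the zero value ("0001-01-01 00:00:00"), so no pull window is subtracted and both durations come out equal.
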
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.784561 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.793812 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.888521 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.888641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.888675 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.888780 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ddfbd76a-11a8-4e34-b037-494041ea66a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.888854 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5nns\" (UniqueName: \"kubernetes.io/projected/ddfbd76a-11a8-4e34-b037-494041ea66a9-kube-api-access-r5nns\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.888916 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.990019 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ddfbd76a-11a8-4e34-b037-494041ea66a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.990117 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5nns\" (UniqueName: \"kubernetes.io/projected/ddfbd76a-11a8-4e34-b037-494041ea66a9-kube-api-access-r5nns\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.990165 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.990209 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.990235 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ddfbd76a-11a8-4e34-b037-494041ea66a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.990259 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.990697 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.995948 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.996128 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:15 crc kubenswrapper[4756]: I1205 17:53:15.997893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:16 crc kubenswrapper[4756]: I1205 17:53:16.005738 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " pod="openstack/cinder-scheduler-0" Dec 05 17:53:16 crc kubenswrapper[4756]: I1205 17:53:16.010350 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5nns\" (UniqueName: \"kubernetes.io/projected/ddfbd76a-11a8-4e34-b037-494041ea66a9-kube-api-access-r5nns\") pod \"cinder-scheduler-0\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") " 
pod="openstack/cinder-scheduler-0" Dec 05 17:53:16 crc kubenswrapper[4756]: I1205 17:53:16.091565 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 17:53:16 crc kubenswrapper[4756]: I1205 17:53:16.656867 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:16 crc kubenswrapper[4756]: I1205 17:53:16.657199 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:16 crc kubenswrapper[4756]: I1205 17:53:16.723012 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:16 crc kubenswrapper[4756]: I1205 17:53:16.801497 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:16 crc kubenswrapper[4756]: I1205 17:53:16.961149 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m47ch"] Dec 05 17:53:17 crc kubenswrapper[4756]: I1205 17:53:17.044020 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 17:53:17 crc kubenswrapper[4756]: I1205 17:53:17.656750 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:53:17 crc kubenswrapper[4756]: I1205 17:53:17.658352 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerName="cinder-api-log" containerID="cri-o://3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd" gracePeriod=30 Dec 05 17:53:17 crc kubenswrapper[4756]: I1205 17:53:17.658496 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerName="cinder-api" containerID="cri-o://9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe" gracePeriod=30 Dec 05 17:53:17 crc kubenswrapper[4756]: I1205 17:53:17.701533 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ddfbd76a-11a8-4e34-b037-494041ea66a9","Type":"ContainerStarted","Data":"88ac396fc7fbe7c464cadd6a3a0d0509a356bf011c2f49c5c765ba9c33e2a48c"} Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.080112 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.082139 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.159175 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.710825 4756 generic.go:334] "Generic (PLEG): container finished" podID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerID="3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd" exitCode=143 Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.711181 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"43f1c275-9ce5-4965-8b25-099a1ceed149","Type":"ContainerDied","Data":"3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd"} Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.716089 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ddfbd76a-11a8-4e34-b037-494041ea66a9","Type":"ContainerStarted","Data":"8db7563fceda7c64faf1dc6fd53a2aea9b2b7f0a54217dabcf72ba0b12fb54aa"} Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.716129 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ddfbd76a-11a8-4e34-b037-494041ea66a9","Type":"ContainerStarted","Data":"3b7d6c4d316faeb04a57a78d7c779c61e10944c35d88d95a3fb6413691e09cf3"} Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.716283 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m47ch" podUID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerName="registry-server" containerID="cri-o://0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e" gracePeriod=2 Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.737274 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.737259143 podStartE2EDuration="3.737259143s" podCreationTimestamp="2025-12-05 17:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:53:18.73676297 +0000 UTC m=+5858.975436482" watchObservedRunningTime="2025-12-05 17:53:18.737259143 +0000 UTC m=+5858.975932655" Dec 05 17:53:18 crc kubenswrapper[4756]: I1205 17:53:18.803173 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.288124 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.360557 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8knx9"] Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.470100 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-utilities\") pod \"fbe99a86-217c-4292-a640-e0f67d74d96b\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.470181 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t28hn\" (UniqueName: \"kubernetes.io/projected/fbe99a86-217c-4292-a640-e0f67d74d96b-kube-api-access-t28hn\") pod \"fbe99a86-217c-4292-a640-e0f67d74d96b\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.470289 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-catalog-content\") pod \"fbe99a86-217c-4292-a640-e0f67d74d96b\" (UID: \"fbe99a86-217c-4292-a640-e0f67d74d96b\") " Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.480342 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-utilities" (OuterVolumeSpecName: "utilities") pod "fbe99a86-217c-4292-a640-e0f67d74d96b" (UID: "fbe99a86-217c-4292-a640-e0f67d74d96b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.489147 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe99a86-217c-4292-a640-e0f67d74d96b-kube-api-access-t28hn" (OuterVolumeSpecName: "kube-api-access-t28hn") pod "fbe99a86-217c-4292-a640-e0f67d74d96b" (UID: "fbe99a86-217c-4292-a640-e0f67d74d96b"). InnerVolumeSpecName "kube-api-access-t28hn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.518709 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbe99a86-217c-4292-a640-e0f67d74d96b" (UID: "fbe99a86-217c-4292-a640-e0f67d74d96b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.572389 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.572479 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t28hn\" (UniqueName: \"kubernetes.io/projected/fbe99a86-217c-4292-a640-e0f67d74d96b-kube-api-access-t28hn\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.572493 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe99a86-217c-4292-a640-e0f67d74d96b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.725940 4756 generic.go:334] "Generic (PLEG): container finished" podID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerID="0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e" exitCode=0 Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.726020 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m47ch" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.726051 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m47ch" event={"ID":"fbe99a86-217c-4292-a640-e0f67d74d96b","Type":"ContainerDied","Data":"0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e"} Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.727426 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m47ch" event={"ID":"fbe99a86-217c-4292-a640-e0f67d74d96b","Type":"ContainerDied","Data":"a0dafb32f3343d47e0fe0c3c2003dadbc1a450e449f439703cee67fcf12a032c"} Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.727463 4756 scope.go:117] "RemoveContainer" containerID="0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.766398 4756 scope.go:117] "RemoveContainer" containerID="b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.774966 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m47ch"] Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.790343 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m47ch"] Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.811775 4756 scope.go:117] "RemoveContainer" containerID="23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.846001 4756 scope.go:117] "RemoveContainer" containerID="0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e" Dec 05 17:53:19 crc kubenswrapper[4756]: E1205 17:53:19.846715 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e\": container with ID starting with 0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e not found: ID does not exist" containerID="0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e" Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.846771 
4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e"} err="failed to get container status \"0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e\": rpc error: code = NotFound desc = could not find container \"0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e\": container with ID starting with 0c43cae01506957625314de53d8a60cf09c9f67d41a3f4c2107b40d2dd36697e not found: ID does not exist"
Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.846802 4756 scope.go:117] "RemoveContainer" containerID="b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2"
Dec 05 17:53:19 crc kubenswrapper[4756]: E1205 17:53:19.847144 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2\": container with ID starting with b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2 not found: ID does not exist" containerID="b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2"
Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.847172 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2"} err="failed to get container status \"b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2\": rpc error: code = NotFound desc = could not find container \"b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2\": container with ID starting with b855452f6bdbc055167a3eddab6d31a23d8c9c875afa1fd9fc983d324f8a3ee2 not found: ID does not exist"
Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.847187 4756 scope.go:117] "RemoveContainer" containerID="23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd"
Dec 05 17:53:19 crc kubenswrapper[4756]: E1205 17:53:19.847463 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd\": container with ID starting with 23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd not found: ID does not exist" containerID="23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd"
Dec 05 17:53:19 crc kubenswrapper[4756]: I1205 17:53:19.847505 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd"} err="failed to get container status \"23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd\": rpc error: code = NotFound desc = could not find container \"23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd\": container with ID starting with 23bd6cbbd7694efbd1c0d2cc1e3a6cf3d6cc4a2ea3cace15b56c8ad7174feadd not found: ID does not exist"
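
Note on the RemoveContainer / "DeleteContainer returned error" exchanges above: all three IDs (0c43cae0..., b855452f..., 23bd6cbb...) belong to the community-operators-m47ch pod whose REMOVE was processed just before, so the likely reading is a benign cleanup race rather than a fault: the containers were already gone from CRI-O when the second RemoveContainer pass asked for their status, the runtime answered gRPC NotFound, and the kubelet logged the error and treated them as already deleted. A sketch of that idempotent-delete pattern (illustrative Python with hypothetical names, not the kubelet or CRI API):

    class NotFoundError(Exception):
        """Stand-in for a gRPC NOT_FOUND answer from the container runtime."""

    def remove_container(runtime, container_id: str) -> None:
        # Removal must be idempotent: a NotFound only means another
        # cleanup pass (or the runtime itself) already deleted it.
        try:
            runtime.delete(container_id)
        except NotFoundError:
            pass  # already gone; nothing to roll back
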
containerID="cri-o://77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582" gracePeriod=2 Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.050273 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.050707 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.092700 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.261987 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.267192 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.412772 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-public-tls-certs\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413108 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7qr9\" (UniqueName: \"kubernetes.io/projected/43f1c275-9ce5-4965-8b25-099a1ceed149-kube-api-access-s7qr9\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413151 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-catalog-content\") pod \"7d87358b-53b1-4bb4-9554-b0e801d7c029\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413183 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413212 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-scripts\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413261 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-combined-ca-bundle\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413324 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kss9r\" (UniqueName: \"kubernetes.io/projected/7d87358b-53b1-4bb4-9554-b0e801d7c029-kube-api-access-kss9r\") pod \"7d87358b-53b1-4bb4-9554-b0e801d7c029\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413362 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-internal-tls-certs\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413469 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43f1c275-9ce5-4965-8b25-099a1ceed149-etc-machine-id\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413501 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-utilities\") pod \"7d87358b-53b1-4bb4-9554-b0e801d7c029\" (UID: \"7d87358b-53b1-4bb4-9554-b0e801d7c029\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413581 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data-custom\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413606 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f1c275-9ce5-4965-8b25-099a1ceed149-logs\") pod \"43f1c275-9ce5-4965-8b25-099a1ceed149\" (UID: \"43f1c275-9ce5-4965-8b25-099a1ceed149\") " Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.413906 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43f1c275-9ce5-4965-8b25-099a1ceed149-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.415358 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f1c275-9ce5-4965-8b25-099a1ceed149-logs" (OuterVolumeSpecName: "logs") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.415800 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-utilities" (OuterVolumeSpecName: "utilities") pod "7d87358b-53b1-4bb4-9554-b0e801d7c029" (UID: "7d87358b-53b1-4bb4-9554-b0e801d7c029"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.416186 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43f1c275-9ce5-4965-8b25-099a1ceed149-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.416209 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.416224 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f1c275-9ce5-4965-8b25-099a1ceed149-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.421770 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d87358b-53b1-4bb4-9554-b0e801d7c029-kube-api-access-kss9r" (OuterVolumeSpecName: "kube-api-access-kss9r") pod "7d87358b-53b1-4bb4-9554-b0e801d7c029" (UID: "7d87358b-53b1-4bb4-9554-b0e801d7c029"). InnerVolumeSpecName "kube-api-access-kss9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.422758 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-scripts" (OuterVolumeSpecName: "scripts") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.426689 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.427303 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f1c275-9ce5-4965-8b25-099a1ceed149-kube-api-access-s7qr9" (OuterVolumeSpecName: "kube-api-access-s7qr9") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "kube-api-access-s7qr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.434490 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d87358b-53b1-4bb4-9554-b0e801d7c029" (UID: "7d87358b-53b1-4bb4-9554-b0e801d7c029"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.464530 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.481887 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.483578 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.498907 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data" (OuterVolumeSpecName: "config-data") pod "43f1c275-9ce5-4965-8b25-099a1ceed149" (UID: "43f1c275-9ce5-4965-8b25-099a1ceed149"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.517892 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.517937 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.517952 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7qr9\" (UniqueName: \"kubernetes.io/projected/43f1c275-9ce5-4965-8b25-099a1ceed149-kube-api-access-s7qr9\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.517968 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.517979 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d87358b-53b1-4bb4-9554-b0e801d7c029-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.517989 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.517997 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.518005 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kss9r\" (UniqueName: \"kubernetes.io/projected/7d87358b-53b1-4bb4-9554-b0e801d7c029-kube-api-access-kss9r\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc 
kubenswrapper[4756]: I1205 17:53:21.518013 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/43f1c275-9ce5-4965-8b25-099a1ceed149-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.751042 4756 generic.go:334] "Generic (PLEG): container finished" podID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerID="9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe" exitCode=0 Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.751118 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43f1c275-9ce5-4965-8b25-099a1ceed149","Type":"ContainerDied","Data":"9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe"} Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.751150 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.751167 4756 scope.go:117] "RemoveContainer" containerID="9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.751152 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43f1c275-9ce5-4965-8b25-099a1ceed149","Type":"ContainerDied","Data":"d7c9a0a8dd9621997de2035913f4f90979f8bcc91f019982963835fa58ed27d1"} Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.755956 4756 generic.go:334] "Generic (PLEG): container finished" podID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerID="77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582" exitCode=0 Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.756001 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8knx9" event={"ID":"7d87358b-53b1-4bb4-9554-b0e801d7c029","Type":"ContainerDied","Data":"77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582"} Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.756022 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8knx9" event={"ID":"7d87358b-53b1-4bb4-9554-b0e801d7c029","Type":"ContainerDied","Data":"86bd9f79f6c23f4a766d6e18c0168dfb3780a53a9626648fce65568cca8b6834"} Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.756088 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8knx9" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.779565 4756 scope.go:117] "RemoveContainer" containerID="3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.795515 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8knx9"] Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.811520 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8knx9"] Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.817739 4756 scope.go:117] "RemoveContainer" containerID="9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.818145 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe\": container with ID starting with 9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe not found: ID does not exist" containerID="9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.818177 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe"} err="failed to get container status \"9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe\": rpc error: code = NotFound desc = could not find container \"9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe\": container with ID starting with 9d5d5060b1fdc771daabc19cb96858213dc8fc0d8795f2560dbbea1896eb38fe not found: ID does not exist" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.818198 4756 scope.go:117] "RemoveContainer" containerID="3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.818484 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd\": container with ID starting with 3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd not found: ID does not exist" containerID="3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.818508 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd"} err="failed to get container status \"3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd\": rpc error: code = NotFound desc = could not find container \"3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd\": container with ID starting with 3abf9deb3f1e0888d3e28f8066e930a1c91f204e6b06ae8c7b68034341b79edd not found: ID does not exist" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.818520 4756 scope.go:117] "RemoveContainer" containerID="77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.824676 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.832233 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 
05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.843542 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.844094 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerName="extract-content" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844120 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerName="extract-content" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.844142 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerName="extract-utilities" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844152 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerName="extract-utilities" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.844172 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerName="registry-server" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844181 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerName="registry-server" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.844198 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerName="extract-utilities" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844206 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerName="extract-utilities" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.844221 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerName="cinder-api-log" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844231 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerName="cinder-api-log" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.844257 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerName="registry-server" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844266 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerName="registry-server" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.844277 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerName="cinder-api" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844286 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerName="cinder-api" Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.844304 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerName="extract-content" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844314 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerName="extract-content" Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844641 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d87358b-53b1-4bb4-9554-b0e801d7c029" containerName="registry-server" Dec 05 17:53:21 crc 
kubenswrapper[4756]: I1205 17:53:21.844677 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerName="cinder-api-log"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844695 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe99a86-217c-4292-a640-e0f67d74d96b" containerName="registry-server"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.844717 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" containerName="cinder-api"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.846809 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.849445 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.849974 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.850117 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.850159 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.875033 4756 scope.go:117] "RemoveContainer" containerID="ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.935789 4756 scope.go:117] "RemoveContainer" containerID="2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.964486 4756 scope.go:117] "RemoveContainer" containerID="77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582"
Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.964808 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582\": container with ID starting with 77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582 not found: ID does not exist" containerID="77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.964842 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582"} err="failed to get container status \"77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582\": rpc error: code = NotFound desc = could not find container \"77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582\": container with ID starting with 77472d516b31f135be5c0d35aaa4aa9952f8532d13f632079634eb6e3bf7b582 not found: ID does not exist"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.964865 4756 scope.go:117] "RemoveContainer" containerID="ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd"
Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.965075 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd\": container with ID starting with ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd not found: ID does not exist" containerID="ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.965098 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd"} err="failed to get container status \"ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd\": rpc error: code = NotFound desc = could not find container \"ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd\": container with ID starting with ce031d5243611c9411ad9b03d199c6c898b78bb457570dfb09696924b5090acd not found: ID does not exist"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.965115 4756 scope.go:117] "RemoveContainer" containerID="2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae"
Dec 05 17:53:21 crc kubenswrapper[4756]: E1205 17:53:21.965286 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae\": container with ID starting with 2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae not found: ID does not exist" containerID="2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae"
Dec 05 17:53:21 crc kubenswrapper[4756]: I1205 17:53:21.965307 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae"} err="failed to get container status \"2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae\": rpc error: code = NotFound desc = could not find container \"2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae\": container with ID starting with 2e837da62f3b3c463afccdcf1a3db51afd2bcf6be4c34ea9d16ffc0591c560ae not found: ID does not exist"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.027110 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.027165 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.027225 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-config-data\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.027268 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-scripts\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.027508 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.027618 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw58z\" (UniqueName: \"kubernetes.io/projected/48912894-5075-4468-b597-ea47a1099fbb-kube-api-access-cw58z\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.027702 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48912894-5075-4468-b597-ea47a1099fbb-logs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.027982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.028018 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48912894-5075-4468-b597-ea47a1099fbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.069736 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f1c275-9ce5-4965-8b25-099a1ceed149" path="/var/lib/kubelet/pods/43f1c275-9ce5-4965-8b25-099a1ceed149/volumes"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.070330 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d87358b-53b1-4bb4-9554-b0e801d7c029" path="/var/lib/kubelet/pods/7d87358b-53b1-4bb4-9554-b0e801d7c029/volumes"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130281 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130353 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130404 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-config-data\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130476 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-scripts\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130535 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130563 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw58z\" (UniqueName: \"kubernetes.io/projected/48912894-5075-4468-b597-ea47a1099fbb-kube-api-access-cw58z\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130589 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48912894-5075-4468-b597-ea47a1099fbb-logs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130625 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130651 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48912894-5075-4468-b597-ea47a1099fbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.130977 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48912894-5075-4468-b597-ea47a1099fbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.131060 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48912894-5075-4468-b597-ea47a1099fbb-logs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.134245 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.134388 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.135042 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.135692 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-config-data\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.135851 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.136342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48912894-5075-4468-b597-ea47a1099fbb-scripts\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.149558 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw58z\" (UniqueName: \"kubernetes.io/projected/48912894-5075-4468-b597-ea47a1099fbb-kube-api-access-cw58z\") pod \"cinder-api-0\" (UID: \"48912894-5075-4468-b597-ea47a1099fbb\") " pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.218854 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 17:53:22 crc kubenswrapper[4756]: I1205 17:53:22.753144 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 17:53:23 crc kubenswrapper[4756]: I1205 17:53:23.778628 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48912894-5075-4468-b597-ea47a1099fbb","Type":"ContainerStarted","Data":"4d14c89a4e279a53c298ad620a30f9e4faca8dca6ec2ac3017a4fa38f9e37d19"}
Dec 05 17:53:23 crc kubenswrapper[4756]: I1205 17:53:23.779271 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48912894-5075-4468-b597-ea47a1099fbb","Type":"ContainerStarted","Data":"78144b18e8d65cd5f041ca423960c95c6d39c79db72967917335c98f660c646f"}
Dec 05 17:53:24 crc kubenswrapper[4756]: I1205 17:53:24.795149 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"48912894-5075-4468-b597-ea47a1099fbb","Type":"ContainerStarted","Data":"5b91139b417211f533cc03744cc40d8078ad21fa41febdf41a3d26138711b490"}
Dec 05 17:53:24 crc kubenswrapper[4756]: I1205 17:53:24.795389 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 05 17:53:24 crc kubenswrapper[4756]: I1205 17:53:24.826492 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.826469159 podStartE2EDuration="3.826469159s" podCreationTimestamp="2025-12-05 17:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:53:24.823119869 +0000 UTC m=+5865.061793391" watchObservedRunningTime="2025-12-05 17:53:24.826469159 +0000 UTC m=+5865.065142711"
Dec 05 17:53:26 crc kubenswrapper[4756]: I1205 17:53:26.306003 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 05 17:53:26 crc kubenswrapper[4756]: I1205 17:53:26.401907 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 17:53:26 crc kubenswrapper[4756]: I1205 17:53:26.829720 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerName="cinder-scheduler" containerID="cri-o://3b7d6c4d316faeb04a57a78d7c779c61e10944c35d88d95a3fb6413691e09cf3" gracePeriod=30
Dec 05 17:53:26 crc kubenswrapper[4756]: I1205 17:53:26.829824 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerName="probe" containerID="cri-o://8db7563fceda7c64faf1dc6fd53a2aea9b2b7f0a54217dabcf72ba0b12fb54aa" gracePeriod=30
Dec 05 17:53:27 crc kubenswrapper[4756]: I1205 17:53:27.853310 4756 generic.go:334] "Generic (PLEG): container finished" podID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerID="8db7563fceda7c64faf1dc6fd53a2aea9b2b7f0a54217dabcf72ba0b12fb54aa" exitCode=0
Dec 05 17:53:27 crc kubenswrapper[4756]: I1205 17:53:27.853506 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ddfbd76a-11a8-4e34-b037-494041ea66a9","Type":"ContainerDied","Data":"8db7563fceda7c64faf1dc6fd53a2aea9b2b7f0a54217dabcf72ba0b12fb54aa"}
Dec 05 17:53:28 crc kubenswrapper[4756]: I1205 17:53:28.874622 4756 generic.go:334] "Generic (PLEG): container finished" podID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerID="3b7d6c4d316faeb04a57a78d7c779c61e10944c35d88d95a3fb6413691e09cf3" exitCode=0
Dec 05 17:53:28 crc kubenswrapper[4756]: I1205 17:53:28.875087 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ddfbd76a-11a8-4e34-b037-494041ea66a9","Type":"ContainerDied","Data":"3b7d6c4d316faeb04a57a78d7c779c61e10944c35d88d95a3fb6413691e09cf3"}
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.033354 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.167787 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data-custom\") pod \"ddfbd76a-11a8-4e34-b037-494041ea66a9\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") "
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.167958 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data\") pod \"ddfbd76a-11a8-4e34-b037-494041ea66a9\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") "
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.168054 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-combined-ca-bundle\") pod \"ddfbd76a-11a8-4e34-b037-494041ea66a9\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") "
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.168109 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5nns\" (UniqueName: \"kubernetes.io/projected/ddfbd76a-11a8-4e34-b037-494041ea66a9-kube-api-access-r5nns\") pod \"ddfbd76a-11a8-4e34-b037-494041ea66a9\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") "
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.168178 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-scripts\") pod \"ddfbd76a-11a8-4e34-b037-494041ea66a9\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") "
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.168217 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ddfbd76a-11a8-4e34-b037-494041ea66a9-etc-machine-id\") pod \"ddfbd76a-11a8-4e34-b037-494041ea66a9\" (UID: \"ddfbd76a-11a8-4e34-b037-494041ea66a9\") "
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.168672 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ddfbd76a-11a8-4e34-b037-494041ea66a9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ddfbd76a-11a8-4e34-b037-494041ea66a9" (UID: "ddfbd76a-11a8-4e34-b037-494041ea66a9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.174462 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ddfbd76a-11a8-4e34-b037-494041ea66a9" (UID: "ddfbd76a-11a8-4e34-b037-494041ea66a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.182841 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-scripts" (OuterVolumeSpecName: "scripts") pod "ddfbd76a-11a8-4e34-b037-494041ea66a9" (UID: "ddfbd76a-11a8-4e34-b037-494041ea66a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.187652 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfbd76a-11a8-4e34-b037-494041ea66a9-kube-api-access-r5nns" (OuterVolumeSpecName: "kube-api-access-r5nns") pod "ddfbd76a-11a8-4e34-b037-494041ea66a9" (UID: "ddfbd76a-11a8-4e34-b037-494041ea66a9"). InnerVolumeSpecName "kube-api-access-r5nns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.239871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddfbd76a-11a8-4e34-b037-494041ea66a9" (UID: "ddfbd76a-11a8-4e34-b037-494041ea66a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.270298 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.270349 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5nns\" (UniqueName: \"kubernetes.io/projected/ddfbd76a-11a8-4e34-b037-494041ea66a9-kube-api-access-r5nns\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.270366 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.270384 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ddfbd76a-11a8-4e34-b037-494041ea66a9-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.270399 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.315794 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data" (OuterVolumeSpecName: "config-data") pod "ddfbd76a-11a8-4e34-b037-494041ea66a9" (UID: "ddfbd76a-11a8-4e34-b037-494041ea66a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.371892 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddfbd76a-11a8-4e34-b037-494041ea66a9-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.885876 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ddfbd76a-11a8-4e34-b037-494041ea66a9","Type":"ContainerDied","Data":"88ac396fc7fbe7c464cadd6a3a0d0509a356bf011c2f49c5c765ba9c33e2a48c"}
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.885948 4756 scope.go:117] "RemoveContainer" containerID="8db7563fceda7c64faf1dc6fd53a2aea9b2b7f0a54217dabcf72ba0b12fb54aa"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.886629 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.927593 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.934083 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.945997 4756 scope.go:117] "RemoveContainer" containerID="3b7d6c4d316faeb04a57a78d7c779c61e10944c35d88d95a3fb6413691e09cf3"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.979454 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 17:53:29 crc kubenswrapper[4756]: E1205 17:53:29.980125 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerName="probe"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.980207 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerName="probe"
Dec 05 17:53:29 crc kubenswrapper[4756]: E1205 17:53:29.980280 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerName="cinder-scheduler"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.980368 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerName="cinder-scheduler"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.980662 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerName="cinder-scheduler"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.980757 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" containerName="probe"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.981955 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 05 17:53:29 crc kubenswrapper[4756]: I1205 17:53:29.984491 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.001916 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.061379 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddfbd76a-11a8-4e34-b037-494041ea66a9" path="/var/lib/kubelet/pods/ddfbd76a-11a8-4e34-b037-494041ea66a9/volumes"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.085308 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf6a65a9-f08a-4a67-b139-099756a8563b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.085363 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mml7z\" (UniqueName: \"kubernetes.io/projected/cf6a65a9-f08a-4a67-b139-099756a8563b-kube-api-access-mml7z\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.085602 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.085740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-config-data\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.085890 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.085913 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-scripts\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.187936 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.188585 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-scripts\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.188625 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf6a65a9-f08a-4a67-b139-099756a8563b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.188811 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mml7z\" (UniqueName: \"kubernetes.io/projected/cf6a65a9-f08a-4a67-b139-099756a8563b-kube-api-access-mml7z\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.189212 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.189558 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-config-data\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.189210 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf6a65a9-f08a-4a67-b139-099756a8563b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.196072 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.207482 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-scripts\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.211619 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-config-data\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.212218 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf6a65a9-f08a-4a67-b139-099756a8563b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.214249 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mml7z\" (UniqueName: \"kubernetes.io/projected/cf6a65a9-f08a-4a67-b139-099756a8563b-kube-api-access-mml7z\") pod \"cinder-scheduler-0\" (UID: \"cf6a65a9-f08a-4a67-b139-099756a8563b\") " pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.316555 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 05 17:53:30 crc kubenswrapper[4756]: W1205 17:53:30.844276 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf6a65a9_f08a_4a67_b139_099756a8563b.slice/crio-4ac1dbc173dedfc0dd5fd57800f267e0a08f1afe4afb5573b5688fa21d789324 WatchSource:0}: Error finding container 4ac1dbc173dedfc0dd5fd57800f267e0a08f1afe4afb5573b5688fa21d789324: Status 404 returned error can't find the container with id 4ac1dbc173dedfc0dd5fd57800f267e0a08f1afe4afb5573b5688fa21d789324
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.844557 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 05 17:53:30 crc kubenswrapper[4756]: I1205 17:53:30.911795 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf6a65a9-f08a-4a67-b139-099756a8563b","Type":"ContainerStarted","Data":"4ac1dbc173dedfc0dd5fd57800f267e0a08f1afe4afb5573b5688fa21d789324"}
Dec 05 17:53:31 crc kubenswrapper[4756]: I1205 17:53:31.926762 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf6a65a9-f08a-4a67-b139-099756a8563b","Type":"ContainerStarted","Data":"32013e29352d7895bc318484448ad826187c4770ee2d3ec99bbd83524b0977fc"}
Dec 05 17:53:32 crc kubenswrapper[4756]: I1205 17:53:32.937086 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf6a65a9-f08a-4a67-b139-099756a8563b","Type":"ContainerStarted","Data":"9f86d554e21604d459c3b2d13a699d16ff577a8bbbeba9572e4c22d2b455a451"}
Dec 05 17:53:32 crc kubenswrapper[4756]: I1205 17:53:32.958612 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.958595757 podStartE2EDuration="3.958595757s" podCreationTimestamp="2025-12-05 17:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:53:32.953700166 +0000 UTC m=+5873.192373678" watchObservedRunningTime="2025-12-05 17:53:32.958595757 +0000 UTC m=+5873.197269269"
Dec 05 17:53:34 crc kubenswrapper[4756]: I1205 17:53:34.050612 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a"
Dec 05 17:53:34 crc kubenswrapper[4756]: E1205 17:53:34.051338 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:53:34 crc kubenswrapper[4756]: I1205 17:53:34.067749 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Dec 05 17:53:35 crc kubenswrapper[4756]: I1205 17:53:35.317669 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 05 17:53:40 crc kubenswrapper[4756]: I1205 17:53:40.525856 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.093462 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hngjr"]
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.095184 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.111567 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hngjr"]
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.195669 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b531-account-create-update-kk8lh"]
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.197016 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.199251 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.203710 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b531-account-create-update-kk8lh"]
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.243166 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ljfb\" (UniqueName: \"kubernetes.io/projected/f897b315-f80b-4353-b12f-83409ab6bb17-kube-api-access-9ljfb\") pod \"glance-db-create-hngjr\" (UID: \"f897b315-f80b-4353-b12f-83409ab6bb17\") " pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.243221 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f897b315-f80b-4353-b12f-83409ab6bb17-operator-scripts\") pod \"glance-db-create-hngjr\" (UID: \"f897b315-f80b-4353-b12f-83409ab6bb17\") " pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.344221 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ljfb\" (UniqueName: \"kubernetes.io/projected/f897b315-f80b-4353-b12f-83409ab6bb17-kube-api-access-9ljfb\") pod \"glance-db-create-hngjr\" (UID: \"f897b315-f80b-4353-b12f-83409ab6bb17\") " pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.344274 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f897b315-f80b-4353-b12f-83409ab6bb17-operator-scripts\") pod \"glance-db-create-hngjr\" (UID: \"f897b315-f80b-4353-b12f-83409ab6bb17\") " pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.344325 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7caec4ac-8162-4d47-ac4a-e1372071db96-operator-scripts\") pod \"glance-b531-account-create-update-kk8lh\" (UID: \"7caec4ac-8162-4d47-ac4a-e1372071db96\") " pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.344425 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z8qv\" (UniqueName: \"kubernetes.io/projected/7caec4ac-8162-4d47-ac4a-e1372071db96-kube-api-access-4z8qv\") pod \"glance-b531-account-create-update-kk8lh\" (UID: \"7caec4ac-8162-4d47-ac4a-e1372071db96\") " pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.345126 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f897b315-f80b-4353-b12f-83409ab6bb17-operator-scripts\") pod \"glance-db-create-hngjr\" (UID: \"f897b315-f80b-4353-b12f-83409ab6bb17\") " pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.372304 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ljfb\" (UniqueName: \"kubernetes.io/projected/f897b315-f80b-4353-b12f-83409ab6bb17-kube-api-access-9ljfb\") pod \"glance-db-create-hngjr\" (UID: \"f897b315-f80b-4353-b12f-83409ab6bb17\") " pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.421121 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.446741 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7caec4ac-8162-4d47-ac4a-e1372071db96-operator-scripts\") pod \"glance-b531-account-create-update-kk8lh\" (UID: \"7caec4ac-8162-4d47-ac4a-e1372071db96\") " pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.446924 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z8qv\" (UniqueName: \"kubernetes.io/projected/7caec4ac-8162-4d47-ac4a-e1372071db96-kube-api-access-4z8qv\") pod \"glance-b531-account-create-update-kk8lh\" (UID: \"7caec4ac-8162-4d47-ac4a-e1372071db96\") " pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.446944 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7caec4ac-8162-4d47-ac4a-e1372071db96-operator-scripts\") pod \"glance-b531-account-create-update-kk8lh\" (UID: \"7caec4ac-8162-4d47-ac4a-e1372071db96\") " pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.476038 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z8qv\" (UniqueName: \"kubernetes.io/projected/7caec4ac-8162-4d47-ac4a-e1372071db96-kube-api-access-4z8qv\") pod \"glance-b531-account-create-update-kk8lh\" (UID: \"7caec4ac-8162-4d47-ac4a-e1372071db96\") " pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.529333 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.910378 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hngjr"]
Dec 05 17:53:43 crc kubenswrapper[4756]: W1205 17:53:43.913741 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf897b315_f80b_4353_b12f_83409ab6bb17.slice/crio-5cd186309465bcd650da05a6f839c065b1344f40597000164453f79bf969939a WatchSource:0}: Error finding container 5cd186309465bcd650da05a6f839c065b1344f40597000164453f79bf969939a: Status 404 returned error can't find the container with id 5cd186309465bcd650da05a6f839c065b1344f40597000164453f79bf969939a
Dec 05 17:53:43 crc kubenswrapper[4756]: I1205 17:53:43.984727 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b531-account-create-update-kk8lh"]
Dec 05 17:53:44 crc kubenswrapper[4756]: I1205 17:53:44.046162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hngjr" event={"ID":"f897b315-f80b-4353-b12f-83409ab6bb17","Type":"ContainerStarted","Data":"5cd186309465bcd650da05a6f839c065b1344f40597000164453f79bf969939a"}
Dec 05 17:53:44 crc kubenswrapper[4756]: I1205 17:53:44.047265 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b531-account-create-update-kk8lh" event={"ID":"7caec4ac-8162-4d47-ac4a-e1372071db96","Type":"ContainerStarted","Data":"bed472d6bd08d6a4b05e14206fb77ace3e0aca96a3f92b5f5c3fcf1d45bd133f"}
Dec 05 17:53:45 crc kubenswrapper[4756]: I1205 17:53:45.063315 4756 generic.go:334] "Generic (PLEG): container finished" podID="7caec4ac-8162-4d47-ac4a-e1372071db96" containerID="0d986c55807f81424b35fb0eb146aec0ec4574f65ced380b93d72823a3eeb3c0" exitCode=0
Dec 05 17:53:45 crc kubenswrapper[4756]: I1205 17:53:45.063456 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b531-account-create-update-kk8lh" event={"ID":"7caec4ac-8162-4d47-ac4a-e1372071db96","Type":"ContainerDied","Data":"0d986c55807f81424b35fb0eb146aec0ec4574f65ced380b93d72823a3eeb3c0"}
Dec 05 17:53:45 crc kubenswrapper[4756]: I1205 17:53:45.069127 4756 generic.go:334] "Generic (PLEG): container finished" podID="f897b315-f80b-4353-b12f-83409ab6bb17" containerID="d2c8b2fcdcb79d1b70377ef1af6c3e4e7ed49c6e3ebcf669f7f9b255d4e13df3" exitCode=0
Dec 05 17:53:45 crc kubenswrapper[4756]: I1205 17:53:45.069217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hngjr" event={"ID":"f897b315-f80b-4353-b12f-83409ab6bb17","Type":"ContainerDied","Data":"d2c8b2fcdcb79d1b70377ef1af6c3e4e7ed49c6e3ebcf669f7f9b255d4e13df3"}
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.424361 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.512678 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z8qv\" (UniqueName: \"kubernetes.io/projected/7caec4ac-8162-4d47-ac4a-e1372071db96-kube-api-access-4z8qv\") pod \"7caec4ac-8162-4d47-ac4a-e1372071db96\" (UID: \"7caec4ac-8162-4d47-ac4a-e1372071db96\") "
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.512789 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7caec4ac-8162-4d47-ac4a-e1372071db96-operator-scripts\") pod \"7caec4ac-8162-4d47-ac4a-e1372071db96\" (UID: \"7caec4ac-8162-4d47-ac4a-e1372071db96\") "
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.513629 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7caec4ac-8162-4d47-ac4a-e1372071db96-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7caec4ac-8162-4d47-ac4a-e1372071db96" (UID: "7caec4ac-8162-4d47-ac4a-e1372071db96"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.522142 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7caec4ac-8162-4d47-ac4a-e1372071db96-kube-api-access-4z8qv" (OuterVolumeSpecName: "kube-api-access-4z8qv") pod "7caec4ac-8162-4d47-ac4a-e1372071db96" (UID: "7caec4ac-8162-4d47-ac4a-e1372071db96"). InnerVolumeSpecName "kube-api-access-4z8qv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.574894 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.614927 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z8qv\" (UniqueName: \"kubernetes.io/projected/7caec4ac-8162-4d47-ac4a-e1372071db96-kube-api-access-4z8qv\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.614972 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7caec4ac-8162-4d47-ac4a-e1372071db96-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.716624 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ljfb\" (UniqueName: \"kubernetes.io/projected/f897b315-f80b-4353-b12f-83409ab6bb17-kube-api-access-9ljfb\") pod \"f897b315-f80b-4353-b12f-83409ab6bb17\" (UID: \"f897b315-f80b-4353-b12f-83409ab6bb17\") "
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.716802 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f897b315-f80b-4353-b12f-83409ab6bb17-operator-scripts\") pod \"f897b315-f80b-4353-b12f-83409ab6bb17\" (UID: \"f897b315-f80b-4353-b12f-83409ab6bb17\") "
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.717679 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f897b315-f80b-4353-b12f-83409ab6bb17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f897b315-f80b-4353-b12f-83409ab6bb17" (UID: "f897b315-f80b-4353-b12f-83409ab6bb17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.720483 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f897b315-f80b-4353-b12f-83409ab6bb17-kube-api-access-9ljfb" (OuterVolumeSpecName: "kube-api-access-9ljfb") pod "f897b315-f80b-4353-b12f-83409ab6bb17" (UID: "f897b315-f80b-4353-b12f-83409ab6bb17"). InnerVolumeSpecName "kube-api-access-9ljfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.818418 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f897b315-f80b-4353-b12f-83409ab6bb17-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:46 crc kubenswrapper[4756]: I1205 17:53:46.818458 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ljfb\" (UniqueName: \"kubernetes.io/projected/f897b315-f80b-4353-b12f-83409ab6bb17-kube-api-access-9ljfb\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:47 crc kubenswrapper[4756]: I1205 17:53:47.094806 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b531-account-create-update-kk8lh"
Dec 05 17:53:47 crc kubenswrapper[4756]: I1205 17:53:47.095860 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b531-account-create-update-kk8lh" event={"ID":"7caec4ac-8162-4d47-ac4a-e1372071db96","Type":"ContainerDied","Data":"bed472d6bd08d6a4b05e14206fb77ace3e0aca96a3f92b5f5c3fcf1d45bd133f"}
Dec 05 17:53:47 crc kubenswrapper[4756]: I1205 17:53:47.096010 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bed472d6bd08d6a4b05e14206fb77ace3e0aca96a3f92b5f5c3fcf1d45bd133f"
Dec 05 17:53:47 crc kubenswrapper[4756]: I1205 17:53:47.100634 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hngjr" event={"ID":"f897b315-f80b-4353-b12f-83409ab6bb17","Type":"ContainerDied","Data":"5cd186309465bcd650da05a6f839c065b1344f40597000164453f79bf969939a"}
Dec 05 17:53:47 crc kubenswrapper[4756]: I1205 17:53:47.100678 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd186309465bcd650da05a6f839c065b1344f40597000164453f79bf969939a"
Dec 05 17:53:47 crc kubenswrapper[4756]: I1205 17:53:47.100853 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hngjr"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.050076 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a"
Dec 05 17:53:48 crc kubenswrapper[4756]: E1205 17:53:48.050969 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.299294 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-n52f6"]
Dec 05 17:53:48 crc kubenswrapper[4756]: E1205 17:53:48.299811 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f897b315-f80b-4353-b12f-83409ab6bb17" containerName="mariadb-database-create"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.299841 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f897b315-f80b-4353-b12f-83409ab6bb17" containerName="mariadb-database-create"
Dec 05 17:53:48 crc kubenswrapper[4756]: E1205 17:53:48.299872 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7caec4ac-8162-4d47-ac4a-e1372071db96" containerName="mariadb-account-create-update"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.299886 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7caec4ac-8162-4d47-ac4a-e1372071db96" containerName="mariadb-account-create-update"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.300141 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7caec4ac-8162-4d47-ac4a-e1372071db96" containerName="mariadb-account-create-update"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.300169 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f897b315-f80b-4353-b12f-83409ab6bb17" containerName="mariadb-database-create"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.301159 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.305508 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.306372 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6clvv"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.319798 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n52f6"]
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.451642 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-config-data\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.451868 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46kc\" (UniqueName: \"kubernetes.io/projected/8ab62b10-42d1-4699-adcb-9412bb44b37c-kube-api-access-j46kc\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.451951 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-db-sync-config-data\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.451993 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-combined-ca-bundle\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.553977 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46kc\" (UniqueName: \"kubernetes.io/projected/8ab62b10-42d1-4699-adcb-9412bb44b37c-kube-api-access-j46kc\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.554060 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-db-sync-config-data\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.554086 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-combined-ca-bundle\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.554148 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-config-data\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.558942 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-combined-ca-bundle\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.559018 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-config-data\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.559059 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-db-sync-config-data\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.580629 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46kc\" (UniqueName: \"kubernetes.io/projected/8ab62b10-42d1-4699-adcb-9412bb44b37c-kube-api-access-j46kc\") pod \"glance-db-sync-n52f6\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") " pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:48 crc kubenswrapper[4756]: I1205 17:53:48.628047 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:49 crc kubenswrapper[4756]: I1205 17:53:49.147698 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n52f6"]
Dec 05 17:53:49 crc kubenswrapper[4756]: W1205 17:53:49.167176 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ab62b10_42d1_4699_adcb_9412bb44b37c.slice/crio-b4e4577d76104ece6ddce35b02cbd3fa11b910f87d0f8d77832bcaf306e9cd2c WatchSource:0}: Error finding container b4e4577d76104ece6ddce35b02cbd3fa11b910f87d0f8d77832bcaf306e9cd2c: Status 404 returned error can't find the container with id b4e4577d76104ece6ddce35b02cbd3fa11b910f87d0f8d77832bcaf306e9cd2c
Dec 05 17:53:50 crc kubenswrapper[4756]: I1205 17:53:50.132751 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n52f6" event={"ID":"8ab62b10-42d1-4699-adcb-9412bb44b37c","Type":"ContainerStarted","Data":"14507b4fabfd51b9d18a6b9011a932bcb597dcbb353982c9b5a04d1d067abac8"}
Dec 05 17:53:50 crc kubenswrapper[4756]: I1205 17:53:50.133122 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n52f6" event={"ID":"8ab62b10-42d1-4699-adcb-9412bb44b37c","Type":"ContainerStarted","Data":"b4e4577d76104ece6ddce35b02cbd3fa11b910f87d0f8d77832bcaf306e9cd2c"}
Dec 05 17:53:50 crc kubenswrapper[4756]: I1205 17:53:50.161987 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-n52f6" podStartSLOduration=2.161957893 podStartE2EDuration="2.161957893s" podCreationTimestamp="2025-12-05 17:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:53:50.153165118 +0000 UTC m=+5890.391838690" watchObservedRunningTime="2025-12-05 17:53:50.161957893 +0000 UTC m=+5890.400631445"
Dec 05 17:53:53 crc kubenswrapper[4756]: I1205 17:53:53.175238 4756 generic.go:334] "Generic (PLEG): container finished" podID="8ab62b10-42d1-4699-adcb-9412bb44b37c" containerID="14507b4fabfd51b9d18a6b9011a932bcb597dcbb353982c9b5a04d1d067abac8" exitCode=0
Dec 05 17:53:53 crc kubenswrapper[4756]: I1205 17:53:53.175396 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n52f6" event={"ID":"8ab62b10-42d1-4699-adcb-9412bb44b37c","Type":"ContainerDied","Data":"14507b4fabfd51b9d18a6b9011a932bcb597dcbb353982c9b5a04d1d067abac8"}
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.661285 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n52f6"
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.778277 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-db-sync-config-data\") pod \"8ab62b10-42d1-4699-adcb-9412bb44b37c\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") "
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.778763 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j46kc\" (UniqueName: \"kubernetes.io/projected/8ab62b10-42d1-4699-adcb-9412bb44b37c-kube-api-access-j46kc\") pod \"8ab62b10-42d1-4699-adcb-9412bb44b37c\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") "
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.779034 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-combined-ca-bundle\") pod \"8ab62b10-42d1-4699-adcb-9412bb44b37c\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") "
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.779091 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-config-data\") pod \"8ab62b10-42d1-4699-adcb-9412bb44b37c\" (UID: \"8ab62b10-42d1-4699-adcb-9412bb44b37c\") "
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.785504 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab62b10-42d1-4699-adcb-9412bb44b37c-kube-api-access-j46kc" (OuterVolumeSpecName: "kube-api-access-j46kc") pod "8ab62b10-42d1-4699-adcb-9412bb44b37c" (UID: "8ab62b10-42d1-4699-adcb-9412bb44b37c"). InnerVolumeSpecName "kube-api-access-j46kc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.815621 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8ab62b10-42d1-4699-adcb-9412bb44b37c" (UID: "8ab62b10-42d1-4699-adcb-9412bb44b37c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.881544 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.881781 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j46kc\" (UniqueName: \"kubernetes.io/projected/8ab62b10-42d1-4699-adcb-9412bb44b37c-kube-api-access-j46kc\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.897770 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ab62b10-42d1-4699-adcb-9412bb44b37c" (UID: "8ab62b10-42d1-4699-adcb-9412bb44b37c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.973977 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-config-data" (OuterVolumeSpecName: "config-data") pod "8ab62b10-42d1-4699-adcb-9412bb44b37c" (UID: "8ab62b10-42d1-4699-adcb-9412bb44b37c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.983056 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:54 crc kubenswrapper[4756]: I1205 17:53:54.983091 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab62b10-42d1-4699-adcb-9412bb44b37c-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.203929 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n52f6" event={"ID":"8ab62b10-42d1-4699-adcb-9412bb44b37c","Type":"ContainerDied","Data":"b4e4577d76104ece6ddce35b02cbd3fa11b910f87d0f8d77832bcaf306e9cd2c"}
Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.203978 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4e4577d76104ece6ddce35b02cbd3fa11b910f87d0f8d77832bcaf306e9cd2c"
Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.204138 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n52f6" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.402828 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:53:55 crc kubenswrapper[4756]: E1205 17:53:55.403152 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab62b10-42d1-4699-adcb-9412bb44b37c" containerName="glance-db-sync" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.403167 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab62b10-42d1-4699-adcb-9412bb44b37c" containerName="glance-db-sync" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.403326 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab62b10-42d1-4699-adcb-9412bb44b37c" containerName="glance-db-sync" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.404183 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.419388 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.419802 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.420068 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.423444 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6clvv" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.493926 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.494174 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-scripts\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.494261 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-logs\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.494359 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.494514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.494604 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wfs9\" (UniqueName: \"kubernetes.io/projected/3befaf11-9c2a-4ede-af20-06edd36f3f0a-kube-api-access-4wfs9\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.507872 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f868db595-spg97"] Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.513813 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.524041 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f868db595-spg97"] Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.595796 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wfs9\" (UniqueName: \"kubernetes.io/projected/3befaf11-9c2a-4ede-af20-06edd36f3f0a-kube-api-access-4wfs9\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.595837 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-config\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.595871 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-nb\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.595939 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.595992 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-scripts\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.596012 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-dns-svc\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.596032 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-logs\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.596062 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.596114 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-sb\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.596132 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-config-data\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.596151 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z6mz\" (UniqueName: \"kubernetes.io/projected/1e1149b6-8a9d-4747-ad32-e42fe976c029-kube-api-access-5z6mz\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.596603 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-logs\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.596850 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.600155 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-config-data\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.601265 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.605015 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-scripts\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.618659 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wfs9\" (UniqueName: \"kubernetes.io/projected/3befaf11-9c2a-4ede-af20-06edd36f3f0a-kube-api-access-4wfs9\") pod \"glance-default-external-api-0\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.639765 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.642000 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.646439 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.656452 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.697250 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-dns-svc\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.697333 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-sb\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.697354 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z6mz\" (UniqueName: \"kubernetes.io/projected/1e1149b6-8a9d-4747-ad32-e42fe976c029-kube-api-access-5z6mz\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.697379 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-config\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.697408 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-nb\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.698201 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-nb\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: 
\"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.698721 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-dns-svc\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.699895 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-config\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.700223 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-sb\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.726624 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z6mz\" (UniqueName: \"kubernetes.io/projected/1e1149b6-8a9d-4747-ad32-e42fe976c029-kube-api-access-5z6mz\") pod \"dnsmasq-dns-5f868db595-spg97\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.736359 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.799303 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.799344 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-config-data\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.799378 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-logs\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.799428 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.799489 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2qdbc\" (UniqueName: \"kubernetes.io/projected/51ae4950-3030-4180-9509-51e731118715-kube-api-access-2qdbc\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.799532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.839218 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.906375 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdbc\" (UniqueName: \"kubernetes.io/projected/51ae4950-3030-4180-9509-51e731118715-kube-api-access-2qdbc\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.906723 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.906773 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.906789 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-config-data\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.906820 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-logs\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.906856 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.907356 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc 
kubenswrapper[4756]: I1205 17:53:55.908080 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-logs\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.912098 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.920891 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-config-data\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.923489 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.924752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdbc\" (UniqueName: \"kubernetes.io/projected/51ae4950-3030-4180-9509-51e731118715-kube-api-access-2qdbc\") pod \"glance-default-internal-api-0\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:55 crc kubenswrapper[4756]: I1205 17:53:55.994525 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 17:53:56 crc kubenswrapper[4756]: I1205 17:53:56.296732 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:53:56 crc kubenswrapper[4756]: W1205 17:53:56.298479 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3befaf11_9c2a_4ede_af20_06edd36f3f0a.slice/crio-6160f9e185ad6a9658702e51f6357b9a1956fe1d1a84bd94e8def63a0854800b WatchSource:0}: Error finding container 6160f9e185ad6a9658702e51f6357b9a1956fe1d1a84bd94e8def63a0854800b: Status 404 returned error can't find the container with id 6160f9e185ad6a9658702e51f6357b9a1956fe1d1a84bd94e8def63a0854800b Dec 05 17:53:56 crc kubenswrapper[4756]: I1205 17:53:56.333677 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f868db595-spg97"] Dec 05 17:53:56 crc kubenswrapper[4756]: W1205 17:53:56.343451 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e1149b6_8a9d_4747_ad32_e42fe976c029.slice/crio-5aa3740cc43fc4ae12674c35f19e807a5c0abb5bf262f8d1d45cc74575a18af2 WatchSource:0}: Error finding container 5aa3740cc43fc4ae12674c35f19e807a5c0abb5bf262f8d1d45cc74575a18af2: Status 404 returned error can't find the container with id 5aa3740cc43fc4ae12674c35f19e807a5c0abb5bf262f8d1d45cc74575a18af2 Dec 05 17:53:56 crc kubenswrapper[4756]: I1205 17:53:56.585039 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:53:56 crc kubenswrapper[4756]: I1205 17:53:56.631150 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:53:57 crc kubenswrapper[4756]: I1205 17:53:57.234774 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3befaf11-9c2a-4ede-af20-06edd36f3f0a","Type":"ContainerStarted","Data":"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91"} Dec 05 17:53:57 crc kubenswrapper[4756]: I1205 17:53:57.235027 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3befaf11-9c2a-4ede-af20-06edd36f3f0a","Type":"ContainerStarted","Data":"6160f9e185ad6a9658702e51f6357b9a1956fe1d1a84bd94e8def63a0854800b"} Dec 05 17:53:57 crc kubenswrapper[4756]: I1205 17:53:57.237148 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51ae4950-3030-4180-9509-51e731118715","Type":"ContainerStarted","Data":"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228"} Dec 05 17:53:57 crc kubenswrapper[4756]: I1205 17:53:57.237189 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51ae4950-3030-4180-9509-51e731118715","Type":"ContainerStarted","Data":"6cd14e30b03770317a868d1966109250b6463d7c33adeddc06a23fc92d8edcc4"} Dec 05 17:53:57 crc kubenswrapper[4756]: I1205 17:53:57.239933 4756 generic.go:334] "Generic (PLEG): container finished" podID="1e1149b6-8a9d-4747-ad32-e42fe976c029" containerID="f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739" exitCode=0 Dec 05 17:53:57 crc kubenswrapper[4756]: I1205 17:53:57.239992 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f868db595-spg97" 
event={"ID":"1e1149b6-8a9d-4747-ad32-e42fe976c029","Type":"ContainerDied","Data":"f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739"} Dec 05 17:53:57 crc kubenswrapper[4756]: I1205 17:53:57.240015 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f868db595-spg97" event={"ID":"1e1149b6-8a9d-4747-ad32-e42fe976c029","Type":"ContainerStarted","Data":"5aa3740cc43fc4ae12674c35f19e807a5c0abb5bf262f8d1d45cc74575a18af2"} Dec 05 17:53:57 crc kubenswrapper[4756]: I1205 17:53:57.888249 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.252824 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f868db595-spg97" event={"ID":"1e1149b6-8a9d-4747-ad32-e42fe976c029","Type":"ContainerStarted","Data":"04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d"} Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.254000 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.254874 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3befaf11-9c2a-4ede-af20-06edd36f3f0a","Type":"ContainerStarted","Data":"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54"} Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.254953 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerName="glance-log" containerID="cri-o://184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91" gracePeriod=30 Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.254985 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerName="glance-httpd" containerID="cri-o://ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54" gracePeriod=30 Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.260541 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51ae4950-3030-4180-9509-51e731118715","Type":"ContainerStarted","Data":"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459"} Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.260661 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="51ae4950-3030-4180-9509-51e731118715" containerName="glance-log" containerID="cri-o://955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228" gracePeriod=30 Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.260790 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="51ae4950-3030-4180-9509-51e731118715" containerName="glance-httpd" containerID="cri-o://efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459" gracePeriod=30 Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.283441 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f868db595-spg97" podStartSLOduration=3.283408728 podStartE2EDuration="3.283408728s" podCreationTimestamp="2025-12-05 17:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:53:58.275354753 +0000 UTC m=+5898.514028275" watchObservedRunningTime="2025-12-05 17:53:58.283408728 +0000 UTC m=+5898.522082240" Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.300006 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.299984309 podStartE2EDuration="3.299984309s" podCreationTimestamp="2025-12-05 17:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:53:58.295380877 +0000 UTC m=+5898.534054389" watchObservedRunningTime="2025-12-05 17:53:58.299984309 +0000 UTC m=+5898.538657831" Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.316535 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.31651813 podStartE2EDuration="3.31651813s" podCreationTimestamp="2025-12-05 17:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:53:58.316449818 +0000 UTC m=+5898.555123330" watchObservedRunningTime="2025-12-05 17:53:58.31651813 +0000 UTC m=+5898.555191632" Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.910348 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 17:53:58 crc kubenswrapper[4756]: I1205 17:53:58.915760 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.020286 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-combined-ca-bundle\") pod \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.020371 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-httpd-run\") pod \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.020395 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-config-data\") pod \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.020449 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-logs\") pod \"51ae4950-3030-4180-9509-51e731118715\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.020466 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-scripts\") pod \"51ae4950-3030-4180-9509-51e731118715\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.020508 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-scripts\") pod \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.020524 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-logs\") pod \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.020988 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-logs" (OuterVolumeSpecName: "logs") pod "51ae4950-3030-4180-9509-51e731118715" (UID: "51ae4950-3030-4180-9509-51e731118715"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021039 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3befaf11-9c2a-4ede-af20-06edd36f3f0a" (UID: "3befaf11-9c2a-4ede-af20-06edd36f3f0a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021221 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-logs" (OuterVolumeSpecName: "logs") pod "3befaf11-9c2a-4ede-af20-06edd36f3f0a" (UID: "3befaf11-9c2a-4ede-af20-06edd36f3f0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021253 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-httpd-run\") pod \"51ae4950-3030-4180-9509-51e731118715\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021273 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-combined-ca-bundle\") pod \"51ae4950-3030-4180-9509-51e731118715\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021368 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-config-data\") pod \"51ae4950-3030-4180-9509-51e731118715\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021434 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wfs9\" (UniqueName: \"kubernetes.io/projected/3befaf11-9c2a-4ede-af20-06edd36f3f0a-kube-api-access-4wfs9\") pod \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\" (UID: \"3befaf11-9c2a-4ede-af20-06edd36f3f0a\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021466 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qdbc\" (UniqueName: \"kubernetes.io/projected/51ae4950-3030-4180-9509-51e731118715-kube-api-access-2qdbc\") pod 
\"51ae4950-3030-4180-9509-51e731118715\" (UID: \"51ae4950-3030-4180-9509-51e731118715\") " Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021866 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021883 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.021891 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3befaf11-9c2a-4ede-af20-06edd36f3f0a-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.022058 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "51ae4950-3030-4180-9509-51e731118715" (UID: "51ae4950-3030-4180-9509-51e731118715"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.026003 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3befaf11-9c2a-4ede-af20-06edd36f3f0a-kube-api-access-4wfs9" (OuterVolumeSpecName: "kube-api-access-4wfs9") pod "3befaf11-9c2a-4ede-af20-06edd36f3f0a" (UID: "3befaf11-9c2a-4ede-af20-06edd36f3f0a"). InnerVolumeSpecName "kube-api-access-4wfs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.026090 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-scripts" (OuterVolumeSpecName: "scripts") pod "51ae4950-3030-4180-9509-51e731118715" (UID: "51ae4950-3030-4180-9509-51e731118715"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.027128 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-scripts" (OuterVolumeSpecName: "scripts") pod "3befaf11-9c2a-4ede-af20-06edd36f3f0a" (UID: "3befaf11-9c2a-4ede-af20-06edd36f3f0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.034113 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ae4950-3030-4180-9509-51e731118715-kube-api-access-2qdbc" (OuterVolumeSpecName: "kube-api-access-2qdbc") pod "51ae4950-3030-4180-9509-51e731118715" (UID: "51ae4950-3030-4180-9509-51e731118715"). InnerVolumeSpecName "kube-api-access-2qdbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.050205 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.050281 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3befaf11-9c2a-4ede-af20-06edd36f3f0a" (UID: "3befaf11-9c2a-4ede-af20-06edd36f3f0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.051087 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.052486 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51ae4950-3030-4180-9509-51e731118715" (UID: "51ae4950-3030-4180-9509-51e731118715"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.073542 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-config-data" (OuterVolumeSpecName: "config-data") pod "3befaf11-9c2a-4ede-af20-06edd36f3f0a" (UID: "3befaf11-9c2a-4ede-af20-06edd36f3f0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.088590 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-config-data" (OuterVolumeSpecName: "config-data") pod "51ae4950-3030-4180-9509-51e731118715" (UID: "51ae4950-3030-4180-9509-51e731118715"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.123599 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.123632 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wfs9\" (UniqueName: \"kubernetes.io/projected/3befaf11-9c2a-4ede-af20-06edd36f3f0a-kube-api-access-4wfs9\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.123644 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qdbc\" (UniqueName: \"kubernetes.io/projected/51ae4950-3030-4180-9509-51e731118715-kube-api-access-2qdbc\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.123655 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.123672 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.124228 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.124269 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3befaf11-9c2a-4ede-af20-06edd36f3f0a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.124279 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ae4950-3030-4180-9509-51e731118715-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.124291 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ae4950-3030-4180-9509-51e731118715-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.270553 4756 generic.go:334] "Generic (PLEG): container finished" podID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerID="ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54" exitCode=0 Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.270584 4756 generic.go:334] "Generic (PLEG): container finished" podID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerID="184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91" exitCode=143 Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.270623 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3befaf11-9c2a-4ede-af20-06edd36f3f0a","Type":"ContainerDied","Data":"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54"} Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.270650 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"3befaf11-9c2a-4ede-af20-06edd36f3f0a","Type":"ContainerDied","Data":"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91"} Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.270661 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3befaf11-9c2a-4ede-af20-06edd36f3f0a","Type":"ContainerDied","Data":"6160f9e185ad6a9658702e51f6357b9a1956fe1d1a84bd94e8def63a0854800b"} Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.270676 4756 scope.go:117] "RemoveContainer" containerID="ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.270826 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.274123 4756 generic.go:334] "Generic (PLEG): container finished" podID="51ae4950-3030-4180-9509-51e731118715" containerID="efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459" exitCode=143 Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.274151 4756 generic.go:334] "Generic (PLEG): container finished" podID="51ae4950-3030-4180-9509-51e731118715" containerID="955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228" exitCode=143 Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.274802 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.275843 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51ae4950-3030-4180-9509-51e731118715","Type":"ContainerDied","Data":"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459"} Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.275879 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51ae4950-3030-4180-9509-51e731118715","Type":"ContainerDied","Data":"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228"} Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.275891 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51ae4950-3030-4180-9509-51e731118715","Type":"ContainerDied","Data":"6cd14e30b03770317a868d1966109250b6463d7c33adeddc06a23fc92d8edcc4"} Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.292957 4756 scope.go:117] "RemoveContainer" containerID="184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.300513 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.309276 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.313090 4756 scope.go:117] "RemoveContainer" containerID="ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54" Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.313628 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54\": container with ID starting with ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54 not found: ID does not exist" 
containerID="ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.313685 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54"} err="failed to get container status \"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54\": rpc error: code = NotFound desc = could not find container \"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54\": container with ID starting with ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54 not found: ID does not exist" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.313714 4756 scope.go:117] "RemoveContainer" containerID="184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91" Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.314053 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91\": container with ID starting with 184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91 not found: ID does not exist" containerID="184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.314086 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91"} err="failed to get container status \"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91\": rpc error: code = NotFound desc = could not find container \"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91\": container with ID starting with 184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91 not found: ID does not exist" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.314109 4756 scope.go:117] "RemoveContainer" containerID="ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.314555 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54"} err="failed to get container status \"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54\": rpc error: code = NotFound desc = could not find container \"ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54\": container with ID starting with ba9faf596533e5b4e96966573c863e7266b57acc269839c24aa8b68b6ac7dd54 not found: ID does not exist" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.314592 4756 scope.go:117] "RemoveContainer" containerID="184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.314865 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91"} err="failed to get container status \"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91\": rpc error: code = NotFound desc = could not find container \"184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91\": container with ID starting with 184e37b39f6a7ec56bb6daa060fc7c7e28dc921f74b269b33467c36192247b91 not found: ID does not exist" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.314884 4756 scope.go:117] "RemoveContainer" 
containerID="efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.319112 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.331797 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.339304 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.339741 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerName="glance-httpd" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.339753 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerName="glance-httpd" Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.339782 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ae4950-3030-4180-9509-51e731118715" containerName="glance-log" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.339791 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ae4950-3030-4180-9509-51e731118715" containerName="glance-log" Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.339814 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ae4950-3030-4180-9509-51e731118715" containerName="glance-httpd" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.339820 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ae4950-3030-4180-9509-51e731118715" containerName="glance-httpd" Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.339837 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerName="glance-log" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.339844 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerName="glance-log" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.340022 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerName="glance-log" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.340057 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ae4950-3030-4180-9509-51e731118715" containerName="glance-httpd" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.340074 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" containerName="glance-httpd" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.340088 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ae4950-3030-4180-9509-51e731118715" containerName="glance-log" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.341043 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.344366 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.344742 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.344911 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6clvv" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.345459 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.345830 4756 scope.go:117] "RemoveContainer" containerID="955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.355671 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.360033 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.364329 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.364664 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.373711 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.383523 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.384260 4756 scope.go:117] "RemoveContainer" containerID="efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459" Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.384663 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459\": container with ID starting with efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459 not found: ID does not exist" containerID="efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.384702 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459"} err="failed to get container status \"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459\": rpc error: code = NotFound desc = could not find container \"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459\": container with ID starting with efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459 not found: ID does not exist" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.384728 4756 scope.go:117] "RemoveContainer" containerID="955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228" Dec 05 17:53:59 crc kubenswrapper[4756]: E1205 17:53:59.385112 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228\": container with ID starting with 955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228 not found: ID does not exist" containerID="955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.385130 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228"} err="failed to get container status \"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228\": rpc error: code = NotFound desc = could not find container \"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228\": container with ID starting with 955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228 not found: ID does not exist" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.385141 4756 scope.go:117] "RemoveContainer" containerID="efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.385467 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459"} err="failed to get container status \"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459\": rpc error: code = NotFound desc = could not find container \"efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459\": container with ID starting with efb57046dce79ffbc976f7c41d7dcb5c09f9a474776a71a55d550e017df16459 not found: ID does not exist" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.385485 4756 scope.go:117] "RemoveContainer" containerID="955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.385729 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228"} err="failed to get container status \"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228\": rpc error: code = NotFound desc = could not find container \"955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228\": container with ID starting with 955edb2f04f5db3f93dcdd6a7886d935cb8db0da8718f7d6615203a499275228 not found: ID does not exist" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535158 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535200 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535225 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698z2\" (UniqueName: \"kubernetes.io/projected/82e4ff41-9e2e-4da3-9431-8e3706464313-kube-api-access-698z2\") pod 
\"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535296 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xk7f\" (UniqueName: \"kubernetes.io/projected/dec4ed1b-f3fe-4135-be28-524ecef6a200-kube-api-access-4xk7f\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535370 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535410 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535536 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535594 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535710 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-logs\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535758 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535771 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-scripts\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535839 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-config-data\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.535872 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-logs\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637109 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-logs\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637201 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637251 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637272 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698z2\" (UniqueName: \"kubernetes.io/projected/82e4ff41-9e2e-4da3-9431-8e3706464313-kube-api-access-698z2\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637310 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xk7f\" (UniqueName: \"kubernetes.io/projected/dec4ed1b-f3fe-4135-be28-524ecef6a200-kube-api-access-4xk7f\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637331 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637351 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637386 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637410 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637546 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637572 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-logs\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637590 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637605 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-scripts\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.637645 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-config-data\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.638403 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-logs\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.638594 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.641076 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.641215 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-logs\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.641296 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.641515 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.641662 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.642039 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.644212 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.644749 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-scripts\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.644874 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.645695 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-config-data\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.659616 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698z2\" (UniqueName: \"kubernetes.io/projected/82e4ff41-9e2e-4da3-9431-8e3706464313-kube-api-access-698z2\") pod \"glance-default-internal-api-0\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.660881 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xk7f\" (UniqueName: \"kubernetes.io/projected/dec4ed1b-f3fe-4135-be28-524ecef6a200-kube-api-access-4xk7f\") pod \"glance-default-external-api-0\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.674592 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.683353 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 17:53:59 crc kubenswrapper[4756]: I1205 17:53:59.917797 4756 scope.go:117] "RemoveContainer" containerID="cd79b4f9681e98882728d713fd1b6c509aaeff89234dd5682a5418bde4b746a5" Dec 05 17:54:00 crc kubenswrapper[4756]: I1205 17:54:00.069649 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3befaf11-9c2a-4ede-af20-06edd36f3f0a" path="/var/lib/kubelet/pods/3befaf11-9c2a-4ede-af20-06edd36f3f0a/volumes" Dec 05 17:54:00 crc kubenswrapper[4756]: I1205 17:54:00.070439 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ae4950-3030-4180-9509-51e731118715" path="/var/lib/kubelet/pods/51ae4950-3030-4180-9509-51e731118715/volumes" Dec 05 17:54:00 crc kubenswrapper[4756]: I1205 17:54:00.242823 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 17:54:00 crc kubenswrapper[4756]: I1205 17:54:00.295476 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec4ed1b-f3fe-4135-be28-524ecef6a200","Type":"ContainerStarted","Data":"b25a5086d8243e5c11679acb72f97fd21bf750cf21b8eaf3f94112d35d4591f2"} Dec 05 17:54:00 crc kubenswrapper[4756]: I1205 17:54:00.456456 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 17:54:00 crc kubenswrapper[4756]: W1205 17:54:00.467656 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82e4ff41_9e2e_4da3_9431_8e3706464313.slice/crio-0d4ad61608f280984a4e57770c78ce3bb441324eb5c258c0258f5cd6c696a16c WatchSource:0}: Error finding container 0d4ad61608f280984a4e57770c78ce3bb441324eb5c258c0258f5cd6c696a16c: Status 404 returned error can't find the container with id 0d4ad61608f280984a4e57770c78ce3bb441324eb5c258c0258f5cd6c696a16c Dec 05 17:54:01 crc kubenswrapper[4756]: I1205 17:54:01.332308 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"82e4ff41-9e2e-4da3-9431-8e3706464313","Type":"ContainerStarted","Data":"4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306"} Dec 05 17:54:01 crc kubenswrapper[4756]: I1205 17:54:01.332852 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"82e4ff41-9e2e-4da3-9431-8e3706464313","Type":"ContainerStarted","Data":"0d4ad61608f280984a4e57770c78ce3bb441324eb5c258c0258f5cd6c696a16c"} Dec 05 17:54:01 crc kubenswrapper[4756]: I1205 17:54:01.333999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec4ed1b-f3fe-4135-be28-524ecef6a200","Type":"ContainerStarted","Data":"e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663"} Dec 05 17:54:02 crc kubenswrapper[4756]: I1205 17:54:02.351628 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"82e4ff41-9e2e-4da3-9431-8e3706464313","Type":"ContainerStarted","Data":"784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a"} Dec 05 17:54:02 crc kubenswrapper[4756]: I1205 17:54:02.357800 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec4ed1b-f3fe-4135-be28-524ecef6a200","Type":"ContainerStarted","Data":"70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6"} Dec 05 17:54:02 crc kubenswrapper[4756]: I1205 17:54:02.400154 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.400135807 podStartE2EDuration="3.400135807s" podCreationTimestamp="2025-12-05 17:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:54:02.396916981 +0000 UTC m=+5902.635590503" watchObservedRunningTime="2025-12-05 17:54:02.400135807 +0000 UTC m=+5902.638809329" Dec 05 17:54:02 crc kubenswrapper[4756]: I1205 17:54:02.421709 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.421689101 podStartE2EDuration="3.421689101s" podCreationTimestamp="2025-12-05 17:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:54:02.417826348 +0000 UTC m=+5902.656499870" watchObservedRunningTime="2025-12-05 17:54:02.421689101 +0000 UTC m=+5902.660362623" Dec 05 17:54:05 crc kubenswrapper[4756]: I1205 17:54:05.841813 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:54:05 crc kubenswrapper[4756]: I1205 17:54:05.949138 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7498bd9d99-g52v7"] Dec 05 17:54:05 crc kubenswrapper[4756]: I1205 17:54:05.949603 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" podUID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" containerName="dnsmasq-dns" containerID="cri-o://67951d347a590fac48578a9eb05497c4547a373150f59afb5208b7de12207c22" gracePeriod=10 Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.412968 4756 generic.go:334] "Generic (PLEG): container finished" podID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" containerID="67951d347a590fac48578a9eb05497c4547a373150f59afb5208b7de12207c22" exitCode=0 Dec 05 17:54:06 crc 
kubenswrapper[4756]: I1205 17:54:06.413008 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" event={"ID":"b600aa7c-bad9-4599-b875-aeaa05f56aa2","Type":"ContainerDied","Data":"67951d347a590fac48578a9eb05497c4547a373150f59afb5208b7de12207c22"} Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.413032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" event={"ID":"b600aa7c-bad9-4599-b875-aeaa05f56aa2","Type":"ContainerDied","Data":"6bae3f63a33b5432042e070e51b3b169de14d3c1cacc94d591154c05fea67434"} Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.413042 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bae3f63a33b5432042e070e51b3b169de14d3c1cacc94d591154c05fea67434" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.435824 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.591359 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-nb\") pod \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.591447 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdsrw\" (UniqueName: \"kubernetes.io/projected/b600aa7c-bad9-4599-b875-aeaa05f56aa2-kube-api-access-rdsrw\") pod \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.591537 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-dns-svc\") pod \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.591557 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-sb\") pod \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.591654 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-config\") pod \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\" (UID: \"b600aa7c-bad9-4599-b875-aeaa05f56aa2\") " Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.613265 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b600aa7c-bad9-4599-b875-aeaa05f56aa2-kube-api-access-rdsrw" (OuterVolumeSpecName: "kube-api-access-rdsrw") pod "b600aa7c-bad9-4599-b875-aeaa05f56aa2" (UID: "b600aa7c-bad9-4599-b875-aeaa05f56aa2"). InnerVolumeSpecName "kube-api-access-rdsrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.633814 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b600aa7c-bad9-4599-b875-aeaa05f56aa2" (UID: "b600aa7c-bad9-4599-b875-aeaa05f56aa2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.636030 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b600aa7c-bad9-4599-b875-aeaa05f56aa2" (UID: "b600aa7c-bad9-4599-b875-aeaa05f56aa2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.638252 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-config" (OuterVolumeSpecName: "config") pod "b600aa7c-bad9-4599-b875-aeaa05f56aa2" (UID: "b600aa7c-bad9-4599-b875-aeaa05f56aa2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.640954 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b600aa7c-bad9-4599-b875-aeaa05f56aa2" (UID: "b600aa7c-bad9-4599-b875-aeaa05f56aa2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.694008 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.694037 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdsrw\" (UniqueName: \"kubernetes.io/projected/b600aa7c-bad9-4599-b875-aeaa05f56aa2-kube-api-access-rdsrw\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.694062 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.694071 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:06 crc kubenswrapper[4756]: I1205 17:54:06.694081 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b600aa7c-bad9-4599-b875-aeaa05f56aa2-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:07 crc kubenswrapper[4756]: I1205 17:54:07.424480 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7498bd9d99-g52v7" Dec 05 17:54:07 crc kubenswrapper[4756]: I1205 17:54:07.495942 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7498bd9d99-g52v7"] Dec 05 17:54:07 crc kubenswrapper[4756]: I1205 17:54:07.508320 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7498bd9d99-g52v7"] Dec 05 17:54:08 crc kubenswrapper[4756]: I1205 17:54:08.067638 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" path="/var/lib/kubelet/pods/b600aa7c-bad9-4599-b875-aeaa05f56aa2/volumes" Dec 05 17:54:09 crc kubenswrapper[4756]: I1205 17:54:09.675714 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 17:54:09 crc kubenswrapper[4756]: I1205 17:54:09.676115 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 17:54:09 crc kubenswrapper[4756]: I1205 17:54:09.684145 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 17:54:09 crc kubenswrapper[4756]: I1205 17:54:09.684204 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 17:54:09 crc kubenswrapper[4756]: I1205 17:54:09.726219 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 17:54:09 crc kubenswrapper[4756]: I1205 17:54:09.739839 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 17:54:09 crc kubenswrapper[4756]: I1205 17:54:09.746718 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 17:54:09 crc kubenswrapper[4756]: I1205 17:54:09.761351 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 17:54:10 crc kubenswrapper[4756]: I1205 17:54:10.061835 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:54:10 crc kubenswrapper[4756]: E1205 17:54:10.064362 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:54:10 crc kubenswrapper[4756]: I1205 17:54:10.474380 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 17:54:10 crc kubenswrapper[4756]: I1205 17:54:10.474436 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 17:54:10 crc kubenswrapper[4756]: I1205 17:54:10.474449 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 17:54:10 crc kubenswrapper[4756]: I1205 17:54:10.474464 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 17:54:12 crc kubenswrapper[4756]: I1205 
17:54:12.164499 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 17:54:12 crc kubenswrapper[4756]: I1205 17:54:12.212354 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 17:54:12 crc kubenswrapper[4756]: I1205 17:54:12.376637 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 17:54:12 crc kubenswrapper[4756]: I1205 17:54:12.490858 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 17:54:12 crc kubenswrapper[4756]: I1205 17:54:12.521654 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.289100 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jgc5r"] Dec 05 17:54:19 crc kubenswrapper[4756]: E1205 17:54:19.290982 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" containerName="dnsmasq-dns" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.291096 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" containerName="dnsmasq-dns" Dec 05 17:54:19 crc kubenswrapper[4756]: E1205 17:54:19.291187 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" containerName="init" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.291265 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" containerName="init" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.291706 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b600aa7c-bad9-4599-b875-aeaa05f56aa2" containerName="dnsmasq-dns" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.292863 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.302153 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jgc5r"] Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.382686 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-33d3-account-create-update-dd5jn"] Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.384237 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.388508 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.394087 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-33d3-account-create-update-dd5jn"] Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.402142 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzxph\" (UniqueName: \"kubernetes.io/projected/d555bdfb-a12a-48ad-83ca-4c0ab4275377-kube-api-access-qzxph\") pod \"placement-db-create-jgc5r\" (UID: \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\") " pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.402323 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d555bdfb-a12a-48ad-83ca-4c0ab4275377-operator-scripts\") pod \"placement-db-create-jgc5r\" (UID: \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\") " pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.503822 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-operator-scripts\") pod \"placement-33d3-account-create-update-dd5jn\" (UID: \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\") " pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.504341 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzxph\" (UniqueName: \"kubernetes.io/projected/d555bdfb-a12a-48ad-83ca-4c0ab4275377-kube-api-access-qzxph\") pod \"placement-db-create-jgc5r\" (UID: \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\") " pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.504507 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d555bdfb-a12a-48ad-83ca-4c0ab4275377-operator-scripts\") pod \"placement-db-create-jgc5r\" (UID: \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\") " pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.504555 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-kube-api-access-dx2c7\") pod \"placement-33d3-account-create-update-dd5jn\" (UID: \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\") " pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.505872 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d555bdfb-a12a-48ad-83ca-4c0ab4275377-operator-scripts\") pod \"placement-db-create-jgc5r\" (UID: \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\") " pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.535200 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzxph\" (UniqueName: 
\"kubernetes.io/projected/d555bdfb-a12a-48ad-83ca-4c0ab4275377-kube-api-access-qzxph\") pod \"placement-db-create-jgc5r\" (UID: \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\") " pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.606577 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-operator-scripts\") pod \"placement-33d3-account-create-update-dd5jn\" (UID: \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\") " pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.606843 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-kube-api-access-dx2c7\") pod \"placement-33d3-account-create-update-dd5jn\" (UID: \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\") " pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.607295 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-operator-scripts\") pod \"placement-33d3-account-create-update-dd5jn\" (UID: \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\") " pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.626728 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-kube-api-access-dx2c7\") pod \"placement-33d3-account-create-update-dd5jn\" (UID: \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\") " pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.632550 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:19 crc kubenswrapper[4756]: I1205 17:54:19.716488 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:20 crc kubenswrapper[4756]: I1205 17:54:20.162791 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jgc5r"] Dec 05 17:54:20 crc kubenswrapper[4756]: W1205 17:54:20.171186 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd555bdfb_a12a_48ad_83ca_4c0ab4275377.slice/crio-4497c0b4f6ee27732f784ababee84c871d08853c3f1dfcd1b90ac078fae6bb34 WatchSource:0}: Error finding container 4497c0b4f6ee27732f784ababee84c871d08853c3f1dfcd1b90ac078fae6bb34: Status 404 returned error can't find the container with id 4497c0b4f6ee27732f784ababee84c871d08853c3f1dfcd1b90ac078fae6bb34 Dec 05 17:54:20 crc kubenswrapper[4756]: I1205 17:54:20.283149 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-33d3-account-create-update-dd5jn"] Dec 05 17:54:20 crc kubenswrapper[4756]: W1205 17:54:20.288789 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd6e6d5d_35ad_4319_b7eb_438aee2bb13c.slice/crio-c894b89d3e4d63f56a192ca2e586b34da388e4903433d6a3fbac03e916d9fe3b WatchSource:0}: Error finding container c894b89d3e4d63f56a192ca2e586b34da388e4903433d6a3fbac03e916d9fe3b: Status 404 returned error can't find the container with id c894b89d3e4d63f56a192ca2e586b34da388e4903433d6a3fbac03e916d9fe3b Dec 05 17:54:20 crc kubenswrapper[4756]: I1205 17:54:20.578896 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-33d3-account-create-update-dd5jn" event={"ID":"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c","Type":"ContainerStarted","Data":"abe730b63172ef2c155656c3a5b6ebdcaec877cac7d0d39f876e3f8bd7a2f1b7"} Dec 05 17:54:20 crc kubenswrapper[4756]: I1205 17:54:20.579243 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-33d3-account-create-update-dd5jn" event={"ID":"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c","Type":"ContainerStarted","Data":"c894b89d3e4d63f56a192ca2e586b34da388e4903433d6a3fbac03e916d9fe3b"} Dec 05 17:54:20 crc kubenswrapper[4756]: I1205 17:54:20.582490 4756 generic.go:334] "Generic (PLEG): container finished" podID="d555bdfb-a12a-48ad-83ca-4c0ab4275377" containerID="28b1806a8d327e614cf899cc40b4f686fd487104d7ecfd2f8e8e2ae7a832aee6" exitCode=0 Dec 05 17:54:20 crc kubenswrapper[4756]: I1205 17:54:20.582529 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jgc5r" event={"ID":"d555bdfb-a12a-48ad-83ca-4c0ab4275377","Type":"ContainerDied","Data":"28b1806a8d327e614cf899cc40b4f686fd487104d7ecfd2f8e8e2ae7a832aee6"} Dec 05 17:54:20 crc kubenswrapper[4756]: I1205 17:54:20.582549 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jgc5r" event={"ID":"d555bdfb-a12a-48ad-83ca-4c0ab4275377","Type":"ContainerStarted","Data":"4497c0b4f6ee27732f784ababee84c871d08853c3f1dfcd1b90ac078fae6bb34"} Dec 05 17:54:20 crc kubenswrapper[4756]: I1205 17:54:20.600479 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-33d3-account-create-update-dd5jn" podStartSLOduration=1.600458902 podStartE2EDuration="1.600458902s" podCreationTimestamp="2025-12-05 17:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:54:20.591550885 +0000 UTC m=+5920.830224407" 
watchObservedRunningTime="2025-12-05 17:54:20.600458902 +0000 UTC m=+5920.839132434" Dec 05 17:54:21 crc kubenswrapper[4756]: I1205 17:54:21.597686 4756 generic.go:334] "Generic (PLEG): container finished" podID="fd6e6d5d-35ad-4319-b7eb-438aee2bb13c" containerID="abe730b63172ef2c155656c3a5b6ebdcaec877cac7d0d39f876e3f8bd7a2f1b7" exitCode=0 Dec 05 17:54:21 crc kubenswrapper[4756]: I1205 17:54:21.597950 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-33d3-account-create-update-dd5jn" event={"ID":"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c","Type":"ContainerDied","Data":"abe730b63172ef2c155656c3a5b6ebdcaec877cac7d0d39f876e3f8bd7a2f1b7"} Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.025239 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.159163 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzxph\" (UniqueName: \"kubernetes.io/projected/d555bdfb-a12a-48ad-83ca-4c0ab4275377-kube-api-access-qzxph\") pod \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\" (UID: \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\") " Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.159253 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d555bdfb-a12a-48ad-83ca-4c0ab4275377-operator-scripts\") pod \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\" (UID: \"d555bdfb-a12a-48ad-83ca-4c0ab4275377\") " Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.161002 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d555bdfb-a12a-48ad-83ca-4c0ab4275377-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d555bdfb-a12a-48ad-83ca-4c0ab4275377" (UID: "d555bdfb-a12a-48ad-83ca-4c0ab4275377"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.169046 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d555bdfb-a12a-48ad-83ca-4c0ab4275377-kube-api-access-qzxph" (OuterVolumeSpecName: "kube-api-access-qzxph") pod "d555bdfb-a12a-48ad-83ca-4c0ab4275377" (UID: "d555bdfb-a12a-48ad-83ca-4c0ab4275377"). InnerVolumeSpecName "kube-api-access-qzxph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.261292 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzxph\" (UniqueName: \"kubernetes.io/projected/d555bdfb-a12a-48ad-83ca-4c0ab4275377-kube-api-access-qzxph\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.261322 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d555bdfb-a12a-48ad-83ca-4c0ab4275377-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.610147 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jgc5r" Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.613520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jgc5r" event={"ID":"d555bdfb-a12a-48ad-83ca-4c0ab4275377","Type":"ContainerDied","Data":"4497c0b4f6ee27732f784ababee84c871d08853c3f1dfcd1b90ac078fae6bb34"} Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.613581 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4497c0b4f6ee27732f784ababee84c871d08853c3f1dfcd1b90ac078fae6bb34" Dec 05 17:54:22 crc kubenswrapper[4756]: I1205 17:54:22.952129 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.079941 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-operator-scripts\") pod \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\" (UID: \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\") " Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.080123 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-kube-api-access-dx2c7\") pod \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\" (UID: \"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c\") " Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.080716 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd6e6d5d-35ad-4319-b7eb-438aee2bb13c" (UID: "fd6e6d5d-35ad-4319-b7eb-438aee2bb13c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.086755 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-kube-api-access-dx2c7" (OuterVolumeSpecName: "kube-api-access-dx2c7") pod "fd6e6d5d-35ad-4319-b7eb-438aee2bb13c" (UID: "fd6e6d5d-35ad-4319-b7eb-438aee2bb13c"). InnerVolumeSpecName "kube-api-access-dx2c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.185058 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-kube-api-access-dx2c7\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.185095 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.624347 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-33d3-account-create-update-dd5jn" event={"ID":"fd6e6d5d-35ad-4319-b7eb-438aee2bb13c","Type":"ContainerDied","Data":"c894b89d3e4d63f56a192ca2e586b34da388e4903433d6a3fbac03e916d9fe3b"} Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.624400 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c894b89d3e4d63f56a192ca2e586b34da388e4903433d6a3fbac03e916d9fe3b" Dec 05 17:54:23 crc kubenswrapper[4756]: I1205 17:54:23.624485 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-33d3-account-create-update-dd5jn" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.049317 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:54:24 crc kubenswrapper[4756]: E1205 17:54:24.049655 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.763755 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-f5kct"] Dec 05 17:54:24 crc kubenswrapper[4756]: E1205 17:54:24.764570 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d555bdfb-a12a-48ad-83ca-4c0ab4275377" containerName="mariadb-database-create" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.764588 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d555bdfb-a12a-48ad-83ca-4c0ab4275377" containerName="mariadb-database-create" Dec 05 17:54:24 crc kubenswrapper[4756]: E1205 17:54:24.764608 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6e6d5d-35ad-4319-b7eb-438aee2bb13c" containerName="mariadb-account-create-update" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.764618 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6e6d5d-35ad-4319-b7eb-438aee2bb13c" containerName="mariadb-account-create-update" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.764858 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d555bdfb-a12a-48ad-83ca-4c0ab4275377" containerName="mariadb-database-create" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.764881 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6e6d5d-35ad-4319-b7eb-438aee2bb13c" containerName="mariadb-account-create-update" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.765715 4756 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.775812 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.775858 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8k4sv" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.776012 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.794242 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f5kct"] Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.811465 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66bbb84ccc-jsvsh"] Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.813212 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.833428 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66bbb84ccc-jsvsh"] Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.916735 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-scripts\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.916808 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssrj7\" (UniqueName: \"kubernetes.io/projected/5204a282-bb58-4118-8f80-4f0f7d2a9028-kube-api-access-ssrj7\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.916885 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5204a282-bb58-4118-8f80-4f0f7d2a9028-logs\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.916960 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-config\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.916979 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-combined-ca-bundle\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.917012 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-nb\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: 
\"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.917153 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-dns-svc\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.917293 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-sb\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.917321 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78q4h\" (UniqueName: \"kubernetes.io/projected/439a3487-2bc9-4f44-af4c-02d5445fb349-kube-api-access-78q4h\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:24 crc kubenswrapper[4756]: I1205 17:54:24.917572 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-config-data\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.021723 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-scripts\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.021798 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssrj7\" (UniqueName: \"kubernetes.io/projected/5204a282-bb58-4118-8f80-4f0f7d2a9028-kube-api-access-ssrj7\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.021838 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5204a282-bb58-4118-8f80-4f0f7d2a9028-logs\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.022991 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-config\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.023686 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5204a282-bb58-4118-8f80-4f0f7d2a9028-logs\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc 
kubenswrapper[4756]: I1205 17:54:25.023754 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-config\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.024197 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-combined-ca-bundle\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.024291 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-nb\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.024377 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-dns-svc\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.024440 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-sb\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.024460 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78q4h\" (UniqueName: \"kubernetes.io/projected/439a3487-2bc9-4f44-af4c-02d5445fb349-kube-api-access-78q4h\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.024485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-config-data\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.024918 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-nb\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.026284 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-sb\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.030185 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-combined-ca-bundle\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.030435 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-config-data\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.032255 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-scripts\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.037071 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-dns-svc\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.040246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssrj7\" (UniqueName: \"kubernetes.io/projected/5204a282-bb58-4118-8f80-4f0f7d2a9028-kube-api-access-ssrj7\") pod \"placement-db-sync-f5kct\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.055855 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78q4h\" (UniqueName: \"kubernetes.io/projected/439a3487-2bc9-4f44-af4c-02d5445fb349-kube-api-access-78q4h\") pod \"dnsmasq-dns-66bbb84ccc-jsvsh\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.110184 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.140857 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.617551 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f5kct"] Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.649675 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f5kct" event={"ID":"5204a282-bb58-4118-8f80-4f0f7d2a9028","Type":"ContainerStarted","Data":"fc56d4f2b9e3fd847d7627eee0feeeb6232838694e49a0736c266e38743ff214"} Dec 05 17:54:25 crc kubenswrapper[4756]: I1205 17:54:25.701680 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66bbb84ccc-jsvsh"] Dec 05 17:54:26 crc kubenswrapper[4756]: I1205 17:54:26.663465 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f5kct" event={"ID":"5204a282-bb58-4118-8f80-4f0f7d2a9028","Type":"ContainerStarted","Data":"3884131a2957aaffd06693e7875800762487a6618a48df9e30deba351f1f3bfd"} Dec 05 17:54:26 crc kubenswrapper[4756]: I1205 17:54:26.667547 4756 generic.go:334] "Generic (PLEG): container finished" podID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerID="286ceffd56b52175db1dd8deb168ac10e346939f0c2395b37208c54d884ad7e1" exitCode=0 Dec 05 17:54:26 crc kubenswrapper[4756]: I1205 17:54:26.667619 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" event={"ID":"439a3487-2bc9-4f44-af4c-02d5445fb349","Type":"ContainerDied","Data":"286ceffd56b52175db1dd8deb168ac10e346939f0c2395b37208c54d884ad7e1"} Dec 05 17:54:26 crc kubenswrapper[4756]: I1205 17:54:26.667882 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" event={"ID":"439a3487-2bc9-4f44-af4c-02d5445fb349","Type":"ContainerStarted","Data":"10620cc0c2cd0a7122a45666121879a40623b99c6fd865afe7fa111dc2bb2cb0"} Dec 05 17:54:26 crc kubenswrapper[4756]: I1205 17:54:26.686538 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-f5kct" podStartSLOduration=2.686519354 podStartE2EDuration="2.686519354s" podCreationTimestamp="2025-12-05 17:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:54:26.683912524 +0000 UTC m=+5926.922586046" watchObservedRunningTime="2025-12-05 17:54:26.686519354 +0000 UTC m=+5926.925192876" Dec 05 17:54:27 crc kubenswrapper[4756]: E1205 17:54:27.153627 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5204a282_bb58_4118_8f80_4f0f7d2a9028.slice/crio-conmon-3884131a2957aaffd06693e7875800762487a6618a48df9e30deba351f1f3bfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5204a282_bb58_4118_8f80_4f0f7d2a9028.slice/crio-3884131a2957aaffd06693e7875800762487a6618a48df9e30deba351f1f3bfd.scope\": RecentStats: unable to find data in memory cache]" Dec 05 17:54:27 crc kubenswrapper[4756]: I1205 17:54:27.681798 4756 generic.go:334] "Generic (PLEG): container finished" podID="5204a282-bb58-4118-8f80-4f0f7d2a9028" containerID="3884131a2957aaffd06693e7875800762487a6618a48df9e30deba351f1f3bfd" exitCode=0 Dec 05 17:54:27 crc kubenswrapper[4756]: I1205 17:54:27.681866 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f5kct" 
event={"ID":"5204a282-bb58-4118-8f80-4f0f7d2a9028","Type":"ContainerDied","Data":"3884131a2957aaffd06693e7875800762487a6618a48df9e30deba351f1f3bfd"} Dec 05 17:54:27 crc kubenswrapper[4756]: I1205 17:54:27.684615 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" event={"ID":"439a3487-2bc9-4f44-af4c-02d5445fb349","Type":"ContainerStarted","Data":"c67171f92813c91e25732d8318222028189777f18d2d30b866fd558ec69ed23c"} Dec 05 17:54:27 crc kubenswrapper[4756]: I1205 17:54:27.685172 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:54:27 crc kubenswrapper[4756]: I1205 17:54:27.728475 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" podStartSLOduration=3.728448721 podStartE2EDuration="3.728448721s" podCreationTimestamp="2025-12-05 17:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:54:27.725241775 +0000 UTC m=+5927.963915297" watchObservedRunningTime="2025-12-05 17:54:27.728448721 +0000 UTC m=+5927.967122263" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.192157 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.314636 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssrj7\" (UniqueName: \"kubernetes.io/projected/5204a282-bb58-4118-8f80-4f0f7d2a9028-kube-api-access-ssrj7\") pod \"5204a282-bb58-4118-8f80-4f0f7d2a9028\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.314721 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5204a282-bb58-4118-8f80-4f0f7d2a9028-logs\") pod \"5204a282-bb58-4118-8f80-4f0f7d2a9028\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.315005 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-config-data\") pod \"5204a282-bb58-4118-8f80-4f0f7d2a9028\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.315036 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-scripts\") pod \"5204a282-bb58-4118-8f80-4f0f7d2a9028\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.315064 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-combined-ca-bundle\") pod \"5204a282-bb58-4118-8f80-4f0f7d2a9028\" (UID: \"5204a282-bb58-4118-8f80-4f0f7d2a9028\") " Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.315463 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5204a282-bb58-4118-8f80-4f0f7d2a9028-logs" (OuterVolumeSpecName: "logs") pod "5204a282-bb58-4118-8f80-4f0f7d2a9028" (UID: "5204a282-bb58-4118-8f80-4f0f7d2a9028"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.316404 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5204a282-bb58-4118-8f80-4f0f7d2a9028-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.323183 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-scripts" (OuterVolumeSpecName: "scripts") pod "5204a282-bb58-4118-8f80-4f0f7d2a9028" (UID: "5204a282-bb58-4118-8f80-4f0f7d2a9028"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.326540 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5204a282-bb58-4118-8f80-4f0f7d2a9028-kube-api-access-ssrj7" (OuterVolumeSpecName: "kube-api-access-ssrj7") pod "5204a282-bb58-4118-8f80-4f0f7d2a9028" (UID: "5204a282-bb58-4118-8f80-4f0f7d2a9028"). InnerVolumeSpecName "kube-api-access-ssrj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.367454 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-config-data" (OuterVolumeSpecName: "config-data") pod "5204a282-bb58-4118-8f80-4f0f7d2a9028" (UID: "5204a282-bb58-4118-8f80-4f0f7d2a9028"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.368173 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5204a282-bb58-4118-8f80-4f0f7d2a9028" (UID: "5204a282-bb58-4118-8f80-4f0f7d2a9028"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.418050 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.418096 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.418110 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5204a282-bb58-4118-8f80-4f0f7d2a9028-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.418124 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssrj7\" (UniqueName: \"kubernetes.io/projected/5204a282-bb58-4118-8f80-4f0f7d2a9028-kube-api-access-ssrj7\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.710382 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-f5kct" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.714547 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f5kct" event={"ID":"5204a282-bb58-4118-8f80-4f0f7d2a9028","Type":"ContainerDied","Data":"fc56d4f2b9e3fd847d7627eee0feeeb6232838694e49a0736c266e38743ff214"} Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.714635 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc56d4f2b9e3fd847d7627eee0feeeb6232838694e49a0736c266e38743ff214" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.800810 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-548487ccbd-m5m7w"] Dec 05 17:54:29 crc kubenswrapper[4756]: E1205 17:54:29.802886 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5204a282-bb58-4118-8f80-4f0f7d2a9028" containerName="placement-db-sync" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.802908 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5204a282-bb58-4118-8f80-4f0f7d2a9028" containerName="placement-db-sync" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.803097 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5204a282-bb58-4118-8f80-4f0f7d2a9028" containerName="placement-db-sync" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.805221 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.812102 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.812518 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.812679 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.812727 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8k4sv" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.813599 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.832164 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-548487ccbd-m5m7w"] Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.926348 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ee4748-c8fd-47f2-95b2-18c5db21973f-logs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.926401 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-combined-ca-bundle\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.926493 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnq4k\" (UniqueName: 
\"kubernetes.io/projected/f9ee4748-c8fd-47f2-95b2-18c5db21973f-kube-api-access-bnq4k\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.926531 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-config-data\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.926547 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-public-tls-certs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.926570 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-scripts\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:29 crc kubenswrapper[4756]: I1205 17:54:29.926620 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-internal-tls-certs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.028278 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-config-data\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.028321 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-public-tls-certs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.028377 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-scripts\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.028488 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-internal-tls-certs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.028524 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f9ee4748-c8fd-47f2-95b2-18c5db21973f-logs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.028576 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-combined-ca-bundle\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.028646 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnq4k\" (UniqueName: \"kubernetes.io/projected/f9ee4748-c8fd-47f2-95b2-18c5db21973f-kube-api-access-bnq4k\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.029284 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ee4748-c8fd-47f2-95b2-18c5db21973f-logs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.038698 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-scripts\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.038852 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-config-data\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.038903 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-internal-tls-certs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.039600 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-combined-ca-bundle\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.045113 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ee4748-c8fd-47f2-95b2-18c5db21973f-public-tls-certs\") pod \"placement-548487ccbd-m5m7w\" (UID: \"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.047372 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnq4k\" (UniqueName: \"kubernetes.io/projected/f9ee4748-c8fd-47f2-95b2-18c5db21973f-kube-api-access-bnq4k\") pod \"placement-548487ccbd-m5m7w\" (UID: 
\"f9ee4748-c8fd-47f2-95b2-18c5db21973f\") " pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.128387 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.695782 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-548487ccbd-m5m7w"] Dec 05 17:54:30 crc kubenswrapper[4756]: W1205 17:54:30.700548 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ee4748_c8fd_47f2_95b2_18c5db21973f.slice/crio-b48cb1f84dc6ac9146c29b105a2b196d2a5b20f3c91c964b95d68051f5a11218 WatchSource:0}: Error finding container b48cb1f84dc6ac9146c29b105a2b196d2a5b20f3c91c964b95d68051f5a11218: Status 404 returned error can't find the container with id b48cb1f84dc6ac9146c29b105a2b196d2a5b20f3c91c964b95d68051f5a11218 Dec 05 17:54:30 crc kubenswrapper[4756]: I1205 17:54:30.718327 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-548487ccbd-m5m7w" event={"ID":"f9ee4748-c8fd-47f2-95b2-18c5db21973f","Type":"ContainerStarted","Data":"b48cb1f84dc6ac9146c29b105a2b196d2a5b20f3c91c964b95d68051f5a11218"} Dec 05 17:54:31 crc kubenswrapper[4756]: I1205 17:54:31.729181 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-548487ccbd-m5m7w" event={"ID":"f9ee4748-c8fd-47f2-95b2-18c5db21973f","Type":"ContainerStarted","Data":"1bbd718d83f2b09ad81a11724584a98a7cd439a1987db608489d9e03b5c8aed4"} Dec 05 17:54:31 crc kubenswrapper[4756]: I1205 17:54:31.729596 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:31 crc kubenswrapper[4756]: I1205 17:54:31.729613 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-548487ccbd-m5m7w" event={"ID":"f9ee4748-c8fd-47f2-95b2-18c5db21973f","Type":"ContainerStarted","Data":"a8609a9da07b2b378e4d47ea334be2ee92a9c1a9b51838e744b2ee9ebb31a740"} Dec 05 17:54:31 crc kubenswrapper[4756]: I1205 17:54:31.729628 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:54:31 crc kubenswrapper[4756]: I1205 17:54:31.755175 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-548487ccbd-m5m7w" podStartSLOduration=2.755158372 podStartE2EDuration="2.755158372s" podCreationTimestamp="2025-12-05 17:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:54:31.753431076 +0000 UTC m=+5931.992104588" watchObservedRunningTime="2025-12-05 17:54:31.755158372 +0000 UTC m=+5931.993831884" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.050600 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:54:35 crc kubenswrapper[4756]: E1205 17:54:35.051391 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 
Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.238742 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f868db595-spg97"]
Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.239263 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f868db595-spg97" podUID="1e1149b6-8a9d-4747-ad32-e42fe976c029" containerName="dnsmasq-dns" containerID="cri-o://04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d" gracePeriod=10
Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.756179 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f868db595-spg97"
Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.782955 4756 generic.go:334] "Generic (PLEG): container finished" podID="1e1149b6-8a9d-4747-ad32-e42fe976c029" containerID="04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d" exitCode=0
Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.783034 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f868db595-spg97" event={"ID":"1e1149b6-8a9d-4747-ad32-e42fe976c029","Type":"ContainerDied","Data":"04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d"}
Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.783139 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f868db595-spg97" event={"ID":"1e1149b6-8a9d-4747-ad32-e42fe976c029","Type":"ContainerDied","Data":"5aa3740cc43fc4ae12674c35f19e807a5c0abb5bf262f8d1d45cc74575a18af2"}
Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.783175 4756 scope.go:117] "RemoveContainer" containerID="04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d"
Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.783339 4756 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-5f868db595-spg97" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.808626 4756 scope.go:117] "RemoveContainer" containerID="f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.836225 4756 scope.go:117] "RemoveContainer" containerID="04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d" Dec 05 17:54:35 crc kubenswrapper[4756]: E1205 17:54:35.836629 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d\": container with ID starting with 04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d not found: ID does not exist" containerID="04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.836710 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d"} err="failed to get container status \"04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d\": rpc error: code = NotFound desc = could not find container \"04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d\": container with ID starting with 04620a39c15040fd14a9709d61e95a69ed97087f48eb05b284995bf5c3dbe95d not found: ID does not exist" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.836755 4756 scope.go:117] "RemoveContainer" containerID="f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739" Dec 05 17:54:35 crc kubenswrapper[4756]: E1205 17:54:35.837747 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739\": container with ID starting with f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739 not found: ID does not exist" containerID="f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.837795 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739"} err="failed to get container status \"f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739\": rpc error: code = NotFound desc = could not find container \"f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739\": container with ID starting with f2a063eaa636d28809c4429b7a6a2377346c4749d6f066323814ab4d45b9c739 not found: ID does not exist" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.845902 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-dns-svc\") pod \"1e1149b6-8a9d-4747-ad32-e42fe976c029\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.846032 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-config\") pod \"1e1149b6-8a9d-4747-ad32-e42fe976c029\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.846177 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5z6mz\" (UniqueName: \"kubernetes.io/projected/1e1149b6-8a9d-4747-ad32-e42fe976c029-kube-api-access-5z6mz\") pod \"1e1149b6-8a9d-4747-ad32-e42fe976c029\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.846288 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-sb\") pod \"1e1149b6-8a9d-4747-ad32-e42fe976c029\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.846461 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-nb\") pod \"1e1149b6-8a9d-4747-ad32-e42fe976c029\" (UID: \"1e1149b6-8a9d-4747-ad32-e42fe976c029\") " Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.852724 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1149b6-8a9d-4747-ad32-e42fe976c029-kube-api-access-5z6mz" (OuterVolumeSpecName: "kube-api-access-5z6mz") pod "1e1149b6-8a9d-4747-ad32-e42fe976c029" (UID: "1e1149b6-8a9d-4747-ad32-e42fe976c029"). InnerVolumeSpecName "kube-api-access-5z6mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.896790 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1e1149b6-8a9d-4747-ad32-e42fe976c029" (UID: "1e1149b6-8a9d-4747-ad32-e42fe976c029"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.897145 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-config" (OuterVolumeSpecName: "config") pod "1e1149b6-8a9d-4747-ad32-e42fe976c029" (UID: "1e1149b6-8a9d-4747-ad32-e42fe976c029"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.903202 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1e1149b6-8a9d-4747-ad32-e42fe976c029" (UID: "1e1149b6-8a9d-4747-ad32-e42fe976c029"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.904326 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e1149b6-8a9d-4747-ad32-e42fe976c029" (UID: "1e1149b6-8a9d-4747-ad32-e42fe976c029"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.948661 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z6mz\" (UniqueName: \"kubernetes.io/projected/1e1149b6-8a9d-4747-ad32-e42fe976c029-kube-api-access-5z6mz\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.948860 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.948915 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.948965 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:35 crc kubenswrapper[4756]: I1205 17:54:35.949044 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e1149b6-8a9d-4747-ad32-e42fe976c029-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:54:36 crc kubenswrapper[4756]: I1205 17:54:36.116759 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f868db595-spg97"] Dec 05 17:54:36 crc kubenswrapper[4756]: I1205 17:54:36.125664 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f868db595-spg97"] Dec 05 17:54:38 crc kubenswrapper[4756]: I1205 17:54:38.067300 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1149b6-8a9d-4747-ad32-e42fe976c029" path="/var/lib/kubelet/pods/1e1149b6-8a9d-4747-ad32-e42fe976c029/volumes" Dec 05 17:54:47 crc kubenswrapper[4756]: I1205 17:54:47.050203 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:54:47 crc kubenswrapper[4756]: E1205 17:54:47.051240 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:55:01 crc kubenswrapper[4756]: I1205 17:55:01.075166 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:55:01 crc kubenswrapper[4756]: I1205 17:55:01.075805 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-548487ccbd-m5m7w" Dec 05 17:55:02 crc kubenswrapper[4756]: I1205 17:55:02.051174 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:55:02 crc kubenswrapper[4756]: E1205 17:55:02.051741 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:55:14 crc kubenswrapper[4756]: I1205 17:55:14.053646 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:55:14 crc kubenswrapper[4756]: E1205 17:55:14.056338 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.053368 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-lncvr"] Dec 05 17:55:23 crc kubenswrapper[4756]: E1205 17:55:23.054157 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1149b6-8a9d-4747-ad32-e42fe976c029" containerName="dnsmasq-dns" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.054169 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1149b6-8a9d-4747-ad32-e42fe976c029" containerName="dnsmasq-dns" Dec 05 17:55:23 crc kubenswrapper[4756]: E1205 17:55:23.054181 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1149b6-8a9d-4747-ad32-e42fe976c029" containerName="init" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.054187 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1149b6-8a9d-4747-ad32-e42fe976c029" containerName="init" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.054349 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1149b6-8a9d-4747-ad32-e42fe976c029" containerName="dnsmasq-dns" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.054980 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.062509 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lncvr"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.111088 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ead6de-7881-4fe3-810e-6812c8a0669d-operator-scripts\") pod \"nova-api-db-create-lncvr\" (UID: \"f7ead6de-7881-4fe3-810e-6812c8a0669d\") " pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.111143 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-775zn\" (UniqueName: \"kubernetes.io/projected/f7ead6de-7881-4fe3-810e-6812c8a0669d-kube-api-access-775zn\") pod \"nova-api-db-create-lncvr\" (UID: \"f7ead6de-7881-4fe3-810e-6812c8a0669d\") " pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.146384 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-t2z9m"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.147345 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.159060 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t2z9m"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.213375 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ead6de-7881-4fe3-810e-6812c8a0669d-operator-scripts\") pod \"nova-api-db-create-lncvr\" (UID: \"f7ead6de-7881-4fe3-810e-6812c8a0669d\") " pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.213450 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-775zn\" (UniqueName: \"kubernetes.io/projected/f7ead6de-7881-4fe3-810e-6812c8a0669d-kube-api-access-775zn\") pod \"nova-api-db-create-lncvr\" (UID: \"f7ead6de-7881-4fe3-810e-6812c8a0669d\") " pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.213796 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-operator-scripts\") pod \"nova-cell0-db-create-t2z9m\" (UID: \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\") " pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.213883 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4xlh\" (UniqueName: \"kubernetes.io/projected/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-kube-api-access-x4xlh\") pod \"nova-cell0-db-create-t2z9m\" (UID: \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\") " pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.214027 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ead6de-7881-4fe3-810e-6812c8a0669d-operator-scripts\") pod \"nova-api-db-create-lncvr\" (UID: \"f7ead6de-7881-4fe3-810e-6812c8a0669d\") " pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.231074 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-775zn\" (UniqueName: \"kubernetes.io/projected/f7ead6de-7881-4fe3-810e-6812c8a0669d-kube-api-access-775zn\") pod \"nova-api-db-create-lncvr\" (UID: \"f7ead6de-7881-4fe3-810e-6812c8a0669d\") " pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.269963 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bv2sm"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.271194 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.291545 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1d5a-account-create-update-fpm7x"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.292638 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.294302 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.298052 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bv2sm"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.317249 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-operator-scripts\") pod \"nova-cell0-db-create-t2z9m\" (UID: \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\") " pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.317384 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4xlh\" (UniqueName: \"kubernetes.io/projected/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-kube-api-access-x4xlh\") pod \"nova-cell0-db-create-t2z9m\" (UID: \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\") " pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.318120 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-operator-scripts\") pod \"nova-cell0-db-create-t2z9m\" (UID: \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\") " pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.338159 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1d5a-account-create-update-fpm7x"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.354759 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4xlh\" (UniqueName: \"kubernetes.io/projected/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-kube-api-access-x4xlh\") pod \"nova-cell0-db-create-t2z9m\" (UID: \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\") " pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.371017 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.419515 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7wdx\" (UniqueName: \"kubernetes.io/projected/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-kube-api-access-l7wdx\") pod \"nova-cell1-db-create-bv2sm\" (UID: \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\") " pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.419575 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh8b6\" (UniqueName: \"kubernetes.io/projected/7790cbf8-4ff7-4061-a33a-8468d51abe63-kube-api-access-mh8b6\") pod \"nova-api-1d5a-account-create-update-fpm7x\" (UID: \"7790cbf8-4ff7-4061-a33a-8468d51abe63\") " pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.419812 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-operator-scripts\") pod \"nova-cell1-db-create-bv2sm\" (UID: \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\") " pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.420053 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7790cbf8-4ff7-4061-a33a-8468d51abe63-operator-scripts\") pod \"nova-api-1d5a-account-create-update-fpm7x\" (UID: \"7790cbf8-4ff7-4061-a33a-8468d51abe63\") " pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.464186 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4a2e-account-create-update-hd9cg"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.465267 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.467056 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.468232 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.496374 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4a2e-account-create-update-hd9cg"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.521399 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh8b6\" (UniqueName: \"kubernetes.io/projected/7790cbf8-4ff7-4061-a33a-8468d51abe63-kube-api-access-mh8b6\") pod \"nova-api-1d5a-account-create-update-fpm7x\" (UID: \"7790cbf8-4ff7-4061-a33a-8468d51abe63\") " pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.521705 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-operator-scripts\") pod \"nova-cell1-db-create-bv2sm\" (UID: \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\") " pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.521916 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7790cbf8-4ff7-4061-a33a-8468d51abe63-operator-scripts\") pod \"nova-api-1d5a-account-create-update-fpm7x\" (UID: \"7790cbf8-4ff7-4061-a33a-8468d51abe63\") " pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.522009 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7wdx\" (UniqueName: \"kubernetes.io/projected/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-kube-api-access-l7wdx\") pod \"nova-cell1-db-create-bv2sm\" (UID: \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\") " pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.522592 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7790cbf8-4ff7-4061-a33a-8468d51abe63-operator-scripts\") pod \"nova-api-1d5a-account-create-update-fpm7x\" (UID: \"7790cbf8-4ff7-4061-a33a-8468d51abe63\") " pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.523391 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-operator-scripts\") pod \"nova-cell1-db-create-bv2sm\" (UID: \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\") " pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.539565 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7wdx\" (UniqueName: \"kubernetes.io/projected/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-kube-api-access-l7wdx\") pod \"nova-cell1-db-create-bv2sm\" (UID: \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\") " pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.540340 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh8b6\" (UniqueName: \"kubernetes.io/projected/7790cbf8-4ff7-4061-a33a-8468d51abe63-kube-api-access-mh8b6\") pod \"nova-api-1d5a-account-create-update-fpm7x\" (UID: \"7790cbf8-4ff7-4061-a33a-8468d51abe63\") " pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.616312 4756 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.623409 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-operator-scripts\") pod \"nova-cell0-4a2e-account-create-update-hd9cg\" (UID: \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\") " pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.623554 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2c4t\" (UniqueName: \"kubernetes.io/projected/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-kube-api-access-j2c4t\") pod \"nova-cell0-4a2e-account-create-update-hd9cg\" (UID: \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\") " pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.624377 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.660312 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-35b0-account-create-update-rjv9j"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.663311 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.666763 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.667867 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-35b0-account-create-update-rjv9j"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.724710 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q7d5\" (UniqueName: \"kubernetes.io/projected/75698abd-8581-4a45-b861-f397a191a09e-kube-api-access-7q7d5\") pod \"nova-cell1-35b0-account-create-update-rjv9j\" (UID: \"75698abd-8581-4a45-b861-f397a191a09e\") " pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.724769 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-operator-scripts\") pod \"nova-cell0-4a2e-account-create-update-hd9cg\" (UID: \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\") " pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.725244 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75698abd-8581-4a45-b861-f397a191a09e-operator-scripts\") pod \"nova-cell1-35b0-account-create-update-rjv9j\" (UID: \"75698abd-8581-4a45-b861-f397a191a09e\") " pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.725755 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2c4t\" (UniqueName: \"kubernetes.io/projected/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-kube-api-access-j2c4t\") pod 
\"nova-cell0-4a2e-account-create-update-hd9cg\" (UID: \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\") " pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.727210 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-operator-scripts\") pod \"nova-cell0-4a2e-account-create-update-hd9cg\" (UID: \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\") " pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.744197 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2c4t\" (UniqueName: \"kubernetes.io/projected/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-kube-api-access-j2c4t\") pod \"nova-cell0-4a2e-account-create-update-hd9cg\" (UID: \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\") " pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.827700 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75698abd-8581-4a45-b861-f397a191a09e-operator-scripts\") pod \"nova-cell1-35b0-account-create-update-rjv9j\" (UID: \"75698abd-8581-4a45-b861-f397a191a09e\") " pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.827820 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q7d5\" (UniqueName: \"kubernetes.io/projected/75698abd-8581-4a45-b861-f397a191a09e-kube-api-access-7q7d5\") pod \"nova-cell1-35b0-account-create-update-rjv9j\" (UID: \"75698abd-8581-4a45-b861-f397a191a09e\") " pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.828371 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75698abd-8581-4a45-b861-f397a191a09e-operator-scripts\") pod \"nova-cell1-35b0-account-create-update-rjv9j\" (UID: \"75698abd-8581-4a45-b861-f397a191a09e\") " pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.842194 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lncvr"] Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.845012 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q7d5\" (UniqueName: \"kubernetes.io/projected/75698abd-8581-4a45-b861-f397a191a09e-kube-api-access-7q7d5\") pod \"nova-cell1-35b0-account-create-update-rjv9j\" (UID: \"75698abd-8581-4a45-b861-f397a191a09e\") " pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.855625 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.982049 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:23 crc kubenswrapper[4756]: I1205 17:55:23.988922 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t2z9m"] Dec 05 17:55:24 crc kubenswrapper[4756]: W1205 17:55:24.009355 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbd2a51b_4e22_45f7_b95a_5f95837d5a11.slice/crio-a64c9de112803a23ed2b1e9f7c89b6791c63275d8bcac273b536d6e6a341be8c WatchSource:0}: Error finding container a64c9de112803a23ed2b1e9f7c89b6791c63275d8bcac273b536d6e6a341be8c: Status 404 returned error can't find the container with id a64c9de112803a23ed2b1e9f7c89b6791c63275d8bcac273b536d6e6a341be8c Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.063232 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bv2sm"] Dec 05 17:55:24 crc kubenswrapper[4756]: W1205 17:55:24.081348 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e04f6fd_6dd5_4eb3_8af2_ba12964173fc.slice/crio-2b5dede8ec1d5fb99f4f6b88143521f4d988c17ca3a1a677cb4825b2931ac9d2 WatchSource:0}: Error finding container 2b5dede8ec1d5fb99f4f6b88143521f4d988c17ca3a1a677cb4825b2931ac9d2: Status 404 returned error can't find the container with id 2b5dede8ec1d5fb99f4f6b88143521f4d988c17ca3a1a677cb4825b2931ac9d2 Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.156143 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1d5a-account-create-update-fpm7x"] Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.324709 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4a2e-account-create-update-hd9cg"] Dec 05 17:55:24 crc kubenswrapper[4756]: W1205 17:55:24.398571 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8cfbbe8_ec99_4a0c_a960_7faf5c545935.slice/crio-8637ad0e5ddb4b28960233935927a66b95e7c385741ccafca65c029d539d9190 WatchSource:0}: Error finding container 8637ad0e5ddb4b28960233935927a66b95e7c385741ccafca65c029d539d9190: Status 404 returned error can't find the container with id 8637ad0e5ddb4b28960233935927a66b95e7c385741ccafca65c029d539d9190 Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.487042 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-35b0-account-create-update-rjv9j"] Dec 05 17:55:24 crc kubenswrapper[4756]: W1205 17:55:24.501705 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75698abd_8581_4a45_b861_f397a191a09e.slice/crio-d92a1fbd6ef3246e026146be46fac42cd15babba016f76c0d71e0742861723b5 WatchSource:0}: Error finding container d92a1fbd6ef3246e026146be46fac42cd15babba016f76c0d71e0742861723b5: Status 404 returned error can't find the container with id d92a1fbd6ef3246e026146be46fac42cd15babba016f76c0d71e0742861723b5 Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.505313 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" event={"ID":"e8cfbbe8-ec99-4a0c-a960-7faf5c545935","Type":"ContainerStarted","Data":"8637ad0e5ddb4b28960233935927a66b95e7c385741ccafca65c029d539d9190"} Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.509852 4756 generic.go:334] "Generic (PLEG): container 
finished" podID="f7ead6de-7881-4fe3-810e-6812c8a0669d" containerID="fd7fff6a580f35fd53cbc8830b129d4787bd28a52bf8cfd666ebe25d217791da" exitCode=0 Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.509904 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lncvr" event={"ID":"f7ead6de-7881-4fe3-810e-6812c8a0669d","Type":"ContainerDied","Data":"fd7fff6a580f35fd53cbc8830b129d4787bd28a52bf8cfd666ebe25d217791da"} Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.509925 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lncvr" event={"ID":"f7ead6de-7881-4fe3-810e-6812c8a0669d","Type":"ContainerStarted","Data":"07d7757ab4a3fcfd20f4c5fa1cc3b1ccb88685654a578178ef64d705ae872747"} Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.512353 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bv2sm" event={"ID":"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc","Type":"ContainerStarted","Data":"2b5dede8ec1d5fb99f4f6b88143521f4d988c17ca3a1a677cb4825b2931ac9d2"} Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.514144 4756 generic.go:334] "Generic (PLEG): container finished" podID="bbd2a51b-4e22-45f7-b95a-5f95837d5a11" containerID="bc0ffb0f85716f7ace17617226442dbacc2a04251cb570cbba06ba02bd834aaa" exitCode=0 Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.514187 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t2z9m" event={"ID":"bbd2a51b-4e22-45f7-b95a-5f95837d5a11","Type":"ContainerDied","Data":"bc0ffb0f85716f7ace17617226442dbacc2a04251cb570cbba06ba02bd834aaa"} Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.514203 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t2z9m" event={"ID":"bbd2a51b-4e22-45f7-b95a-5f95837d5a11","Type":"ContainerStarted","Data":"a64c9de112803a23ed2b1e9f7c89b6791c63275d8bcac273b536d6e6a341be8c"} Dec 05 17:55:24 crc kubenswrapper[4756]: I1205 17:55:24.517492 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1d5a-account-create-update-fpm7x" event={"ID":"7790cbf8-4ff7-4061-a33a-8468d51abe63","Type":"ContainerStarted","Data":"3b5d6ad73c2cc22980d158118aa16e7fcd8fba424f4e95de69f1c49860e5196c"} Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.529608 4756 generic.go:334] "Generic (PLEG): container finished" podID="75698abd-8581-4a45-b861-f397a191a09e" containerID="7bc4caa0f637823e9d8a1ae823b3d0cdf78830820a3bc1abcbea9dc8c269d92b" exitCode=0 Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.529697 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" event={"ID":"75698abd-8581-4a45-b861-f397a191a09e","Type":"ContainerDied","Data":"7bc4caa0f637823e9d8a1ae823b3d0cdf78830820a3bc1abcbea9dc8c269d92b"} Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.530104 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" event={"ID":"75698abd-8581-4a45-b861-f397a191a09e","Type":"ContainerStarted","Data":"d92a1fbd6ef3246e026146be46fac42cd15babba016f76c0d71e0742861723b5"} Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.532511 4756 generic.go:334] "Generic (PLEG): container finished" podID="7790cbf8-4ff7-4061-a33a-8468d51abe63" containerID="99f940b808d324a18dbae92f7c4a8cebcb8fe7a4aedb4c6c1221e450076618c9" exitCode=0 Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.532674 4756 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-1d5a-account-create-update-fpm7x" event={"ID":"7790cbf8-4ff7-4061-a33a-8468d51abe63","Type":"ContainerDied","Data":"99f940b808d324a18dbae92f7c4a8cebcb8fe7a4aedb4c6c1221e450076618c9"} Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.534998 4756 generic.go:334] "Generic (PLEG): container finished" podID="e8cfbbe8-ec99-4a0c-a960-7faf5c545935" containerID="e2bd8c4dea1f9017a0a23145fabace2f0947a1c1d65c2709bb7864a2ae6a2c50" exitCode=0 Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.535111 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" event={"ID":"e8cfbbe8-ec99-4a0c-a960-7faf5c545935","Type":"ContainerDied","Data":"e2bd8c4dea1f9017a0a23145fabace2f0947a1c1d65c2709bb7864a2ae6a2c50"} Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.537028 4756 generic.go:334] "Generic (PLEG): container finished" podID="4e04f6fd-6dd5-4eb3-8af2-ba12964173fc" containerID="f9ad32dd323df73b2b5c6ca4c082d12426c8e7a9df13805713d1b3729f2816b7" exitCode=0 Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.537308 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bv2sm" event={"ID":"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc","Type":"ContainerDied","Data":"f9ad32dd323df73b2b5c6ca4c082d12426c8e7a9df13805713d1b3729f2816b7"} Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.980506 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:25 crc kubenswrapper[4756]: I1205 17:55:25.994753 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.050726 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:55:26 crc kubenswrapper[4756]: E1205 17:55:26.051503 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.081057 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ead6de-7881-4fe3-810e-6812c8a0669d-operator-scripts\") pod \"f7ead6de-7881-4fe3-810e-6812c8a0669d\" (UID: \"f7ead6de-7881-4fe3-810e-6812c8a0669d\") " Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.081175 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-775zn\" (UniqueName: \"kubernetes.io/projected/f7ead6de-7881-4fe3-810e-6812c8a0669d-kube-api-access-775zn\") pod \"f7ead6de-7881-4fe3-810e-6812c8a0669d\" (UID: \"f7ead6de-7881-4fe3-810e-6812c8a0669d\") " Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.081370 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-operator-scripts\") pod \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\" (UID: \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\") " Dec 05 17:55:26 crc kubenswrapper[4756]: 
I1205 17:55:26.081404 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4xlh\" (UniqueName: \"kubernetes.io/projected/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-kube-api-access-x4xlh\") pod \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\" (UID: \"bbd2a51b-4e22-45f7-b95a-5f95837d5a11\") " Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.082092 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ead6de-7881-4fe3-810e-6812c8a0669d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7ead6de-7881-4fe3-810e-6812c8a0669d" (UID: "f7ead6de-7881-4fe3-810e-6812c8a0669d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.087587 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-kube-api-access-x4xlh" (OuterVolumeSpecName: "kube-api-access-x4xlh") pod "bbd2a51b-4e22-45f7-b95a-5f95837d5a11" (UID: "bbd2a51b-4e22-45f7-b95a-5f95837d5a11"). InnerVolumeSpecName "kube-api-access-x4xlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.087970 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbd2a51b-4e22-45f7-b95a-5f95837d5a11" (UID: "bbd2a51b-4e22-45f7-b95a-5f95837d5a11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.102717 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ead6de-7881-4fe3-810e-6812c8a0669d-kube-api-access-775zn" (OuterVolumeSpecName: "kube-api-access-775zn") pod "f7ead6de-7881-4fe3-810e-6812c8a0669d" (UID: "f7ead6de-7881-4fe3-810e-6812c8a0669d"). InnerVolumeSpecName "kube-api-access-775zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.186555 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.186592 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4xlh\" (UniqueName: \"kubernetes.io/projected/bbd2a51b-4e22-45f7-b95a-5f95837d5a11-kube-api-access-x4xlh\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.186603 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ead6de-7881-4fe3-810e-6812c8a0669d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.186635 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-775zn\" (UniqueName: \"kubernetes.io/projected/f7ead6de-7881-4fe3-810e-6812c8a0669d-kube-api-access-775zn\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.549201 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lncvr" event={"ID":"f7ead6de-7881-4fe3-810e-6812c8a0669d","Type":"ContainerDied","Data":"07d7757ab4a3fcfd20f4c5fa1cc3b1ccb88685654a578178ef64d705ae872747"} Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.549260 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07d7757ab4a3fcfd20f4c5fa1cc3b1ccb88685654a578178ef64d705ae872747" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.549886 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lncvr" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.551872 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t2z9m" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.551889 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t2z9m" event={"ID":"bbd2a51b-4e22-45f7-b95a-5f95837d5a11","Type":"ContainerDied","Data":"a64c9de112803a23ed2b1e9f7c89b6791c63275d8bcac273b536d6e6a341be8c"} Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.552018 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a64c9de112803a23ed2b1e9f7c89b6791c63275d8bcac273b536d6e6a341be8c" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.962831 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:26 crc kubenswrapper[4756]: I1205 17:55:26.963653 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.105041 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.111381 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.114544 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7790cbf8-4ff7-4061-a33a-8468d51abe63-operator-scripts\") pod \"7790cbf8-4ff7-4061-a33a-8468d51abe63\" (UID: \"7790cbf8-4ff7-4061-a33a-8468d51abe63\") " Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.114656 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh8b6\" (UniqueName: \"kubernetes.io/projected/7790cbf8-4ff7-4061-a33a-8468d51abe63-kube-api-access-mh8b6\") pod \"7790cbf8-4ff7-4061-a33a-8468d51abe63\" (UID: \"7790cbf8-4ff7-4061-a33a-8468d51abe63\") " Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.114732 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-operator-scripts\") pod \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\" (UID: \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\") " Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.114936 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2c4t\" (UniqueName: \"kubernetes.io/projected/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-kube-api-access-j2c4t\") pod \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\" (UID: \"e8cfbbe8-ec99-4a0c-a960-7faf5c545935\") " Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.114996 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7790cbf8-4ff7-4061-a33a-8468d51abe63-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7790cbf8-4ff7-4061-a33a-8468d51abe63" (UID: "7790cbf8-4ff7-4061-a33a-8468d51abe63"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.116013 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8cfbbe8-ec99-4a0c-a960-7faf5c545935" (UID: "e8cfbbe8-ec99-4a0c-a960-7faf5c545935"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.116683 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.116729 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7790cbf8-4ff7-4061-a33a-8468d51abe63-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.117633 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7790cbf8-4ff7-4061-a33a-8468d51abe63-kube-api-access-mh8b6" (OuterVolumeSpecName: "kube-api-access-mh8b6") pod "7790cbf8-4ff7-4061-a33a-8468d51abe63" (UID: "7790cbf8-4ff7-4061-a33a-8468d51abe63"). InnerVolumeSpecName "kube-api-access-mh8b6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.122650 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-kube-api-access-j2c4t" (OuterVolumeSpecName: "kube-api-access-j2c4t") pod "e8cfbbe8-ec99-4a0c-a960-7faf5c545935" (UID: "e8cfbbe8-ec99-4a0c-a960-7faf5c545935"). InnerVolumeSpecName "kube-api-access-j2c4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.217969 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q7d5\" (UniqueName: \"kubernetes.io/projected/75698abd-8581-4a45-b861-f397a191a09e-kube-api-access-7q7d5\") pod \"75698abd-8581-4a45-b861-f397a191a09e\" (UID: \"75698abd-8581-4a45-b861-f397a191a09e\") " Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.218130 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75698abd-8581-4a45-b861-f397a191a09e-operator-scripts\") pod \"75698abd-8581-4a45-b861-f397a191a09e\" (UID: \"75698abd-8581-4a45-b861-f397a191a09e\") " Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.218224 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-operator-scripts\") pod \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\" (UID: \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\") " Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.218327 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7wdx\" (UniqueName: \"kubernetes.io/projected/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-kube-api-access-l7wdx\") pod \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\" (UID: \"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc\") " Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.218676 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75698abd-8581-4a45-b861-f397a191a09e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75698abd-8581-4a45-b861-f397a191a09e" (UID: "75698abd-8581-4a45-b861-f397a191a09e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.218705 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e04f6fd-6dd5-4eb3-8af2-ba12964173fc" (UID: "4e04f6fd-6dd5-4eb3-8af2-ba12964173fc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.219012 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75698abd-8581-4a45-b861-f397a191a09e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.219032 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2c4t\" (UniqueName: \"kubernetes.io/projected/e8cfbbe8-ec99-4a0c-a960-7faf5c545935-kube-api-access-j2c4t\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.219043 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.219054 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh8b6\" (UniqueName: \"kubernetes.io/projected/7790cbf8-4ff7-4061-a33a-8468d51abe63-kube-api-access-mh8b6\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.221210 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75698abd-8581-4a45-b861-f397a191a09e-kube-api-access-7q7d5" (OuterVolumeSpecName: "kube-api-access-7q7d5") pod "75698abd-8581-4a45-b861-f397a191a09e" (UID: "75698abd-8581-4a45-b861-f397a191a09e"). InnerVolumeSpecName "kube-api-access-7q7d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.223076 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-kube-api-access-l7wdx" (OuterVolumeSpecName: "kube-api-access-l7wdx") pod "4e04f6fd-6dd5-4eb3-8af2-ba12964173fc" (UID: "4e04f6fd-6dd5-4eb3-8af2-ba12964173fc"). InnerVolumeSpecName "kube-api-access-l7wdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.320816 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7wdx\" (UniqueName: \"kubernetes.io/projected/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc-kube-api-access-l7wdx\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.320843 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q7d5\" (UniqueName: \"kubernetes.io/projected/75698abd-8581-4a45-b861-f397a191a09e-kube-api-access-7q7d5\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.566064 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bv2sm" event={"ID":"4e04f6fd-6dd5-4eb3-8af2-ba12964173fc","Type":"ContainerDied","Data":"2b5dede8ec1d5fb99f4f6b88143521f4d988c17ca3a1a677cb4825b2931ac9d2"} Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.566120 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b5dede8ec1d5fb99f4f6b88143521f4d988c17ca3a1a677cb4825b2931ac9d2" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.566087 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bv2sm" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.568973 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" event={"ID":"75698abd-8581-4a45-b861-f397a191a09e","Type":"ContainerDied","Data":"d92a1fbd6ef3246e026146be46fac42cd15babba016f76c0d71e0742861723b5"} Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.569030 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-35b0-account-create-update-rjv9j" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.569036 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92a1fbd6ef3246e026146be46fac42cd15babba016f76c0d71e0742861723b5" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.571856 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1d5a-account-create-update-fpm7x" event={"ID":"7790cbf8-4ff7-4061-a33a-8468d51abe63","Type":"ContainerDied","Data":"3b5d6ad73c2cc22980d158118aa16e7fcd8fba424f4e95de69f1c49860e5196c"} Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.571908 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b5d6ad73c2cc22980d158118aa16e7fcd8fba424f4e95de69f1c49860e5196c" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.571912 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1d5a-account-create-update-fpm7x" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.574032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" event={"ID":"e8cfbbe8-ec99-4a0c-a960-7faf5c545935","Type":"ContainerDied","Data":"8637ad0e5ddb4b28960233935927a66b95e7c385741ccafca65c029d539d9190"} Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.574054 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8637ad0e5ddb4b28960233935927a66b95e7c385741ccafca65c029d539d9190" Dec 05 17:55:27 crc kubenswrapper[4756]: I1205 17:55:27.574086 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4a2e-account-create-update-hd9cg" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.701852 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cc548"] Dec 05 17:55:28 crc kubenswrapper[4756]: E1205 17:55:28.702658 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd2a51b-4e22-45f7-b95a-5f95837d5a11" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.702674 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd2a51b-4e22-45f7-b95a-5f95837d5a11" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: E1205 17:55:28.702696 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ead6de-7881-4fe3-810e-6812c8a0669d" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.702708 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ead6de-7881-4fe3-810e-6812c8a0669d" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: E1205 17:55:28.702725 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7790cbf8-4ff7-4061-a33a-8468d51abe63" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.702735 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7790cbf8-4ff7-4061-a33a-8468d51abe63" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: E1205 17:55:28.702765 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75698abd-8581-4a45-b861-f397a191a09e" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.702773 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="75698abd-8581-4a45-b861-f397a191a09e" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: E1205 17:55:28.702799 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8cfbbe8-ec99-4a0c-a960-7faf5c545935" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.702808 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cfbbe8-ec99-4a0c-a960-7faf5c545935" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: E1205 17:55:28.702823 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e04f6fd-6dd5-4eb3-8af2-ba12964173fc" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.702831 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e04f6fd-6dd5-4eb3-8af2-ba12964173fc" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.703026 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8cfbbe8-ec99-4a0c-a960-7faf5c545935" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.703043 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7790cbf8-4ff7-4061-a33a-8468d51abe63" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.703065 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="75698abd-8581-4a45-b861-f397a191a09e" containerName="mariadb-account-create-update" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.703079 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f7ead6de-7881-4fe3-810e-6812c8a0669d" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.703098 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e04f6fd-6dd5-4eb3-8af2-ba12964173fc" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.703117 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd2a51b-4e22-45f7-b95a-5f95837d5a11" containerName="mariadb-database-create" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.703966 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.706224 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-s8d9c" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.707608 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.707722 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 17:55:28 crc kubenswrapper[4756]: I1205 17:55:28.709699 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cc548"] Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.011932 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.012031 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-config-data\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.012138 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-scripts\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.012176 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jsm\" (UniqueName: \"kubernetes.io/projected/44c361f5-fc03-4aa7-94a0-0a3e23537626-kube-api-access-m6jsm\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.113848 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-scripts\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.113901 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m6jsm\" (UniqueName: \"kubernetes.io/projected/44c361f5-fc03-4aa7-94a0-0a3e23537626-kube-api-access-m6jsm\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.113954 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.113997 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-config-data\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.122347 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-config-data\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.122841 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.129890 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-scripts\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.132200 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jsm\" (UniqueName: \"kubernetes.io/projected/44c361f5-fc03-4aa7-94a0-0a3e23537626-kube-api-access-m6jsm\") pod \"nova-cell0-conductor-db-sync-cc548\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.231606 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:29 crc kubenswrapper[4756]: I1205 17:55:29.744753 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cc548"] Dec 05 17:55:30 crc kubenswrapper[4756]: I1205 17:55:30.610270 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cc548" event={"ID":"44c361f5-fc03-4aa7-94a0-0a3e23537626","Type":"ContainerStarted","Data":"2096f8d3137aea9e3ac87d65dd4a98d9318d3a8e034372ee465958b51029fcad"} Dec 05 17:55:30 crc kubenswrapper[4756]: I1205 17:55:30.610700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cc548" event={"ID":"44c361f5-fc03-4aa7-94a0-0a3e23537626","Type":"ContainerStarted","Data":"20e7fbb4b69efe1c4dc1d1cd748a76d07de8ee6ee67c30d6ead24f39ec9163b8"} Dec 05 17:55:30 crc kubenswrapper[4756]: I1205 17:55:30.631369 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cc548" podStartSLOduration=2.6313452809999998 podStartE2EDuration="2.631345281s" podCreationTimestamp="2025-12-05 17:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:30.629844751 +0000 UTC m=+5990.868518273" watchObservedRunningTime="2025-12-05 17:55:30.631345281 +0000 UTC m=+5990.870018803" Dec 05 17:55:35 crc kubenswrapper[4756]: I1205 17:55:35.679020 4756 generic.go:334] "Generic (PLEG): container finished" podID="44c361f5-fc03-4aa7-94a0-0a3e23537626" containerID="2096f8d3137aea9e3ac87d65dd4a98d9318d3a8e034372ee465958b51029fcad" exitCode=0 Dec 05 17:55:35 crc kubenswrapper[4756]: I1205 17:55:35.679736 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cc548" event={"ID":"44c361f5-fc03-4aa7-94a0-0a3e23537626","Type":"ContainerDied","Data":"2096f8d3137aea9e3ac87d65dd4a98d9318d3a8e034372ee465958b51029fcad"} Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.145015 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.311076 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-combined-ca-bundle\") pod \"44c361f5-fc03-4aa7-94a0-0a3e23537626\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.311163 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-scripts\") pod \"44c361f5-fc03-4aa7-94a0-0a3e23537626\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.311329 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6jsm\" (UniqueName: \"kubernetes.io/projected/44c361f5-fc03-4aa7-94a0-0a3e23537626-kube-api-access-m6jsm\") pod \"44c361f5-fc03-4aa7-94a0-0a3e23537626\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.311466 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-config-data\") pod \"44c361f5-fc03-4aa7-94a0-0a3e23537626\" (UID: \"44c361f5-fc03-4aa7-94a0-0a3e23537626\") " Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.317120 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-scripts" (OuterVolumeSpecName: "scripts") pod "44c361f5-fc03-4aa7-94a0-0a3e23537626" (UID: "44c361f5-fc03-4aa7-94a0-0a3e23537626"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.317455 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c361f5-fc03-4aa7-94a0-0a3e23537626-kube-api-access-m6jsm" (OuterVolumeSpecName: "kube-api-access-m6jsm") pod "44c361f5-fc03-4aa7-94a0-0a3e23537626" (UID: "44c361f5-fc03-4aa7-94a0-0a3e23537626"). InnerVolumeSpecName "kube-api-access-m6jsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.339274 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44c361f5-fc03-4aa7-94a0-0a3e23537626" (UID: "44c361f5-fc03-4aa7-94a0-0a3e23537626"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.366434 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-config-data" (OuterVolumeSpecName: "config-data") pod "44c361f5-fc03-4aa7-94a0-0a3e23537626" (UID: "44c361f5-fc03-4aa7-94a0-0a3e23537626"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.413954 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.413980 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.413991 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c361f5-fc03-4aa7-94a0-0a3e23537626-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.414000 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6jsm\" (UniqueName: \"kubernetes.io/projected/44c361f5-fc03-4aa7-94a0-0a3e23537626-kube-api-access-m6jsm\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.715966 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cc548" event={"ID":"44c361f5-fc03-4aa7-94a0-0a3e23537626","Type":"ContainerDied","Data":"20e7fbb4b69efe1c4dc1d1cd748a76d07de8ee6ee67c30d6ead24f39ec9163b8"} Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.716028 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e7fbb4b69efe1c4dc1d1cd748a76d07de8ee6ee67c30d6ead24f39ec9163b8" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.716134 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cc548" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.823013 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 17:55:37 crc kubenswrapper[4756]: E1205 17:55:37.824037 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c361f5-fc03-4aa7-94a0-0a3e23537626" containerName="nova-cell0-conductor-db-sync" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.824072 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c361f5-fc03-4aa7-94a0-0a3e23537626" containerName="nova-cell0-conductor-db-sync" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.825254 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c361f5-fc03-4aa7-94a0-0a3e23537626" containerName="nova-cell0-conductor-db-sync" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.826616 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.829570 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.829904 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-s8d9c" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.855305 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.925900 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.925970 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qqmf\" (UniqueName: \"kubernetes.io/projected/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-kube-api-access-6qqmf\") pod \"nova-cell0-conductor-0\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:37 crc kubenswrapper[4756]: I1205 17:55:37.926377 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.028239 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.028389 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.028469 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qqmf\" (UniqueName: \"kubernetes.io/projected/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-kube-api-access-6qqmf\") pod \"nova-cell0-conductor-0\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.034799 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.044337 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.062067 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qqmf\" (UniqueName: \"kubernetes.io/projected/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-kube-api-access-6qqmf\") pod \"nova-cell0-conductor-0\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.186880 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.731673 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 17:55:38 crc kubenswrapper[4756]: I1205 17:55:38.740032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d","Type":"ContainerStarted","Data":"0c1761025092ba7b5de0bd1ff74ff12a0dd22a01af8d2ca838a8efd0ed87e676"} Dec 05 17:55:39 crc kubenswrapper[4756]: I1205 17:55:39.051092 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:55:39 crc kubenswrapper[4756]: E1205 17:55:39.052166 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:55:39 crc kubenswrapper[4756]: I1205 17:55:39.751992 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d","Type":"ContainerStarted","Data":"8ac544e6acb53e59340961cb347b29993be336ec7ebc42fa894e410a02fbe8ba"} Dec 05 17:55:39 crc kubenswrapper[4756]: I1205 17:55:39.752216 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:39 crc kubenswrapper[4756]: I1205 17:55:39.793634 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.793606554 podStartE2EDuration="2.793606554s" podCreationTimestamp="2025-12-05 17:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:39.77579667 +0000 UTC m=+6000.014470222" watchObservedRunningTime="2025-12-05 17:55:39.793606554 +0000 UTC m=+6000.032280076" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.232794 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.728166 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cqrvr"] Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.730843 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.735232 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.735540 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.766318 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cqrvr"] Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.850255 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.851359 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.858864 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.860671 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-scripts\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.860722 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-config-data\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.860772 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.860873 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pb8\" (UniqueName: \"kubernetes.io/projected/e3aeab9a-21a1-4186-856a-a04db31f11e7-kube-api-access-f7pb8\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.862124 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.928982 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.930580 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.936741 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.948257 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.949712 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.964569 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.964659 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.965625 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-scripts\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.965661 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-config-data\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.965700 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.965746 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pb8\" (UniqueName: \"kubernetes.io/projected/e3aeab9a-21a1-4186-856a-a04db31f11e7-kube-api-access-f7pb8\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.965777 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-254z9\" (UniqueName: \"kubernetes.io/projected/68ced2e8-8358-4d3d-97cc-3037d96b0efe-kube-api-access-254z9\") pod \"nova-scheduler-0\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.965808 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-config-data\") pod \"nova-scheduler-0\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.965826 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.981869 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.983563 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-scripts\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.989057 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-config-data\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:43 crc kubenswrapper[4756]: I1205 17:55:43.994063 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.006498 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pb8\" (UniqueName: \"kubernetes.io/projected/e3aeab9a-21a1-4186-856a-a04db31f11e7-kube-api-access-f7pb8\") pod \"nova-cell0-cell-mapping-cqrvr\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.030578 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.032212 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.041482 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.062173 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067194 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067252 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5spx\" (UniqueName: \"kubernetes.io/projected/280ce474-232c-4966-b4a8-8d0a29284bb9-kube-api-access-c5spx\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067271 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067317 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/280ce474-232c-4966-b4a8-8d0a29284bb9-logs\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067346 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-254z9\" (UniqueName: \"kubernetes.io/projected/68ced2e8-8358-4d3d-97cc-3037d96b0efe-kube-api-access-254z9\") pod \"nova-scheduler-0\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067372 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-config-data\") pod \"nova-scheduler-0\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067391 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-config-data\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067409 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067449 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.067481 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tcwv\" (UniqueName: \"kubernetes.io/projected/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-kube-api-access-8tcwv\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.075827 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.077713 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84c55b5567-zxk59"] Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.098132 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-config-data\") pod \"nova-scheduler-0\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.141691 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-254z9\" (UniqueName: \"kubernetes.io/projected/68ced2e8-8358-4d3d-97cc-3037d96b0efe-kube-api-access-254z9\") pod \"nova-scheduler-0\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " pod="openstack/nova-scheduler-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.144508 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.160977 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.169826 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tcwv\" (UniqueName: \"kubernetes.io/projected/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-kube-api-access-8tcwv\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.169901 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.169937 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-logs\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.169958 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.170014 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c5spx\" (UniqueName: \"kubernetes.io/projected/280ce474-232c-4966-b4a8-8d0a29284bb9-kube-api-access-c5spx\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.170034 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.170051 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-config-data\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.170101 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/280ce474-232c-4966-b4a8-8d0a29284bb9-logs\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.170325 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.170874 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-config-data\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.170963 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-684rw\" (UniqueName: \"kubernetes.io/projected/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-kube-api-access-684rw\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.170992 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.171140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/280ce474-232c-4966-b4a8-8d0a29284bb9-logs\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.178737 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-config-data\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.182891 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.192458 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c55b5567-zxk59"] Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.216104 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.222121 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.224269 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tcwv\" (UniqueName: \"kubernetes.io/projected/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-kube-api-access-8tcwv\") pod \"nova-cell1-novncproxy-0\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.231341 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5spx\" (UniqueName: \"kubernetes.io/projected/280ce474-232c-4966-b4a8-8d0a29284bb9-kube-api-access-c5spx\") pod \"nova-metadata-0\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") " pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.232471 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.272373 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-logs\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.272411 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.272465 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-config-data\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.274389 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-nb\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.274427 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zngj\" (UniqueName: \"kubernetes.io/projected/7f523eda-94d0-4016-b592-009948e19b77-kube-api-access-8zngj\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.274480 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-sb\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.274507 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-config\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.274540 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-684rw\" (UniqueName: \"kubernetes.io/projected/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-kube-api-access-684rw\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.274570 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-dns-svc\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.277112 
4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.284665 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-logs\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.288246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-config-data\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.288572 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.299339 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-684rw\" (UniqueName: \"kubernetes.io/projected/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-kube-api-access-684rw\") pod \"nova-api-0\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.376484 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-dns-svc\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.376857 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-nb\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.376877 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zngj\" (UniqueName: \"kubernetes.io/projected/7f523eda-94d0-4016-b592-009948e19b77-kube-api-access-8zngj\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.376923 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-sb\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.376949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-config\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.377813 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-config\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.377851 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-dns-svc\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.378670 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-sb\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.378677 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-nb\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.397846 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zngj\" (UniqueName: \"kubernetes.io/projected/7f523eda-94d0-4016-b592-009948e19b77-kube-api-access-8zngj\") pod \"dnsmasq-dns-84c55b5567-zxk59\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.574874 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.599787 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.694694 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cqrvr"] Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.827734 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.850267 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cqrvr" event={"ID":"e3aeab9a-21a1-4186-856a-a04db31f11e7","Type":"ContainerStarted","Data":"b2832128f94f473b6fbebad5652601ed6b00a7177230b51bef853ad271d86485"} Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.851764 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"68ced2e8-8358-4d3d-97cc-3037d96b0efe","Type":"ContainerStarted","Data":"82f07a023ffcf2edf48726cde6b38e1337cc45f2da4bd3a15bd20bc4a2b0eb3e"} Dec 05 17:55:44 crc kubenswrapper[4756]: W1205 17:55:44.903160 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod280ce474_232c_4966_b4a8_8d0a29284bb9.slice/crio-82eff6b7b363e37f08aded9b292313278476d6e63fb37d73dceade7ed9dda580 WatchSource:0}: Error finding container 82eff6b7b363e37f08aded9b292313278476d6e63fb37d73dceade7ed9dda580: Status 404 returned error can't find the container with id 82eff6b7b363e37f08aded9b292313278476d6e63fb37d73dceade7ed9dda580 Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.903952 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 17:55:44 crc kubenswrapper[4756]: W1205 17:55:44.904696 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59e8e13b_377e_4a24_82f0_5d7a8e7fb6f1.slice/crio-777cc392b026ab15797afea867195e24748d81c7b147f0fedd97efb8f797cc0f WatchSource:0}: Error finding container 777cc392b026ab15797afea867195e24748d81c7b147f0fedd97efb8f797cc0f: Status 404 returned error can't find the container with id 777cc392b026ab15797afea867195e24748d81c7b147f0fedd97efb8f797cc0f Dec 05 17:55:44 crc kubenswrapper[4756]: I1205 17:55:44.914780 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.024962 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f6x8"] Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.026589 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.029326 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.029609 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.060825 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f6x8"] Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.114635 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.132003 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c55b5567-zxk59"] Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.208668 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.208865 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwg5\" (UniqueName: \"kubernetes.io/projected/b408fd08-c950-4490-800d-4167c055ed5a-kube-api-access-bqwg5\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.208945 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-config-data\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.209006 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-scripts\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.310248 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-scripts\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.310660 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.310753 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwg5\" (UniqueName: 
\"kubernetes.io/projected/b408fd08-c950-4490-800d-4167c055ed5a-kube-api-access-bqwg5\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.310821 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-config-data\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.319704 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-scripts\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.319720 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-config-data\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.319935 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.332432 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwg5\" (UniqueName: \"kubernetes.io/projected/b408fd08-c950-4490-800d-4167c055ed5a-kube-api-access-bqwg5\") pod \"nova-cell1-conductor-db-sync-8f6x8\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.579629 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.862808 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cqrvr" event={"ID":"e3aeab9a-21a1-4186-856a-a04db31f11e7","Type":"ContainerStarted","Data":"539ccc1002f311157024a76e2ad32d96e8bf4c542bd474ba228df54a3e992049"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.865656 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60","Type":"ContainerStarted","Data":"c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.865695 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60","Type":"ContainerStarted","Data":"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.865706 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60","Type":"ContainerStarted","Data":"2231e2575bc23c005fa004ac2d39d6209143d7cef8919d711e909aa75f0bbcd0"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.870182 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"280ce474-232c-4966-b4a8-8d0a29284bb9","Type":"ContainerStarted","Data":"278e3e3c5860511ba42bfca202cd2625abc7b8680977eef6dd13e571b2f8676a"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.870245 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"280ce474-232c-4966-b4a8-8d0a29284bb9","Type":"ContainerStarted","Data":"38c377442d861e04edc89a0063dfc6998a257429d9925d09c3ff8c819be63b98"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.870259 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"280ce474-232c-4966-b4a8-8d0a29284bb9","Type":"ContainerStarted","Data":"82eff6b7b363e37f08aded9b292313278476d6e63fb37d73dceade7ed9dda580"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.871958 4756 generic.go:334] "Generic (PLEG): container finished" podID="7f523eda-94d0-4016-b592-009948e19b77" containerID="c92a326622e25c684f1a58965f8c4f951f30711e67f14b1a56c247f43c3b32c7" exitCode=0 Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.872016 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" event={"ID":"7f523eda-94d0-4016-b592-009948e19b77","Type":"ContainerDied","Data":"c92a326622e25c684f1a58965f8c4f951f30711e67f14b1a56c247f43c3b32c7"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.872038 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" event={"ID":"7f523eda-94d0-4016-b592-009948e19b77","Type":"ContainerStarted","Data":"64ea579b4cf8691b2d9e5012757b7b7f27f91fcbcd346bb4483471bd70ae9665"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.876621 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"68ced2e8-8358-4d3d-97cc-3037d96b0efe","Type":"ContainerStarted","Data":"f3683aa1f216ea13bcc96af51848d8f7fae2586d8cbdc70542720cd6f7e57ff0"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.884606 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1","Type":"ContainerStarted","Data":"08db38a5694df68e74ca97017808dca4b7ea8d8643f6263239e9bb04ae4a3632"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.884646 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1","Type":"ContainerStarted","Data":"777cc392b026ab15797afea867195e24748d81c7b147f0fedd97efb8f797cc0f"} Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.885704 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cqrvr" podStartSLOduration=2.885685827 podStartE2EDuration="2.885685827s" podCreationTimestamp="2025-12-05 17:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:45.876269146 +0000 UTC m=+6006.114942658" watchObservedRunningTime="2025-12-05 17:55:45.885685827 +0000 UTC m=+6006.124359339" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.927360 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.927327546 podStartE2EDuration="2.927327546s" podCreationTimestamp="2025-12-05 17:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:45.916805386 +0000 UTC m=+6006.155478898" watchObservedRunningTime="2025-12-05 17:55:45.927327546 +0000 UTC m=+6006.166001058" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.944181 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.944164905 podStartE2EDuration="2.944164905s" podCreationTimestamp="2025-12-05 17:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:45.932090473 +0000 UTC m=+6006.170763995" watchObservedRunningTime="2025-12-05 17:55:45.944164905 +0000 UTC m=+6006.182838427" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.980576 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.9805592340000002 podStartE2EDuration="2.980559234s" podCreationTimestamp="2025-12-05 17:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:45.95788757 +0000 UTC m=+6006.196561092" watchObservedRunningTime="2025-12-05 17:55:45.980559234 +0000 UTC m=+6006.219232746" Dec 05 17:55:45 crc kubenswrapper[4756]: I1205 17:55:45.982276 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.98227031 podStartE2EDuration="2.98227031s" podCreationTimestamp="2025-12-05 17:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:45.977136953 +0000 UTC m=+6006.215810465" watchObservedRunningTime="2025-12-05 17:55:45.98227031 +0000 UTC m=+6006.220943812" Dec 05 17:55:46 crc kubenswrapper[4756]: W1205 17:55:46.062348 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb408fd08_c950_4490_800d_4167c055ed5a.slice/crio-fcda16953a35e7261d42472a1f7fd9938edadd4d31160a9b2ba5eed617c5407a WatchSource:0}: Error finding container fcda16953a35e7261d42472a1f7fd9938edadd4d31160a9b2ba5eed617c5407a: Status 404 returned error can't find the container with id fcda16953a35e7261d42472a1f7fd9938edadd4d31160a9b2ba5eed617c5407a
Dec 05 17:55:46 crc kubenswrapper[4756]: I1205 17:55:46.065402 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f6x8"]
Dec 05 17:55:46 crc kubenswrapper[4756]: I1205 17:55:46.899365 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" event={"ID":"7f523eda-94d0-4016-b592-009948e19b77","Type":"ContainerStarted","Data":"ffa6c89670f6d6c6ae2963cffe10e75b71dbdfa7288f0941bda2c003ce36fbdb"}
Dec 05 17:55:46 crc kubenswrapper[4756]: I1205 17:55:46.899922 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84c55b5567-zxk59"
Dec 05 17:55:46 crc kubenswrapper[4756]: I1205 17:55:46.901648 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" event={"ID":"b408fd08-c950-4490-800d-4167c055ed5a","Type":"ContainerStarted","Data":"0a520722fdae61242c11e7a24284b9cef58f1c5bcedd5c3460b72c55cee0776b"}
Dec 05 17:55:46 crc kubenswrapper[4756]: I1205 17:55:46.901916 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" event={"ID":"b408fd08-c950-4490-800d-4167c055ed5a","Type":"ContainerStarted","Data":"fcda16953a35e7261d42472a1f7fd9938edadd4d31160a9b2ba5eed617c5407a"}
Dec 05 17:55:46 crc kubenswrapper[4756]: I1205 17:55:46.926540 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" podStartSLOduration=3.926526185 podStartE2EDuration="3.926526185s" podCreationTimestamp="2025-12-05 17:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:46.924230944 +0000 UTC m=+6007.162904456" watchObservedRunningTime="2025-12-05 17:55:46.926526185 +0000 UTC m=+6007.165199697"
Dec 05 17:55:46 crc kubenswrapper[4756]: I1205 17:55:46.945826 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" podStartSLOduration=2.945803549 podStartE2EDuration="2.945803549s" podCreationTimestamp="2025-12-05 17:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:46.942626104 +0000 UTC m=+6007.181299616" watchObservedRunningTime="2025-12-05 17:55:46.945803549 +0000 UTC m=+6007.184477101"
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.402011 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.402253 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://08db38a5694df68e74ca97017808dca4b7ea8d8643f6263239e9bb04ae4a3632" gracePeriod=30
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.416283 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.416766 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerName="nova-metadata-log" containerID="cri-o://38c377442d861e04edc89a0063dfc6998a257429d9925d09c3ff8c819be63b98" gracePeriod=30
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.416909 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerName="nova-metadata-metadata" containerID="cri-o://278e3e3c5860511ba42bfca202cd2625abc7b8680977eef6dd13e571b2f8676a" gracePeriod=30
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.940263 4756 generic.go:334] "Generic (PLEG): container finished" podID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerID="278e3e3c5860511ba42bfca202cd2625abc7b8680977eef6dd13e571b2f8676a" exitCode=0
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.940543 4756 generic.go:334] "Generic (PLEG): container finished" podID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerID="38c377442d861e04edc89a0063dfc6998a257429d9925d09c3ff8c819be63b98" exitCode=143
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.940657 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"280ce474-232c-4966-b4a8-8d0a29284bb9","Type":"ContainerDied","Data":"278e3e3c5860511ba42bfca202cd2625abc7b8680977eef6dd13e571b2f8676a"}
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.940707 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"280ce474-232c-4966-b4a8-8d0a29284bb9","Type":"ContainerDied","Data":"38c377442d861e04edc89a0063dfc6998a257429d9925d09c3ff8c819be63b98"}
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.940719 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"280ce474-232c-4966-b4a8-8d0a29284bb9","Type":"ContainerDied","Data":"82eff6b7b363e37f08aded9b292313278476d6e63fb37d73dceade7ed9dda580"}
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.940728 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82eff6b7b363e37f08aded9b292313278476d6e63fb37d73dceade7ed9dda580"
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.942231 4756 generic.go:334] "Generic (PLEG): container finished" podID="59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" containerID="08db38a5694df68e74ca97017808dca4b7ea8d8643f6263239e9bb04ae4a3632" exitCode=0
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.942290 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1","Type":"ContainerDied","Data":"08db38a5694df68e74ca97017808dca4b7ea8d8643f6263239e9bb04ae4a3632"}
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.943758 4756 generic.go:334] "Generic (PLEG): container finished" podID="b408fd08-c950-4490-800d-4167c055ed5a" containerID="0a520722fdae61242c11e7a24284b9cef58f1c5bcedd5c3460b72c55cee0776b" exitCode=0
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.943809 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" event={"ID":"b408fd08-c950-4490-800d-4167c055ed5a","Type":"ContainerDied","Data":"0a520722fdae61242c11e7a24284b9cef58f1c5bcedd5c3460b72c55cee0776b"}
Dec 05 17:55:48 crc kubenswrapper[4756]: I1205 17:55:48.994225 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.080761 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.082184 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-combined-ca-bundle\") pod \"280ce474-232c-4966-b4a8-8d0a29284bb9\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") "
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.082239 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/280ce474-232c-4966-b4a8-8d0a29284bb9-logs\") pod \"280ce474-232c-4966-b4a8-8d0a29284bb9\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") "
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.082293 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5spx\" (UniqueName: \"kubernetes.io/projected/280ce474-232c-4966-b4a8-8d0a29284bb9-kube-api-access-c5spx\") pod \"280ce474-232c-4966-b4a8-8d0a29284bb9\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") "
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.082427 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-config-data\") pod \"280ce474-232c-4966-b4a8-8d0a29284bb9\" (UID: \"280ce474-232c-4966-b4a8-8d0a29284bb9\") "
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.082621 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280ce474-232c-4966-b4a8-8d0a29284bb9-logs" (OuterVolumeSpecName: "logs") pod "280ce474-232c-4966-b4a8-8d0a29284bb9" (UID: "280ce474-232c-4966-b4a8-8d0a29284bb9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.082909 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/280ce474-232c-4966-b4a8-8d0a29284bb9-logs\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.088761 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280ce474-232c-4966-b4a8-8d0a29284bb9-kube-api-access-c5spx" (OuterVolumeSpecName: "kube-api-access-c5spx") pod "280ce474-232c-4966-b4a8-8d0a29284bb9" (UID: "280ce474-232c-4966-b4a8-8d0a29284bb9"). InnerVolumeSpecName "kube-api-access-c5spx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.118551 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "280ce474-232c-4966-b4a8-8d0a29284bb9" (UID: "280ce474-232c-4966-b4a8-8d0a29284bb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.129749 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-config-data" (OuterVolumeSpecName: "config-data") pod "280ce474-232c-4966-b4a8-8d0a29284bb9" (UID: "280ce474-232c-4966-b4a8-8d0a29284bb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.170588 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.183320 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-config-data\") pod \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") "
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.183595 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-combined-ca-bundle\") pod \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") "
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.183630 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tcwv\" (UniqueName: \"kubernetes.io/projected/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-kube-api-access-8tcwv\") pod \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\" (UID: \"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1\") "
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.186344 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.186572 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ce474-232c-4966-b4a8-8d0a29284bb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.186600 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5spx\" (UniqueName: \"kubernetes.io/projected/280ce474-232c-4966-b4a8-8d0a29284bb9-kube-api-access-c5spx\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.189178 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-kube-api-access-8tcwv" (OuterVolumeSpecName: "kube-api-access-8tcwv") pod "59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" (UID: "59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1"). InnerVolumeSpecName "kube-api-access-8tcwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.213043 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-config-data" (OuterVolumeSpecName: "config-data") pod "59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" (UID: "59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.219352 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" (UID: "59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.288676 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.288929 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tcwv\" (UniqueName: \"kubernetes.io/projected/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-kube-api-access-8tcwv\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.289016 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.978807 4756 generic.go:334] "Generic (PLEG): container finished" podID="e3aeab9a-21a1-4186-856a-a04db31f11e7" containerID="539ccc1002f311157024a76e2ad32d96e8bf4c542bd474ba228df54a3e992049" exitCode=0
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.978911 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cqrvr" event={"ID":"e3aeab9a-21a1-4186-856a-a04db31f11e7","Type":"ContainerDied","Data":"539ccc1002f311157024a76e2ad32d96e8bf4c542bd474ba228df54a3e992049"}
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.984114 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1","Type":"ContainerDied","Data":"777cc392b026ab15797afea867195e24748d81c7b147f0fedd97efb8f797cc0f"}
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.984151 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.984198 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 17:55:49 crc kubenswrapper[4756]: I1205 17:55:49.984207 4756 scope.go:117] "RemoveContainer" containerID="08db38a5694df68e74ca97017808dca4b7ea8d8643f6263239e9bb04ae4a3632"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.089901 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.089937 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.114754 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.122207 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.131347 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 17:55:50 crc kubenswrapper[4756]: E1205 17:55:50.131855 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" containerName="nova-cell1-novncproxy-novncproxy"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.131877 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" containerName="nova-cell1-novncproxy-novncproxy"
Dec 05 17:55:50 crc kubenswrapper[4756]: E1205 17:55:50.131892 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerName="nova-metadata-metadata"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.131900 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerName="nova-metadata-metadata"
Dec 05 17:55:50 crc kubenswrapper[4756]: E1205 17:55:50.131924 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerName="nova-metadata-log"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.131933 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerName="nova-metadata-log"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.132158 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerName="nova-metadata-metadata"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.132213 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" containerName="nova-metadata-log"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.132239 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" containerName="nova-cell1-novncproxy-novncproxy"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.133470 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.134954 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.135734 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.136488 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.142298 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.144166 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.146990 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.147563 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.152572 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.162550 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312470 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312538 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312594 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312621 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312651 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0"
pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312669 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-config-data\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312689 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312709 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqjdk\" (UniqueName: \"kubernetes.io/projected/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-kube-api-access-hqjdk\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312897 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lvr5\" (UniqueName: \"kubernetes.io/projected/2151170c-3a66-44a7-9840-b1f78aba5db4-kube-api-access-7lvr5\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.312934 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-logs\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.367749 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430102 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430165 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430217 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430243 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-config-data\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430273 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430298 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqjdk\" (UniqueName: \"kubernetes.io/projected/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-kube-api-access-hqjdk\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430369 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lvr5\" (UniqueName: \"kubernetes.io/projected/2151170c-3a66-44a7-9840-b1f78aba5db4-kube-api-access-7lvr5\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430390 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-logs\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430462 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.430632 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.431346 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-logs\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.436143 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.436382 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-config-data\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.436628 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.437385 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.439220 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.442630 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.449538 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2151170c-3a66-44a7-9840-b1f78aba5db4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.457518 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lvr5\" (UniqueName: \"kubernetes.io/projected/2151170c-3a66-44a7-9840-b1f78aba5db4-kube-api-access-7lvr5\") pod \"nova-cell1-novncproxy-0\" (UID: \"2151170c-3a66-44a7-9840-b1f78aba5db4\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.458350 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqjdk\" (UniqueName: \"kubernetes.io/projected/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-kube-api-access-hqjdk\") pod \"nova-metadata-0\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") " pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.458813 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.466440 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.532171 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-scripts\") pod \"b408fd08-c950-4490-800d-4167c055ed5a\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.532295 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-combined-ca-bundle\") pod \"b408fd08-c950-4490-800d-4167c055ed5a\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.532360 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqwg5\" (UniqueName: \"kubernetes.io/projected/b408fd08-c950-4490-800d-4167c055ed5a-kube-api-access-bqwg5\") pod \"b408fd08-c950-4490-800d-4167c055ed5a\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.532398 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-config-data\") pod \"b408fd08-c950-4490-800d-4167c055ed5a\" (UID: \"b408fd08-c950-4490-800d-4167c055ed5a\") " Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.536840 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b408fd08-c950-4490-800d-4167c055ed5a-kube-api-access-bqwg5" (OuterVolumeSpecName: "kube-api-access-bqwg5") pod "b408fd08-c950-4490-800d-4167c055ed5a" (UID: "b408fd08-c950-4490-800d-4167c055ed5a"). InnerVolumeSpecName "kube-api-access-bqwg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.539126 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-scripts" (OuterVolumeSpecName: "scripts") pod "b408fd08-c950-4490-800d-4167c055ed5a" (UID: "b408fd08-c950-4490-800d-4167c055ed5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.557156 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-config-data" (OuterVolumeSpecName: "config-data") pod "b408fd08-c950-4490-800d-4167c055ed5a" (UID: "b408fd08-c950-4490-800d-4167c055ed5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.587624 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b408fd08-c950-4490-800d-4167c055ed5a" (UID: "b408fd08-c950-4490-800d-4167c055ed5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.634049 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqwg5\" (UniqueName: \"kubernetes.io/projected/b408fd08-c950-4490-800d-4167c055ed5a-kube-api-access-bqwg5\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.634078 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.634088 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.634098 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b408fd08-c950-4490-800d-4167c055ed5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:50 crc kubenswrapper[4756]: I1205 17:55:50.933365 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.017782 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2151170c-3a66-44a7-9840-b1f78aba5db4","Type":"ContainerStarted","Data":"4c8c763e00a092b679c35cfe823ab077d6ee965b5afa269f6343461edfbd4939"} Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.026822 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.026905 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8f6x8" event={"ID":"b408fd08-c950-4490-800d-4167c055ed5a","Type":"ContainerDied","Data":"fcda16953a35e7261d42472a1f7fd9938edadd4d31160a9b2ba5eed617c5407a"} Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.026951 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcda16953a35e7261d42472a1f7fd9938edadd4d31160a9b2ba5eed617c5407a" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.037103 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 17:55:51 crc kubenswrapper[4756]: E1205 17:55:51.037802 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b408fd08-c950-4490-800d-4167c055ed5a" containerName="nova-cell1-conductor-db-sync" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.037828 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b408fd08-c950-4490-800d-4167c055ed5a" containerName="nova-cell1-conductor-db-sync" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.038209 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b408fd08-c950-4490-800d-4167c055ed5a" containerName="nova-cell1-conductor-db-sync" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.039327 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.041491 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.056766 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:55:51 crc kubenswrapper[4756]: E1205 17:55:51.057394 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.080051 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.105472 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:51 crc kubenswrapper[4756]: W1205 17:55:51.127255 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab7a1d9_b1fe_46ee_8e95_e6b138c03205.slice/crio-5dfa69f9a8dd5e3e2863faf9e6d970f4b5476e031861f2764fd59226a0a3e238 WatchSource:0}: Error finding container 5dfa69f9a8dd5e3e2863faf9e6d970f4b5476e031861f2764fd59226a0a3e238: Status 404 returned error can't find the container with id 5dfa69f9a8dd5e3e2863faf9e6d970f4b5476e031861f2764fd59226a0a3e238 Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.147607 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.147650 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvdj\" (UniqueName: \"kubernetes.io/projected/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-kube-api-access-4vvdj\") pod \"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.147691 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.250017 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.250764 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvdj\" (UniqueName: \"kubernetes.io/projected/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-kube-api-access-4vvdj\") pod \"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.251178 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.253989 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.254750 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.267624 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvdj\" (UniqueName: \"kubernetes.io/projected/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-kube-api-access-4vvdj\") pod \"nova-cell1-conductor-0\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.301274 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.454254 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-scripts\") pod \"e3aeab9a-21a1-4186-856a-a04db31f11e7\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.454323 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-combined-ca-bundle\") pod \"e3aeab9a-21a1-4186-856a-a04db31f11e7\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.454352 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7pb8\" (UniqueName: \"kubernetes.io/projected/e3aeab9a-21a1-4186-856a-a04db31f11e7-kube-api-access-f7pb8\") pod \"e3aeab9a-21a1-4186-856a-a04db31f11e7\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.454387 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-config-data\") pod \"e3aeab9a-21a1-4186-856a-a04db31f11e7\" (UID: \"e3aeab9a-21a1-4186-856a-a04db31f11e7\") " Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.458579 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3aeab9a-21a1-4186-856a-a04db31f11e7-kube-api-access-f7pb8" (OuterVolumeSpecName: "kube-api-access-f7pb8") pod "e3aeab9a-21a1-4186-856a-a04db31f11e7" (UID: "e3aeab9a-21a1-4186-856a-a04db31f11e7"). InnerVolumeSpecName "kube-api-access-f7pb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.460480 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.460983 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-scripts" (OuterVolumeSpecName: "scripts") pod "e3aeab9a-21a1-4186-856a-a04db31f11e7" (UID: "e3aeab9a-21a1-4186-856a-a04db31f11e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.499952 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3aeab9a-21a1-4186-856a-a04db31f11e7" (UID: "e3aeab9a-21a1-4186-856a-a04db31f11e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.505557 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-config-data" (OuterVolumeSpecName: "config-data") pod "e3aeab9a-21a1-4186-856a-a04db31f11e7" (UID: "e3aeab9a-21a1-4186-856a-a04db31f11e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.558241 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.558267 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.558276 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7pb8\" (UniqueName: \"kubernetes.io/projected/e3aeab9a-21a1-4186-856a-a04db31f11e7-kube-api-access-f7pb8\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:51 crc kubenswrapper[4756]: I1205 17:55:51.558298 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3aeab9a-21a1-4186-856a-a04db31f11e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.008123 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 17:55:52 crc kubenswrapper[4756]: W1205 17:55:52.018499 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3e2ced8_3d41_4e67_afd7_f82da5f22f1d.slice/crio-48bbeacd7118fc16c61fdeaef7bed0b724b3619820ca80a3fe7c8a5c674e51d8 WatchSource:0}: Error finding container 48bbeacd7118fc16c61fdeaef7bed0b724b3619820ca80a3fe7c8a5c674e51d8: Status 404 returned error can't find the container with id 48bbeacd7118fc16c61fdeaef7bed0b724b3619820ca80a3fe7c8a5c674e51d8 Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.066846 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280ce474-232c-4966-b4a8-8d0a29284bb9" path="/var/lib/kubelet/pods/280ce474-232c-4966-b4a8-8d0a29284bb9/volumes" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.067734 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.068019 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1" path="/var/lib/kubelet/pods/59e8e13b-377e-4a24-82f0-5d7a8e7fb6f1/volumes" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.069684 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dab7a1d9-b1fe-46ee-8e95-e6b138c03205","Type":"ContainerStarted","Data":"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4"} Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.069734 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dab7a1d9-b1fe-46ee-8e95-e6b138c03205","Type":"ContainerStarted","Data":"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9"} Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.069757 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dab7a1d9-b1fe-46ee-8e95-e6b138c03205","Type":"ContainerStarted","Data":"5dfa69f9a8dd5e3e2863faf9e6d970f4b5476e031861f2764fd59226a0a3e238"} Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.069773 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2151170c-3a66-44a7-9840-b1f78aba5db4","Type":"ContainerStarted","Data":"ad6e04e17d41ec1f1222550bb98c75f945f86f3330bef7acb8f93769add0b35e"} Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.069790 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d","Type":"ContainerStarted","Data":"48bbeacd7118fc16c61fdeaef7bed0b724b3619820ca80a3fe7c8a5c674e51d8"} Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.069806 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cqrvr" event={"ID":"e3aeab9a-21a1-4186-856a-a04db31f11e7","Type":"ContainerDied","Data":"b2832128f94f473b6fbebad5652601ed6b00a7177230b51bef853ad271d86485"} Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.069827 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2832128f94f473b6fbebad5652601ed6b00a7177230b51bef853ad271d86485" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.097660 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.097642703 podStartE2EDuration="2.097642703s" podCreationTimestamp="2025-12-05 17:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:52.083989929 +0000 UTC m=+6012.322663451" watchObservedRunningTime="2025-12-05 17:55:52.097642703 +0000 UTC m=+6012.336316215" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.108623 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.108606275 podStartE2EDuration="2.108606275s" podCreationTimestamp="2025-12-05 17:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:52.104749832 +0000 UTC m=+6012.343423414" watchObservedRunningTime="2025-12-05 17:55:52.108606275 +0000 UTC m=+6012.347279787" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 
17:55:52.207544 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.208134 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="68ced2e8-8358-4d3d-97cc-3037d96b0efe" containerName="nova-scheduler-scheduler" containerID="cri-o://f3683aa1f216ea13bcc96af51848d8f7fae2586d8cbdc70542720cd6f7e57ff0" gracePeriod=30 Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.217924 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.218239 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerName="nova-api-log" containerID="cri-o://6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0" gracePeriod=30 Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.218448 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerName="nova-api-api" containerID="cri-o://c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3" gracePeriod=30 Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.274283 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.818198 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.985206 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-logs\") pod \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.985316 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-config-data\") pod \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.985491 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-logs" (OuterVolumeSpecName: "logs") pod "b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" (UID: "b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.985687 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-684rw\" (UniqueName: \"kubernetes.io/projected/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-kube-api-access-684rw\") pod \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.985744 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-combined-ca-bundle\") pod \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\" (UID: \"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60\") " Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.986459 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:52 crc kubenswrapper[4756]: I1205 17:55:52.991528 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-kube-api-access-684rw" (OuterVolumeSpecName: "kube-api-access-684rw") pod "b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" (UID: "b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60"). InnerVolumeSpecName "kube-api-access-684rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.009456 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" (UID: "b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.011606 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-config-data" (OuterVolumeSpecName: "config-data") pod "b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" (UID: "b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.089713 4756 generic.go:334] "Generic (PLEG): container finished" podID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerID="c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3" exitCode=0 Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.089764 4756 generic.go:334] "Generic (PLEG): container finished" podID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerID="6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0" exitCode=143 Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.089874 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60","Type":"ContainerDied","Data":"c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3"} Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.089915 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60","Type":"ContainerDied","Data":"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0"} Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.089933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60","Type":"ContainerDied","Data":"2231e2575bc23c005fa004ac2d39d6209143d7cef8919d711e909aa75f0bbcd0"} Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.089959 4756 scope.go:117] "RemoveContainer" containerID="c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.090128 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.090274 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-684rw\" (UniqueName: \"kubernetes.io/projected/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-kube-api-access-684rw\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.090756 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.090952 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.096678 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d","Type":"ContainerStarted","Data":"ca33974b6c799ac1fd07c5c1926a60110f1774b448b224e415720ed404ad30dc"} Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.096830 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.121495 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.121470748 podStartE2EDuration="2.121470748s" podCreationTimestamp="2025-12-05 17:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:53.110502176 +0000 UTC 
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.124724 4756 scope.go:117] "RemoveContainer" containerID="6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.148238 4756 scope.go:117] "RemoveContainer" containerID="c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.154516 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 05 17:55:53 crc kubenswrapper[4756]: E1205 17:55:53.155673 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3\": container with ID starting with c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3 not found: ID does not exist" containerID="c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.155717 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3"} err="failed to get container status \"c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3\": rpc error: code = NotFound desc = could not find container \"c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3\": container with ID starting with c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3 not found: ID does not exist"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.155744 4756 scope.go:117] "RemoveContainer" containerID="6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0"
Dec 05 17:55:53 crc kubenswrapper[4756]: E1205 17:55:53.156612 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0\": container with ID starting with 6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0 not found: ID does not exist" containerID="6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.156642 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0"} err="failed to get container status \"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0\": rpc error: code = NotFound desc = could not find container \"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0\": container with ID starting with 6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0 not found: ID does not exist"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.156660 4756 scope.go:117] "RemoveContainer" containerID="c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.156879 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3"} err="failed to get container status \"c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3\": container with ID starting with c1ec2489f4d00bd4dfe04e5ec81b876d0140ef44f8f957fbaa9c27e3ba2d5be3 not found: ID does not exist"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.156904 4756 scope.go:117] "RemoveContainer" containerID="6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.157546 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0"} err="failed to get container status \"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0\": rpc error: code = NotFound desc = could not find container \"6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0\": container with ID starting with 6d99bac05a2eca5082735ccbba82e061724261975991b65590a75f0e549ccdd0 not found: ID does not exist"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.166702 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.180553 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 05 17:55:53 crc kubenswrapper[4756]: E1205 17:55:53.182460 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerName="nova-api-api"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.182483 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerName="nova-api-api"
Dec 05 17:55:53 crc kubenswrapper[4756]: E1205 17:55:53.182497 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3aeab9a-21a1-4186-856a-a04db31f11e7" containerName="nova-manage"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.182507 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3aeab9a-21a1-4186-856a-a04db31f11e7" containerName="nova-manage"
Dec 05 17:55:53 crc kubenswrapper[4756]: E1205 17:55:53.182528 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerName="nova-api-log"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.182536 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerName="nova-api-log"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.182821 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerName="nova-api-api"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.182849 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" containerName="nova-api-log"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.182873 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3aeab9a-21a1-4186-856a-a04db31f11e7" containerName="nova-manage"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.184322 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
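The block above shows the kubelet calling RemoveContainer several times for IDs that CRI-O has already deleted, so each attempt fails with a NotFound status and a matching "DeleteContainer returned error" record; deletion is treated as idempotent, and during pod teardown these errors are harmless noise rather than a real failure. One rough way to confirm the pattern in an exported journal (kubelet.log is a hypothetical file name) is to count how often each container ID appears in such errors:

    import re
    from collections import Counter

    pat = re.compile(r'DeleteContainer returned error.*?"ID":"([0-9a-f]{64})"')
    counts = Counter()
    with open("kubelet.log") as fh:  # hypothetical journal export
        for line in fh:
            m = pat.search(line)
            if m:
                counts[m.group(1)[:12]] += 1
    # Repeated hits on the same ID indicate the benign double-delete race,
    # not an accumulating failure.
    print(counts.most_common())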
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.186765 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.193498 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.303764 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfmj7\" (UniqueName: \"kubernetes.io/projected/7a8480fe-0062-4c89-9f85-337058bdfc86-kube-api-access-xfmj7\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.303829 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.303908 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-config-data\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.304007 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8480fe-0062-4c89-9f85-337058bdfc86-logs\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.406352 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfmj7\" (UniqueName: \"kubernetes.io/projected/7a8480fe-0062-4c89-9f85-337058bdfc86-kube-api-access-xfmj7\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.406995 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.408088 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-config-data\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.408366 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8480fe-0062-4c89-9f85-337058bdfc86-logs\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.409128 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8480fe-0062-4c89-9f85-337058bdfc86-logs\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.412044 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.414241 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-config-data\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.427151 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfmj7\" (UniqueName: \"kubernetes.io/projected/7a8480fe-0062-4c89-9f85-337058bdfc86-kube-api-access-xfmj7\") pod \"nova-api-0\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " pod="openstack/nova-api-0"
Dec 05 17:55:53 crc kubenswrapper[4756]: I1205 17:55:53.509496 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.013930 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.068200 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60" path="/var/lib/kubelet/pods/b4b8b565-dd9a-4f9f-a576-7ef0ae8cca60/volumes"
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.112007 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a8480fe-0062-4c89-9f85-337058bdfc86","Type":"ContainerStarted","Data":"0c5e8299c7b9a3937110089a4f8dfbf3f8bab78882d33227aed922a1262bf3e7"}
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.115314 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerName="nova-metadata-log" containerID="cri-o://68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9" gracePeriod=30
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.115391 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerName="nova-metadata-metadata" containerID="cri-o://886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4" gracePeriod=30
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.603739 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84c55b5567-zxk59"
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.677993 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66bbb84ccc-jsvsh"]
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.678219 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" podUID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerName="dnsmasq-dns" containerID="cri-o://c67171f92813c91e25732d8318222028189777f18d2d30b866fd558ec69ed23c" gracePeriod=10
Dec 05 17:55:54 crc kubenswrapper[4756]: I1205 17:55:54.931301 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.063669 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-combined-ca-bundle\") pod \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") "
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.063785 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-nova-metadata-tls-certs\") pod \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") "
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.063808 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqjdk\" (UniqueName: \"kubernetes.io/projected/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-kube-api-access-hqjdk\") pod \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") "
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.063887 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-logs\") pod \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") "
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.063917 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-config-data\") pod \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\" (UID: \"dab7a1d9-b1fe-46ee-8e95-e6b138c03205\") "
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.064327 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-logs" (OuterVolumeSpecName: "logs") pod "dab7a1d9-b1fe-46ee-8e95-e6b138c03205" (UID: "dab7a1d9-b1fe-46ee-8e95-e6b138c03205"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.080984 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-kube-api-access-hqjdk" (OuterVolumeSpecName: "kube-api-access-hqjdk") pod "dab7a1d9-b1fe-46ee-8e95-e6b138c03205" (UID: "dab7a1d9-b1fe-46ee-8e95-e6b138c03205"). InnerVolumeSpecName "kube-api-access-hqjdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.106694 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dab7a1d9-b1fe-46ee-8e95-e6b138c03205" (UID: "dab7a1d9-b1fe-46ee-8e95-e6b138c03205"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.127015 4756 generic.go:334] "Generic (PLEG): container finished" podID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerID="c67171f92813c91e25732d8318222028189777f18d2d30b866fd558ec69ed23c" exitCode=0
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.127078 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" event={"ID":"439a3487-2bc9-4f44-af4c-02d5445fb349","Type":"ContainerDied","Data":"c67171f92813c91e25732d8318222028189777f18d2d30b866fd558ec69ed23c"}
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.128222 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a8480fe-0062-4c89-9f85-337058bdfc86","Type":"ContainerStarted","Data":"91fd11898f42a6a214a976dc66031dca8e1c9ce7698f6bf023d7a0b06b7a41f5"}
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.128243 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a8480fe-0062-4c89-9f85-337058bdfc86","Type":"ContainerStarted","Data":"8d42dd201cc5c0b28ffe5d0721a17988e76d3ee381803459e358f624256bf898"}
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.130658 4756 generic.go:334] "Generic (PLEG): container finished" podID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerID="886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4" exitCode=0
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.130677 4756 generic.go:334] "Generic (PLEG): container finished" podID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerID="68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9" exitCode=143
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.130692 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dab7a1d9-b1fe-46ee-8e95-e6b138c03205","Type":"ContainerDied","Data":"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4"}
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.130707 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dab7a1d9-b1fe-46ee-8e95-e6b138c03205","Type":"ContainerDied","Data":"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9"}
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.130700 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dab7a1d9-b1fe-46ee-8e95-e6b138c03205" (UID: "dab7a1d9-b1fe-46ee-8e95-e6b138c03205"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.130729 4756 scope.go:117] "RemoveContainer" containerID="886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.130719 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dab7a1d9-b1fe-46ee-8e95-e6b138c03205","Type":"ContainerDied","Data":"5dfa69f9a8dd5e3e2863faf9e6d970f4b5476e031861f2764fd59226a0a3e238"}
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.130834 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.133528 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-config-data" (OuterVolumeSpecName: "config-data") pod "dab7a1d9-b1fe-46ee-8e95-e6b138c03205" (UID: "dab7a1d9-b1fe-46ee-8e95-e6b138c03205"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.153352 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.1533350860000002 podStartE2EDuration="2.153335086s" podCreationTimestamp="2025-12-05 17:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:55.144224614 +0000 UTC m=+6015.382898126" watchObservedRunningTime="2025-12-05 17:55:55.153335086 +0000 UTC m=+6015.392008598"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.158819 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.160485 4756 scope.go:117] "RemoveContainer" containerID="68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.165773 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.165793 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.165804 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.165830 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqjdk\" (UniqueName: \"kubernetes.io/projected/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-kube-api-access-hqjdk\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.165839 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dab7a1d9-b1fe-46ee-8e95-e6b138c03205-logs\") on node \"crc\" DevicePath \"\""
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.217255 4756 scope.go:117] "RemoveContainer" containerID="886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4"
Dec 05 17:55:55 crc kubenswrapper[4756]: E1205 17:55:55.218477 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4\": container with ID starting with 886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4 not found: ID does not exist" containerID="886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.218536 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4"} err="failed to get container status \"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4\": rpc error: code = NotFound desc = could not find container \"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4\": container with ID starting with 886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4 not found: ID does not exist"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.218562 4756 scope.go:117] "RemoveContainer" containerID="68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9"
Dec 05 17:55:55 crc kubenswrapper[4756]: E1205 17:55:55.219911 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9\": container with ID starting with 68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9 not found: ID does not exist" containerID="68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.219953 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9"} err="failed to get container status \"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9\": rpc error: code = NotFound desc = could not find container \"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9\": container with ID starting with 68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9 not found: ID does not exist"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.219982 4756 scope.go:117] "RemoveContainer" containerID="886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.220265 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4"} err="failed to get container status \"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4\": rpc error: code = NotFound desc = could not find container \"886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4\": container with ID starting with 886017bd3ff2a48e4d2ec88f894e6304276ca4336d3462accf389ffdb7564bf4 not found: ID does not exist"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.220285 4756 scope.go:117] "RemoveContainer" containerID="68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.220925 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9"} err="failed to get container status \"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9\": rpc error: code = NotFound desc = could not find container \"68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9\": container with ID starting with 68594005f28a99dacd05f6422ad00a1038f47941fb15319f649137d6ed9aaed9 not found: ID does not exist"
Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.267213 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78q4h\" (UniqueName: \"kubernetes.io/projected/439a3487-2bc9-4f44-af4c-02d5445fb349-kube-api-access-78q4h\") pod \"439a3487-2bc9-4f44-af4c-02d5445fb349\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") "
\"439a3487-2bc9-4f44-af4c-02d5445fb349\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.267538 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-dns-svc\") pod \"439a3487-2bc9-4f44-af4c-02d5445fb349\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.267703 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-sb\") pod \"439a3487-2bc9-4f44-af4c-02d5445fb349\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.267789 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-nb\") pod \"439a3487-2bc9-4f44-af4c-02d5445fb349\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.267971 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-config\") pod \"439a3487-2bc9-4f44-af4c-02d5445fb349\" (UID: \"439a3487-2bc9-4f44-af4c-02d5445fb349\") " Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.270968 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439a3487-2bc9-4f44-af4c-02d5445fb349-kube-api-access-78q4h" (OuterVolumeSpecName: "kube-api-access-78q4h") pod "439a3487-2bc9-4f44-af4c-02d5445fb349" (UID: "439a3487-2bc9-4f44-af4c-02d5445fb349"). InnerVolumeSpecName "kube-api-access-78q4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.327812 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "439a3487-2bc9-4f44-af4c-02d5445fb349" (UID: "439a3487-2bc9-4f44-af4c-02d5445fb349"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.331160 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "439a3487-2bc9-4f44-af4c-02d5445fb349" (UID: "439a3487-2bc9-4f44-af4c-02d5445fb349"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.333291 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-config" (OuterVolumeSpecName: "config") pod "439a3487-2bc9-4f44-af4c-02d5445fb349" (UID: "439a3487-2bc9-4f44-af4c-02d5445fb349"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.343503 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "439a3487-2bc9-4f44-af4c-02d5445fb349" (UID: "439a3487-2bc9-4f44-af4c-02d5445fb349"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.370579 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.370710 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.370771 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.370824 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78q4h\" (UniqueName: \"kubernetes.io/projected/439a3487-2bc9-4f44-af4c-02d5445fb349-kube-api-access-78q4h\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.370877 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439a3487-2bc9-4f44-af4c-02d5445fb349-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.463623 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.470704 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.486699 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.506522 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:55 crc kubenswrapper[4756]: E1205 17:55:55.507009 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerName="dnsmasq-dns" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.507030 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerName="dnsmasq-dns" Dec 05 17:55:55 crc kubenswrapper[4756]: E1205 17:55:55.507048 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerName="nova-metadata-log" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.507057 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerName="nova-metadata-log" Dec 05 17:55:55 crc kubenswrapper[4756]: E1205 17:55:55.507099 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerName="nova-metadata-metadata" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.507108 4756 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerName="nova-metadata-metadata" Dec 05 17:55:55 crc kubenswrapper[4756]: E1205 17:55:55.507127 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerName="init" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.507136 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerName="init" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.507348 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerName="nova-metadata-metadata" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.507367 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" containerName="nova-metadata-log" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.507392 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerName="dnsmasq-dns" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.508627 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.512729 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.512944 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.544773 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.681014 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.681098 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-logs\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.681215 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-config-data\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.681290 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.681330 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg2l4\" (UniqueName: 
\"kubernetes.io/projected/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-kube-api-access-hg2l4\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.783545 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.783621 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-logs\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.783703 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-config-data\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.783759 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.783791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg2l4\" (UniqueName: \"kubernetes.io/projected/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-kube-api-access-hg2l4\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.784346 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-logs\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.788223 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-config-data\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.789609 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.792182 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.812017 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hg2l4\" (UniqueName: \"kubernetes.io/projected/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-kube-api-access-hg2l4\") pod \"nova-metadata-0\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " pod="openstack/nova-metadata-0" Dec 05 17:55:55 crc kubenswrapper[4756]: I1205 17:55:55.833813 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:55:56 crc kubenswrapper[4756]: I1205 17:55:56.062703 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab7a1d9-b1fe-46ee-8e95-e6b138c03205" path="/var/lib/kubelet/pods/dab7a1d9-b1fe-46ee-8e95-e6b138c03205/volumes" Dec 05 17:55:56 crc kubenswrapper[4756]: I1205 17:55:56.149096 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" Dec 05 17:55:56 crc kubenswrapper[4756]: I1205 17:55:56.149086 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" event={"ID":"439a3487-2bc9-4f44-af4c-02d5445fb349","Type":"ContainerDied","Data":"10620cc0c2cd0a7122a45666121879a40623b99c6fd865afe7fa111dc2bb2cb0"} Dec 05 17:55:56 crc kubenswrapper[4756]: I1205 17:55:56.149309 4756 scope.go:117] "RemoveContainer" containerID="c67171f92813c91e25732d8318222028189777f18d2d30b866fd558ec69ed23c" Dec 05 17:55:56 crc kubenswrapper[4756]: I1205 17:55:56.177596 4756 scope.go:117] "RemoveContainer" containerID="286ceffd56b52175db1dd8deb168ac10e346939f0c2395b37208c54d884ad7e1" Dec 05 17:55:56 crc kubenswrapper[4756]: I1205 17:55:56.181497 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66bbb84ccc-jsvsh"] Dec 05 17:55:56 crc kubenswrapper[4756]: I1205 17:55:56.194070 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66bbb84ccc-jsvsh"] Dec 05 17:55:56 crc kubenswrapper[4756]: I1205 17:55:56.363027 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:55:57 crc kubenswrapper[4756]: I1205 17:55:57.173921 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d","Type":"ContainerStarted","Data":"48e6218aaf647256dd474e8038db8b40fd79bf71a44b644b298e94a9e79992ce"} Dec 05 17:55:57 crc kubenswrapper[4756]: I1205 17:55:57.174213 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d","Type":"ContainerStarted","Data":"964b87217141f7b3648321f485ef6b05b961a0fb16bbfd29e669988b230b2c51"} Dec 05 17:55:57 crc kubenswrapper[4756]: I1205 17:55:57.174226 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d","Type":"ContainerStarted","Data":"b3f1810c0a1d634c04f17cab182e7963c3514fa784ff43d5ccdaad30fd8f7aae"} Dec 05 17:55:58 crc kubenswrapper[4756]: I1205 17:55:58.069326 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439a3487-2bc9-4f44-af4c-02d5445fb349" path="/var/lib/kubelet/pods/439a3487-2bc9-4f44-af4c-02d5445fb349/volumes" Dec 05 17:56:00 crc kubenswrapper[4756]: I1205 17:56:00.142313 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-66bbb84ccc-jsvsh" podUID="439a3487-2bc9-4f44-af4c-02d5445fb349" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.68:5353: i/o timeout" Dec 05 17:56:00 crc kubenswrapper[4756]: I1205 17:56:00.459832 4756 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:56:00 crc kubenswrapper[4756]: I1205 17:56:00.497648 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:56:00 crc kubenswrapper[4756]: I1205 17:56:00.538354 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=5.538327304 podStartE2EDuration="5.538327304s" podCreationTimestamp="2025-12-05 17:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:55:57.20336285 +0000 UTC m=+6017.442036372" watchObservedRunningTime="2025-12-05 17:56:00.538327304 +0000 UTC m=+6020.777000856" Dec 05 17:56:00 crc kubenswrapper[4756]: I1205 17:56:00.834747 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 17:56:00 crc kubenswrapper[4756]: I1205 17:56:00.834815 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 17:56:01 crc kubenswrapper[4756]: I1205 17:56:01.261189 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 17:56:01 crc kubenswrapper[4756]: I1205 17:56:01.485828 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 17:56:01 crc kubenswrapper[4756]: I1205 17:56:01.965464 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-84g29"] Dec 05 17:56:01 crc kubenswrapper[4756]: I1205 17:56:01.966601 4756 util.go:30] "No sandbox for pod can be found. 
Dec 05 17:56:01 crc kubenswrapper[4756]: I1205 17:56:01.969790 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Dec 05 17:56:01 crc kubenswrapper[4756]: I1205 17:56:01.969987 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Dec 05 17:56:01 crc kubenswrapper[4756]: I1205 17:56:01.993070 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-84g29"]
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.140625 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-scripts\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.140705 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.141075 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-config-data\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.141257 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twxkt\" (UniqueName: \"kubernetes.io/projected/78450a2e-9807-4e4c-8575-75926a469387-kube-api-access-twxkt\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.243039 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-scripts\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.243134 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.243257 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-config-data\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.243572 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twxkt\" (UniqueName: \"kubernetes.io/projected/78450a2e-9807-4e4c-8575-75926a469387-kube-api-access-twxkt\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.249590 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-scripts\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.250062 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.250460 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-config-data\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.270399 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twxkt\" (UniqueName: \"kubernetes.io/projected/78450a2e-9807-4e4c-8575-75926a469387-kube-api-access-twxkt\") pod \"nova-cell1-cell-mapping-84g29\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.296643 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-84g29"
Dec 05 17:56:02 crc kubenswrapper[4756]: I1205 17:56:02.800394 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-84g29"]
Dec 05 17:56:02 crc kubenswrapper[4756]: W1205 17:56:02.801231 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78450a2e_9807_4e4c_8575_75926a469387.slice/crio-cc2ad25c6e8cb93983910f9a7c38a938077a05eabc31bb4ec36c11f01c84ea68 WatchSource:0}: Error finding container cc2ad25c6e8cb93983910f9a7c38a938077a05eabc31bb4ec36c11f01c84ea68: Status 404 returned error can't find the container with id cc2ad25c6e8cb93983910f9a7c38a938077a05eabc31bb4ec36c11f01c84ea68
Dec 05 17:56:03 crc kubenswrapper[4756]: I1205 17:56:03.049345 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a"
Dec 05 17:56:03 crc kubenswrapper[4756]: E1205 17:56:03.049889 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 17:56:03 crc kubenswrapper[4756]: I1205 17:56:03.257065 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-84g29" event={"ID":"78450a2e-9807-4e4c-8575-75926a469387","Type":"ContainerStarted","Data":"2145246b616e84f1d8b93e824c59619d1a9721be5fb28a199ede99666dd321ca"}
Dec 05 17:56:03 crc kubenswrapper[4756]: I1205 17:56:03.257123 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-84g29" event={"ID":"78450a2e-9807-4e4c-8575-75926a469387","Type":"ContainerStarted","Data":"cc2ad25c6e8cb93983910f9a7c38a938077a05eabc31bb4ec36c11f01c84ea68"}
Dec 05 17:56:03 crc kubenswrapper[4756]: I1205 17:56:03.308644 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-84g29" podStartSLOduration=2.308624524 podStartE2EDuration="2.308624524s" podCreationTimestamp="2025-12-05 17:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:56:03.274900396 +0000 UTC m=+6023.513573938" watchObservedRunningTime="2025-12-05 17:56:03.308624524 +0000 UTC m=+6023.547298036"
Dec 05 17:56:03 crc kubenswrapper[4756]: I1205 17:56:03.509809 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 05 17:56:03 crc kubenswrapper[4756]: I1205 17:56:03.510040 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 05 17:56:04 crc kubenswrapper[4756]: I1205 17:56:04.551962 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 17:56:04 crc kubenswrapper[4756]: I1205 17:56:04.552137 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 17:56:05 crc kubenswrapper[4756]: I1205 17:56:05.834856 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 05 17:56:05 crc kubenswrapper[4756]: I1205 17:56:05.834900 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 05 17:56:06 crc kubenswrapper[4756]: I1205 17:56:06.852580 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 05 17:56:06 crc kubenswrapper[4756]: I1205 17:56:06.852656 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 05 17:56:08 crc kubenswrapper[4756]: I1205 17:56:08.324867 4756 generic.go:334] "Generic (PLEG): container finished" podID="78450a2e-9807-4e4c-8575-75926a469387" containerID="2145246b616e84f1d8b93e824c59619d1a9721be5fb28a199ede99666dd321ca" exitCode=0
Dec 05 17:56:08 crc kubenswrapper[4756]: I1205 17:56:08.325447 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-84g29" event={"ID":"78450a2e-9807-4e4c-8575-75926a469387","Type":"ContainerDied","Data":"2145246b616e84f1d8b93e824c59619d1a9721be5fb28a199ede99666dd321ca"}
Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.718921 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-84g29"
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-84g29" Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.806281 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-scripts\") pod \"78450a2e-9807-4e4c-8575-75926a469387\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.806324 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-config-data\") pod \"78450a2e-9807-4e4c-8575-75926a469387\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.806530 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-combined-ca-bundle\") pod \"78450a2e-9807-4e4c-8575-75926a469387\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.806656 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twxkt\" (UniqueName: \"kubernetes.io/projected/78450a2e-9807-4e4c-8575-75926a469387-kube-api-access-twxkt\") pod \"78450a2e-9807-4e4c-8575-75926a469387\" (UID: \"78450a2e-9807-4e4c-8575-75926a469387\") " Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.815612 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-scripts" (OuterVolumeSpecName: "scripts") pod "78450a2e-9807-4e4c-8575-75926a469387" (UID: "78450a2e-9807-4e4c-8575-75926a469387"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.815646 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78450a2e-9807-4e4c-8575-75926a469387-kube-api-access-twxkt" (OuterVolumeSpecName: "kube-api-access-twxkt") pod "78450a2e-9807-4e4c-8575-75926a469387" (UID: "78450a2e-9807-4e4c-8575-75926a469387"). InnerVolumeSpecName "kube-api-access-twxkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.834903 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78450a2e-9807-4e4c-8575-75926a469387" (UID: "78450a2e-9807-4e4c-8575-75926a469387"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.836771 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-config-data" (OuterVolumeSpecName: "config-data") pod "78450a2e-9807-4e4c-8575-75926a469387" (UID: "78450a2e-9807-4e4c-8575-75926a469387"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.909534 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.909941 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.909962 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78450a2e-9807-4e4c-8575-75926a469387-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:09 crc kubenswrapper[4756]: I1205 17:56:09.909982 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twxkt\" (UniqueName: \"kubernetes.io/projected/78450a2e-9807-4e4c-8575-75926a469387-kube-api-access-twxkt\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.354127 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-84g29" event={"ID":"78450a2e-9807-4e4c-8575-75926a469387","Type":"ContainerDied","Data":"cc2ad25c6e8cb93983910f9a7c38a938077a05eabc31bb4ec36c11f01c84ea68"} Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.354518 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc2ad25c6e8cb93983910f9a7c38a938077a05eabc31bb4ec36c11f01c84ea68" Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.354207 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-84g29" Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.706321 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.706698 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-log" containerID="cri-o://8d42dd201cc5c0b28ffe5d0721a17988e76d3ee381803459e358f624256bf898" gracePeriod=30 Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.706794 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-api" containerID="cri-o://91fd11898f42a6a214a976dc66031dca8e1c9ce7698f6bf023d7a0b06b7a41f5" gracePeriod=30 Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.733740 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.734045 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-log" containerID="cri-o://964b87217141f7b3648321f485ef6b05b961a0fb16bbfd29e669988b230b2c51" gracePeriod=30 Dec 05 17:56:10 crc kubenswrapper[4756]: I1205 17:56:10.734200 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-metadata" containerID="cri-o://48e6218aaf647256dd474e8038db8b40fd79bf71a44b644b298e94a9e79992ce" gracePeriod=30 Dec 05 17:56:11 crc 
kubenswrapper[4756]: I1205 17:56:11.369870 4756 generic.go:334] "Generic (PLEG): container finished" podID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerID="964b87217141f7b3648321f485ef6b05b961a0fb16bbfd29e669988b230b2c51" exitCode=143 Dec 05 17:56:11 crc kubenswrapper[4756]: I1205 17:56:11.369995 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d","Type":"ContainerDied","Data":"964b87217141f7b3648321f485ef6b05b961a0fb16bbfd29e669988b230b2c51"} Dec 05 17:56:11 crc kubenswrapper[4756]: I1205 17:56:11.377135 4756 generic.go:334] "Generic (PLEG): container finished" podID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerID="8d42dd201cc5c0b28ffe5d0721a17988e76d3ee381803459e358f624256bf898" exitCode=143 Dec 05 17:56:11 crc kubenswrapper[4756]: I1205 17:56:11.377201 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a8480fe-0062-4c89-9f85-337058bdfc86","Type":"ContainerDied","Data":"8d42dd201cc5c0b28ffe5d0721a17988e76d3ee381803459e358f624256bf898"} Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.408276 4756 generic.go:334] "Generic (PLEG): container finished" podID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerID="48e6218aaf647256dd474e8038db8b40fd79bf71a44b644b298e94a9e79992ce" exitCode=0 Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.408575 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d","Type":"ContainerDied","Data":"48e6218aaf647256dd474e8038db8b40fd79bf71a44b644b298e94a9e79992ce"} Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.408788 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d","Type":"ContainerDied","Data":"b3f1810c0a1d634c04f17cab182e7963c3514fa784ff43d5ccdaad30fd8f7aae"} Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.408807 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3f1810c0a1d634c04f17cab182e7963c3514fa784ff43d5ccdaad30fd8f7aae" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.424843 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.510850 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-nova-metadata-tls-certs\") pod \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.510945 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-logs\") pod \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.511087 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg2l4\" (UniqueName: \"kubernetes.io/projected/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-kube-api-access-hg2l4\") pod \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.511111 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-config-data\") pod \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.511142 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-combined-ca-bundle\") pod \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\" (UID: \"13368e6b-26c6-47a3-b2ab-b9daa78c5a2d\") " Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.511489 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-logs" (OuterVolumeSpecName: "logs") pod "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" (UID: "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.511596 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.517851 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-kube-api-access-hg2l4" (OuterVolumeSpecName: "kube-api-access-hg2l4") pod "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" (UID: "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d"). InnerVolumeSpecName "kube-api-access-hg2l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.535637 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-config-data" (OuterVolumeSpecName: "config-data") pod "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" (UID: "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.571814 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" (UID: "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.582855 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" (UID: "13368e6b-26c6-47a3-b2ab-b9daa78c5a2d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.613372 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg2l4\" (UniqueName: \"kubernetes.io/projected/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-kube-api-access-hg2l4\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.613477 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.613495 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:14 crc kubenswrapper[4756]: I1205 17:56:14.613506 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.417221 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.458263 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.477486 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.494684 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:56:15 crc kubenswrapper[4756]: E1205 17:56:15.495384 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78450a2e-9807-4e4c-8575-75926a469387" containerName="nova-manage" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.495446 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="78450a2e-9807-4e4c-8575-75926a469387" containerName="nova-manage" Dec 05 17:56:15 crc kubenswrapper[4756]: E1205 17:56:15.495483 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-log" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.495498 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-log" Dec 05 17:56:15 crc kubenswrapper[4756]: E1205 17:56:15.495543 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-metadata" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.495560 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-metadata" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.495916 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-metadata" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.495972 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" containerName="nova-metadata-log" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.496014 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="78450a2e-9807-4e4c-8575-75926a469387" containerName="nova-manage" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.497839 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.504706 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.505850 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.506155 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.635644 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-config-data\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.635740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.635775 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-logs\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.635803 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.635861 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xr8g\" (UniqueName: \"kubernetes.io/projected/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-kube-api-access-7xr8g\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.738787 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-config-data\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.738969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.739034 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-logs\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc 
kubenswrapper[4756]: I1205 17:56:15.739284 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.739337 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xr8g\" (UniqueName: \"kubernetes.io/projected/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-kube-api-access-7xr8g\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.740359 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-logs\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.747132 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.748165 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-config-data\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.750356 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.761967 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xr8g\" (UniqueName: \"kubernetes.io/projected/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-kube-api-access-7xr8g\") pod \"nova-metadata-0\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " pod="openstack/nova-metadata-0" Dec 05 17:56:15 crc kubenswrapper[4756]: I1205 17:56:15.864345 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 17:56:16 crc kubenswrapper[4756]: I1205 17:56:16.076591 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13368e6b-26c6-47a3-b2ab-b9daa78c5a2d" path="/var/lib/kubelet/pods/13368e6b-26c6-47a3-b2ab-b9daa78c5a2d/volumes" Dec 05 17:56:16 crc kubenswrapper[4756]: I1205 17:56:16.366488 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 17:56:16 crc kubenswrapper[4756]: I1205 17:56:16.432186 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d2b6ac9-0cdf-4fed-836e-918827fd0d90","Type":"ContainerStarted","Data":"b26bb72b3061c8d159373749196de355bac13480f901e13b622eeed5b5643890"} Dec 05 17:56:17 crc kubenswrapper[4756]: I1205 17:56:17.050305 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:56:17 crc kubenswrapper[4756]: E1205 17:56:17.051130 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:56:17 crc kubenswrapper[4756]: I1205 17:56:17.443826 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d2b6ac9-0cdf-4fed-836e-918827fd0d90","Type":"ContainerStarted","Data":"4d1fcf3379d75e77c4561d70e283c1ed0d5037a8f285a1af30bb896cacbe3713"} Dec 05 17:56:17 crc kubenswrapper[4756]: I1205 17:56:17.443890 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d2b6ac9-0cdf-4fed-836e-918827fd0d90","Type":"ContainerStarted","Data":"d76bef93df02b6312bb583f8ecb763cce117bdc6e900ecbb8aeec0283f903e37"} Dec 05 17:56:17 crc kubenswrapper[4756]: I1205 17:56:17.476740 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.476722222 podStartE2EDuration="2.476722222s" podCreationTimestamp="2025-12-05 17:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:56:17.465892924 +0000 UTC m=+6037.704566446" watchObservedRunningTime="2025-12-05 17:56:17.476722222 +0000 UTC m=+6037.715395744" Dec 05 17:56:20 crc kubenswrapper[4756]: I1205 17:56:20.865319 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 17:56:20 crc kubenswrapper[4756]: I1205 17:56:20.865999 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.160747 4756 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode3aeab9a-21a1-4186-856a-a04db31f11e7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode3aeab9a-21a1-4186-856a-a04db31f11e7] : Timed out while waiting for systemd to remove kubepods-besteffort-pode3aeab9a_21a1_4186_856a_a04db31f11e7.slice" Dec 05 17:56:22 crc kubenswrapper[4756]: E1205 17:56:22.161118 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort 
pode3aeab9a-21a1-4186-856a-a04db31f11e7] : unable to destroy cgroup paths for cgroup [kubepods besteffort pode3aeab9a-21a1-4186-856a-a04db31f11e7] : Timed out while waiting for systemd to remove kubepods-besteffort-pode3aeab9a_21a1_4186_856a_a04db31f11e7.slice" pod="openstack/nova-cell0-cell-mapping-cqrvr" podUID="e3aeab9a-21a1-4186-856a-a04db31f11e7" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.511305 4756 generic.go:334] "Generic (PLEG): container finished" podID="68ced2e8-8358-4d3d-97cc-3037d96b0efe" containerID="f3683aa1f216ea13bcc96af51848d8f7fae2586d8cbdc70542720cd6f7e57ff0" exitCode=137 Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.511672 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cqrvr" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.511405 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"68ced2e8-8358-4d3d-97cc-3037d96b0efe","Type":"ContainerDied","Data":"f3683aa1f216ea13bcc96af51848d8f7fae2586d8cbdc70542720cd6f7e57ff0"} Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.610145 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.802773 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-config-data\") pod \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.802907 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-combined-ca-bundle\") pod \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.802962 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-254z9\" (UniqueName: \"kubernetes.io/projected/68ced2e8-8358-4d3d-97cc-3037d96b0efe-kube-api-access-254z9\") pod \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\" (UID: \"68ced2e8-8358-4d3d-97cc-3037d96b0efe\") " Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.810788 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68ced2e8-8358-4d3d-97cc-3037d96b0efe-kube-api-access-254z9" (OuterVolumeSpecName: "kube-api-access-254z9") pod "68ced2e8-8358-4d3d-97cc-3037d96b0efe" (UID: "68ced2e8-8358-4d3d-97cc-3037d96b0efe"). InnerVolumeSpecName "kube-api-access-254z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.833561 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-config-data" (OuterVolumeSpecName: "config-data") pod "68ced2e8-8358-4d3d-97cc-3037d96b0efe" (UID: "68ced2e8-8358-4d3d-97cc-3037d96b0efe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.839236 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68ced2e8-8358-4d3d-97cc-3037d96b0efe" (UID: "68ced2e8-8358-4d3d-97cc-3037d96b0efe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.905436 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.905480 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-254z9\" (UniqueName: \"kubernetes.io/projected/68ced2e8-8358-4d3d-97cc-3037d96b0efe-kube-api-access-254z9\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:22 crc kubenswrapper[4756]: I1205 17:56:22.905496 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ced2e8-8358-4d3d-97cc-3037d96b0efe-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.509858 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.510612 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.525526 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"68ced2e8-8358-4d3d-97cc-3037d96b0efe","Type":"ContainerDied","Data":"82f07a023ffcf2edf48726cde6b38e1337cc45f2da4bd3a15bd20bc4a2b0eb3e"} Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.525613 4756 scope.go:117] "RemoveContainer" containerID="f3683aa1f216ea13bcc96af51848d8f7fae2586d8cbdc70542720cd6f7e57ff0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.525626 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.584543 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.605058 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.623395 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:56:23 crc kubenswrapper[4756]: E1205 17:56:23.623854 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ced2e8-8358-4d3d-97cc-3037d96b0efe" containerName="nova-scheduler-scheduler" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.623875 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ced2e8-8358-4d3d-97cc-3037d96b0efe" containerName="nova-scheduler-scheduler" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.624072 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ced2e8-8358-4d3d-97cc-3037d96b0efe" containerName="nova-scheduler-scheduler" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.624799 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.627735 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.632760 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.722551 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxv7z\" (UniqueName: \"kubernetes.io/projected/a5bd65a0-3765-47ba-bde4-73f8429a2219-kube-api-access-fxv7z\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.722645 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-config-data\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.722753 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.824874 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-config-data\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.825023 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.825224 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxv7z\" (UniqueName: \"kubernetes.io/projected/a5bd65a0-3765-47ba-bde4-73f8429a2219-kube-api-access-fxv7z\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.831466 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.832893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-config-data\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.857588 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxv7z\" (UniqueName: 
\"kubernetes.io/projected/a5bd65a0-3765-47ba-bde4-73f8429a2219-kube-api-access-fxv7z\") pod \"nova-scheduler-0\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " pod="openstack/nova-scheduler-0" Dec 05 17:56:23 crc kubenswrapper[4756]: I1205 17:56:23.953690 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.097362 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68ced2e8-8358-4d3d-97cc-3037d96b0efe" path="/var/lib/kubelet/pods/68ced2e8-8358-4d3d-97cc-3037d96b0efe/volumes" Dec 05 17:56:24 crc kubenswrapper[4756]: W1205 17:56:24.481094 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5bd65a0_3765_47ba_bde4_73f8429a2219.slice/crio-21d06e9eed0b322bd18a009f159c865c967bd859c549993e86cd696488186904 WatchSource:0}: Error finding container 21d06e9eed0b322bd18a009f159c865c967bd859c549993e86cd696488186904: Status 404 returned error can't find the container with id 21d06e9eed0b322bd18a009f159c865c967bd859c549993e86cd696488186904 Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.482789 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.537088 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5bd65a0-3765-47ba-bde4-73f8429a2219","Type":"ContainerStarted","Data":"21d06e9eed0b322bd18a009f159c865c967bd859c549993e86cd696488186904"} Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.541962 4756 generic.go:334] "Generic (PLEG): container finished" podID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerID="91fd11898f42a6a214a976dc66031dca8e1c9ce7698f6bf023d7a0b06b7a41f5" exitCode=0 Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.542024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a8480fe-0062-4c89-9f85-337058bdfc86","Type":"ContainerDied","Data":"91fd11898f42a6a214a976dc66031dca8e1c9ce7698f6bf023d7a0b06b7a41f5"} Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.647331 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.743972 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8480fe-0062-4c89-9f85-337058bdfc86-logs\") pod \"7a8480fe-0062-4c89-9f85-337058bdfc86\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.744065 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-combined-ca-bundle\") pod \"7a8480fe-0062-4c89-9f85-337058bdfc86\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.744152 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-config-data\") pod \"7a8480fe-0062-4c89-9f85-337058bdfc86\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.744243 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfmj7\" (UniqueName: \"kubernetes.io/projected/7a8480fe-0062-4c89-9f85-337058bdfc86-kube-api-access-xfmj7\") pod \"7a8480fe-0062-4c89-9f85-337058bdfc86\" (UID: \"7a8480fe-0062-4c89-9f85-337058bdfc86\") " Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.745026 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8480fe-0062-4c89-9f85-337058bdfc86-logs" (OuterVolumeSpecName: "logs") pod "7a8480fe-0062-4c89-9f85-337058bdfc86" (UID: "7a8480fe-0062-4c89-9f85-337058bdfc86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.753049 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8480fe-0062-4c89-9f85-337058bdfc86-kube-api-access-xfmj7" (OuterVolumeSpecName: "kube-api-access-xfmj7") pod "7a8480fe-0062-4c89-9f85-337058bdfc86" (UID: "7a8480fe-0062-4c89-9f85-337058bdfc86"). InnerVolumeSpecName "kube-api-access-xfmj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.774248 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a8480fe-0062-4c89-9f85-337058bdfc86" (UID: "7a8480fe-0062-4c89-9f85-337058bdfc86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.798509 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-config-data" (OuterVolumeSpecName: "config-data") pod "7a8480fe-0062-4c89-9f85-337058bdfc86" (UID: "7a8480fe-0062-4c89-9f85-337058bdfc86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.846889 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.846935 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8480fe-0062-4c89-9f85-337058bdfc86-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.846954 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfmj7\" (UniqueName: \"kubernetes.io/projected/7a8480fe-0062-4c89-9f85-337058bdfc86-kube-api-access-xfmj7\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:24 crc kubenswrapper[4756]: I1205 17:56:24.846972 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8480fe-0062-4c89-9f85-337058bdfc86-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.556354 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a8480fe-0062-4c89-9f85-337058bdfc86","Type":"ContainerDied","Data":"0c5e8299c7b9a3937110089a4f8dfbf3f8bab78882d33227aed922a1262bf3e7"} Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.556801 4756 scope.go:117] "RemoveContainer" containerID="91fd11898f42a6a214a976dc66031dca8e1c9ce7698f6bf023d7a0b06b7a41f5" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.556439 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.562390 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5bd65a0-3765-47ba-bde4-73f8429a2219","Type":"ContainerStarted","Data":"527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a"} Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.586466 4756 scope.go:117] "RemoveContainer" containerID="8d42dd201cc5c0b28ffe5d0721a17988e76d3ee381803459e358f624256bf898" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.593356 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.593320618 podStartE2EDuration="2.593320618s" podCreationTimestamp="2025-12-05 17:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:56:25.590323888 +0000 UTC m=+6045.828997440" watchObservedRunningTime="2025-12-05 17:56:25.593320618 +0000 UTC m=+6045.831994200" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.625681 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.653198 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.667326 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:25 crc kubenswrapper[4756]: E1205 17:56:25.667989 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-log" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.668016 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-log" Dec 05 17:56:25 crc kubenswrapper[4756]: E1205 17:56:25.668048 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-api" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.668059 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-api" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.668355 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-log" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.668375 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" containerName="nova-api-api" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.671738 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.678657 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.679040 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.767711 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6074b41-0985-409f-aeb3-fc76aedf6ed4-logs\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.767765 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.767809 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-config-data\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.768213 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j54j\" (UniqueName: \"kubernetes.io/projected/d6074b41-0985-409f-aeb3-fc76aedf6ed4-kube-api-access-6j54j\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.865470 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.865540 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.870293 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6074b41-0985-409f-aeb3-fc76aedf6ed4-logs\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 
17:56:25.870353 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.870396 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-config-data\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.870476 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j54j\" (UniqueName: \"kubernetes.io/projected/d6074b41-0985-409f-aeb3-fc76aedf6ed4-kube-api-access-6j54j\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.871041 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6074b41-0985-409f-aeb3-fc76aedf6ed4-logs\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.878075 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-config-data\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.883019 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:25 crc kubenswrapper[4756]: I1205 17:56:25.887899 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j54j\" (UniqueName: \"kubernetes.io/projected/d6074b41-0985-409f-aeb3-fc76aedf6ed4-kube-api-access-6j54j\") pod \"nova-api-0\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " pod="openstack/nova-api-0" Dec 05 17:56:26 crc kubenswrapper[4756]: I1205 17:56:26.013156 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:56:26 crc kubenswrapper[4756]: I1205 17:56:26.064118 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8480fe-0062-4c89-9f85-337058bdfc86" path="/var/lib/kubelet/pods/7a8480fe-0062-4c89-9f85-337058bdfc86/volumes" Dec 05 17:56:26 crc kubenswrapper[4756]: I1205 17:56:26.489569 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:26 crc kubenswrapper[4756]: I1205 17:56:26.575032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6074b41-0985-409f-aeb3-fc76aedf6ed4","Type":"ContainerStarted","Data":"ef14daff22a6082273ccad894515cd48966c3b7d75eb26759174c901b7e6f3a9"} Dec 05 17:56:26 crc kubenswrapper[4756]: I1205 17:56:26.877542 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.91:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 17:56:26 crc kubenswrapper[4756]: I1205 17:56:26.877591 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.91:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 17:56:27 crc kubenswrapper[4756]: I1205 17:56:27.588130 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6074b41-0985-409f-aeb3-fc76aedf6ed4","Type":"ContainerStarted","Data":"1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e"} Dec 05 17:56:27 crc kubenswrapper[4756]: I1205 17:56:27.588584 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6074b41-0985-409f-aeb3-fc76aedf6ed4","Type":"ContainerStarted","Data":"e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4"} Dec 05 17:56:27 crc kubenswrapper[4756]: I1205 17:56:27.627194 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.627157349 podStartE2EDuration="2.627157349s" podCreationTimestamp="2025-12-05 17:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:56:27.612272882 +0000 UTC m=+6047.850946434" watchObservedRunningTime="2025-12-05 17:56:27.627157349 +0000 UTC m=+6047.865830891" Dec 05 17:56:28 crc kubenswrapper[4756]: I1205 17:56:28.954331 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 17:56:32 crc kubenswrapper[4756]: I1205 17:56:32.065693 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:56:32 crc kubenswrapper[4756]: E1205 17:56:32.067005 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:56:33 crc kubenswrapper[4756]: I1205 17:56:33.954571 4756 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 17:56:34 crc kubenswrapper[4756]: I1205 17:56:34.007945 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 17:56:34 crc kubenswrapper[4756]: I1205 17:56:34.708702 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 17:56:35 crc kubenswrapper[4756]: I1205 17:56:35.873374 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 17:56:35 crc kubenswrapper[4756]: I1205 17:56:35.876576 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 17:56:35 crc kubenswrapper[4756]: I1205 17:56:35.881500 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 17:56:36 crc kubenswrapper[4756]: I1205 17:56:36.014338 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 17:56:36 crc kubenswrapper[4756]: I1205 17:56:36.014399 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 17:56:36 crc kubenswrapper[4756]: I1205 17:56:36.702262 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 17:56:37 crc kubenswrapper[4756]: I1205 17:56:37.096938 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.93:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 17:56:37 crc kubenswrapper[4756]: I1205 17:56:37.097677 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.93:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 17:56:44 crc kubenswrapper[4756]: I1205 17:56:44.050407 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:56:44 crc kubenswrapper[4756]: E1205 17:56:44.051716 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.019957 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.020136 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.020832 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.020901 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.025063 
4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.027105 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.296684 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b6f4996c9-dtntv"] Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.301247 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.331164 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6f4996c9-dtntv"] Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.454553 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.454633 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-config\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.454657 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-dns-svc\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.454682 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7fp\" (UniqueName: \"kubernetes.io/projected/d3184588-cae4-4f59-9cbc-fa85211c15ef-kube-api-access-rd7fp\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.455153 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.557538 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.557643 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " 
pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.557701 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-config\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.557725 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-dns-svc\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.557765 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd7fp\" (UniqueName: \"kubernetes.io/projected/d3184588-cae4-4f59-9cbc-fa85211c15ef-kube-api-access-rd7fp\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.559471 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.559582 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.559675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-config\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.559702 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-dns-svc\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.587614 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7fp\" (UniqueName: \"kubernetes.io/projected/d3184588-cae4-4f59-9cbc-fa85211c15ef-kube-api-access-rd7fp\") pod \"dnsmasq-dns-6b6f4996c9-dtntv\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:46 crc kubenswrapper[4756]: I1205 17:56:46.628554 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:47 crc kubenswrapper[4756]: I1205 17:56:47.120293 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6f4996c9-dtntv"] Dec 05 17:56:47 crc kubenswrapper[4756]: W1205 17:56:47.123459 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3184588_cae4_4f59_9cbc_fa85211c15ef.slice/crio-d5f0ebfac33def1a52a3aed0876ea6f59c0a0e149b6d87f57e82aaa582014cc8 WatchSource:0}: Error finding container d5f0ebfac33def1a52a3aed0876ea6f59c0a0e149b6d87f57e82aaa582014cc8: Status 404 returned error can't find the container with id d5f0ebfac33def1a52a3aed0876ea6f59c0a0e149b6d87f57e82aaa582014cc8 Dec 05 17:56:47 crc kubenswrapper[4756]: I1205 17:56:47.859280 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" event={"ID":"d3184588-cae4-4f59-9cbc-fa85211c15ef","Type":"ContainerStarted","Data":"d5f0ebfac33def1a52a3aed0876ea6f59c0a0e149b6d87f57e82aaa582014cc8"} Dec 05 17:56:48 crc kubenswrapper[4756]: I1205 17:56:48.889018 4756 generic.go:334] "Generic (PLEG): container finished" podID="d3184588-cae4-4f59-9cbc-fa85211c15ef" containerID="a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7" exitCode=0 Dec 05 17:56:48 crc kubenswrapper[4756]: I1205 17:56:48.889396 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" event={"ID":"d3184588-cae4-4f59-9cbc-fa85211c15ef","Type":"ContainerDied","Data":"a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7"} Dec 05 17:56:48 crc kubenswrapper[4756]: I1205 17:56:48.967240 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:48 crc kubenswrapper[4756]: I1205 17:56:48.967465 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-log" containerID="cri-o://e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4" gracePeriod=30 Dec 05 17:56:48 crc kubenswrapper[4756]: I1205 17:56:48.967589 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-api" containerID="cri-o://1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e" gracePeriod=30 Dec 05 17:56:49 crc kubenswrapper[4756]: I1205 17:56:49.902220 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" event={"ID":"d3184588-cae4-4f59-9cbc-fa85211c15ef","Type":"ContainerStarted","Data":"537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d"} Dec 05 17:56:49 crc kubenswrapper[4756]: I1205 17:56:49.902564 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:49 crc kubenswrapper[4756]: I1205 17:56:49.908401 4756 generic.go:334] "Generic (PLEG): container finished" podID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerID="e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4" exitCode=143 Dec 05 17:56:49 crc kubenswrapper[4756]: I1205 17:56:49.908478 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6074b41-0985-409f-aeb3-fc76aedf6ed4","Type":"ContainerDied","Data":"e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4"} Dec 05 17:56:49 crc 
kubenswrapper[4756]: I1205 17:56:49.922686 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" podStartSLOduration=3.922669791 podStartE2EDuration="3.922669791s" podCreationTimestamp="2025-12-05 17:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:56:49.920205615 +0000 UTC m=+6070.158879147" watchObservedRunningTime="2025-12-05 17:56:49.922669791 +0000 UTC m=+6070.161343303" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.545987 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.710757 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j54j\" (UniqueName: \"kubernetes.io/projected/d6074b41-0985-409f-aeb3-fc76aedf6ed4-kube-api-access-6j54j\") pod \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.711362 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-combined-ca-bundle\") pod \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.711537 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-config-data\") pod \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.711636 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6074b41-0985-409f-aeb3-fc76aedf6ed4-logs\") pod \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\" (UID: \"d6074b41-0985-409f-aeb3-fc76aedf6ed4\") " Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.712967 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6074b41-0985-409f-aeb3-fc76aedf6ed4-logs" (OuterVolumeSpecName: "logs") pod "d6074b41-0985-409f-aeb3-fc76aedf6ed4" (UID: "d6074b41-0985-409f-aeb3-fc76aedf6ed4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.717773 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6074b41-0985-409f-aeb3-fc76aedf6ed4-kube-api-access-6j54j" (OuterVolumeSpecName: "kube-api-access-6j54j") pod "d6074b41-0985-409f-aeb3-fc76aedf6ed4" (UID: "d6074b41-0985-409f-aeb3-fc76aedf6ed4"). InnerVolumeSpecName "kube-api-access-6j54j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.748953 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6074b41-0985-409f-aeb3-fc76aedf6ed4" (UID: "d6074b41-0985-409f-aeb3-fc76aedf6ed4"). InnerVolumeSpecName "combined-ca-bundle". 
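The podStartSLOduration figure above is straightforward arithmetic: watchObservedRunningTime minus podCreationTimestamp (the pulling timestamps are the zero value here because the images were already present on the node). Checking the dnsmasq numbers:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	created, _ := time.Parse(time.RFC3339Nano, "2025-12-05T17:56:46Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2025-12-05T17:56:49.922669791Z")
	fmt.Println(observed.Sub(created)) // 3.922669791s == podStartSLOduration
}
```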
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.753680 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-config-data" (OuterVolumeSpecName: "config-data") pod "d6074b41-0985-409f-aeb3-fc76aedf6ed4" (UID: "d6074b41-0985-409f-aeb3-fc76aedf6ed4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.814358 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j54j\" (UniqueName: \"kubernetes.io/projected/d6074b41-0985-409f-aeb3-fc76aedf6ed4-kube-api-access-6j54j\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.814398 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.814413 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6074b41-0985-409f-aeb3-fc76aedf6ed4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.814446 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6074b41-0985-409f-aeb3-fc76aedf6ed4-logs\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.939600 4756 generic.go:334] "Generic (PLEG): container finished" podID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerID="1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e" exitCode=0 Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.939667 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6074b41-0985-409f-aeb3-fc76aedf6ed4","Type":"ContainerDied","Data":"1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e"} Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.939736 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6074b41-0985-409f-aeb3-fc76aedf6ed4","Type":"ContainerDied","Data":"ef14daff22a6082273ccad894515cd48966c3b7d75eb26759174c901b7e6f3a9"} Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.939762 4756 scope.go:117] "RemoveContainer" containerID="1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.939779 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.961618 4756 scope.go:117] "RemoveContainer" containerID="e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.988190 4756 scope.go:117] "RemoveContainer" containerID="1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e" Dec 05 17:56:52 crc kubenswrapper[4756]: E1205 17:56:52.988695 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e\": container with ID starting with 1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e not found: ID does not exist" containerID="1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.988737 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e"} err="failed to get container status \"1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e\": rpc error: code = NotFound desc = could not find container \"1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e\": container with ID starting with 1b40fe8aada2092b723559910de1b0f3896f590a62f5918a30bc93632dfced6e not found: ID does not exist" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.988763 4756 scope.go:117] "RemoveContainer" containerID="e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4" Dec 05 17:56:52 crc kubenswrapper[4756]: E1205 17:56:52.993406 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4\": container with ID starting with e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4 not found: ID does not exist" containerID="e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4" Dec 05 17:56:52 crc kubenswrapper[4756]: I1205 17:56:52.993479 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4"} err="failed to get container status \"e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4\": rpc error: code = NotFound desc = could not find container \"e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4\": container with ID starting with e5fc21872ce095bdea74108d9f9b7515e3c57d22177032a58546e8db1439c5d4 not found: ID does not exist" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.002186 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.013708 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.023917 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:53 crc kubenswrapper[4756]: E1205 17:56:53.024352 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-log" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.024371 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-log" Dec 05 17:56:53 crc 
kubenswrapper[4756]: E1205 17:56:53.024394 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-api" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.024400 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-api" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.024594 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-log" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.024618 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" containerName="nova-api-api" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.025792 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.034534 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.035647 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.035879 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.036105 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.226456 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-config-data\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.226525 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.226671 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.226724 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4838409d-0cb6-459b-87d3-525ec6da0fca-logs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.226757 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tflj\" (UniqueName: \"kubernetes.io/projected/4838409d-0cb6-459b-87d3-525ec6da0fca-kube-api-access-6tflj\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.226781 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-public-tls-certs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.328297 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-config-data\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.328599 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.328806 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.328959 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4838409d-0cb6-459b-87d3-525ec6da0fca-logs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.329059 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tflj\" (UniqueName: \"kubernetes.io/projected/4838409d-0cb6-459b-87d3-525ec6da0fca-kube-api-access-6tflj\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.329162 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-public-tls-certs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.329447 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4838409d-0cb6-459b-87d3-525ec6da0fca-logs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.333182 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-public-tls-certs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.335386 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-config-data\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.336202 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.336969 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.354231 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tflj\" (UniqueName: \"kubernetes.io/projected/4838409d-0cb6-459b-87d3-525ec6da0fca-kube-api-access-6tflj\") pod \"nova-api-0\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.368232 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.873000 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 17:56:53 crc kubenswrapper[4756]: W1205 17:56:53.875654 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4838409d_0cb6_459b_87d3_525ec6da0fca.slice/crio-0d3b31135b5bd4fcdf2f8a665c4d1798dcf9ba63d89f922642d7d3c52c8753b1 WatchSource:0}: Error finding container 0d3b31135b5bd4fcdf2f8a665c4d1798dcf9ba63d89f922642d7d3c52c8753b1: Status 404 returned error can't find the container with id 0d3b31135b5bd4fcdf2f8a665c4d1798dcf9ba63d89f922642d7d3c52c8753b1 Dec 05 17:56:53 crc kubenswrapper[4756]: I1205 17:56:53.959522 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4838409d-0cb6-459b-87d3-525ec6da0fca","Type":"ContainerStarted","Data":"0d3b31135b5bd4fcdf2f8a665c4d1798dcf9ba63d89f922642d7d3c52c8753b1"} Dec 05 17:56:54 crc kubenswrapper[4756]: I1205 17:56:54.070523 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6074b41-0985-409f-aeb3-fc76aedf6ed4" path="/var/lib/kubelet/pods/d6074b41-0985-409f-aeb3-fc76aedf6ed4/volumes" Dec 05 17:56:54 crc kubenswrapper[4756]: I1205 17:56:54.975505 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4838409d-0cb6-459b-87d3-525ec6da0fca","Type":"ContainerStarted","Data":"1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9"} Dec 05 17:56:54 crc kubenswrapper[4756]: I1205 17:56:54.975542 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4838409d-0cb6-459b-87d3-525ec6da0fca","Type":"ContainerStarted","Data":"c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f"} Dec 05 17:56:55 crc kubenswrapper[4756]: I1205 17:56:55.000126 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.000088634 podStartE2EDuration="2.000088634s" podCreationTimestamp="2025-12-05 17:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:56:54.99317336 +0000 UTC m=+6075.231846912" watchObservedRunningTime="2025-12-05 17:56:55.000088634 +0000 UTC m=+6075.238762216" Dec 05 17:56:55 crc kubenswrapper[4756]: I1205 17:56:55.049750 4756 scope.go:117] 
"RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:56:55 crc kubenswrapper[4756]: E1205 17:56:55.050374 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:56:56 crc kubenswrapper[4756]: I1205 17:56:56.630787 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 17:56:56 crc kubenswrapper[4756]: I1205 17:56:56.706220 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c55b5567-zxk59"] Dec 05 17:56:56 crc kubenswrapper[4756]: I1205 17:56:56.706460 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" podUID="7f523eda-94d0-4016-b592-009948e19b77" containerName="dnsmasq-dns" containerID="cri-o://ffa6c89670f6d6c6ae2963cffe10e75b71dbdfa7288f0941bda2c003ce36fbdb" gracePeriod=10 Dec 05 17:56:56 crc kubenswrapper[4756]: I1205 17:56:56.998609 4756 generic.go:334] "Generic (PLEG): container finished" podID="7f523eda-94d0-4016-b592-009948e19b77" containerID="ffa6c89670f6d6c6ae2963cffe10e75b71dbdfa7288f0941bda2c003ce36fbdb" exitCode=0 Dec 05 17:56:56 crc kubenswrapper[4756]: I1205 17:56:56.998982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" event={"ID":"7f523eda-94d0-4016-b592-009948e19b77","Type":"ContainerDied","Data":"ffa6c89670f6d6c6ae2963cffe10e75b71dbdfa7288f0941bda2c003ce36fbdb"} Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.208515 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.320621 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-sb\") pod \"7f523eda-94d0-4016-b592-009948e19b77\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.320775 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-config\") pod \"7f523eda-94d0-4016-b592-009948e19b77\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.320849 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zngj\" (UniqueName: \"kubernetes.io/projected/7f523eda-94d0-4016-b592-009948e19b77-kube-api-access-8zngj\") pod \"7f523eda-94d0-4016-b592-009948e19b77\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.321028 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-nb\") pod \"7f523eda-94d0-4016-b592-009948e19b77\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.321120 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-dns-svc\") pod \"7f523eda-94d0-4016-b592-009948e19b77\" (UID: \"7f523eda-94d0-4016-b592-009948e19b77\") " Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.337703 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f523eda-94d0-4016-b592-009948e19b77-kube-api-access-8zngj" (OuterVolumeSpecName: "kube-api-access-8zngj") pod "7f523eda-94d0-4016-b592-009948e19b77" (UID: "7f523eda-94d0-4016-b592-009948e19b77"). InnerVolumeSpecName "kube-api-access-8zngj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.395098 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f523eda-94d0-4016-b592-009948e19b77" (UID: "7f523eda-94d0-4016-b592-009948e19b77"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.395927 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f523eda-94d0-4016-b592-009948e19b77" (UID: "7f523eda-94d0-4016-b592-009948e19b77"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.395951 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f523eda-94d0-4016-b592-009948e19b77" (UID: "7f523eda-94d0-4016-b592-009948e19b77"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.408921 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-config" (OuterVolumeSpecName: "config") pod "7f523eda-94d0-4016-b592-009948e19b77" (UID: "7f523eda-94d0-4016-b592-009948e19b77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.425245 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.425287 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.425296 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.425304 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f523eda-94d0-4016-b592-009948e19b77-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:57 crc kubenswrapper[4756]: I1205 17:56:57.425314 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zngj\" (UniqueName: \"kubernetes.io/projected/7f523eda-94d0-4016-b592-009948e19b77-kube-api-access-8zngj\") on node \"crc\" DevicePath \"\"" Dec 05 17:56:58 crc kubenswrapper[4756]: I1205 17:56:58.009330 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" event={"ID":"7f523eda-94d0-4016-b592-009948e19b77","Type":"ContainerDied","Data":"64ea579b4cf8691b2d9e5012757b7b7f27f91fcbcd346bb4483471bd70ae9665"} Dec 05 17:56:58 crc kubenswrapper[4756]: I1205 17:56:58.009373 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c55b5567-zxk59" Dec 05 17:56:58 crc kubenswrapper[4756]: I1205 17:56:58.009385 4756 scope.go:117] "RemoveContainer" containerID="ffa6c89670f6d6c6ae2963cffe10e75b71dbdfa7288f0941bda2c003ce36fbdb" Dec 05 17:56:58 crc kubenswrapper[4756]: I1205 17:56:58.054335 4756 scope.go:117] "RemoveContainer" containerID="c92a326622e25c684f1a58965f8c4f951f30711e67f14b1a56c247f43c3b32c7" Dec 05 17:56:58 crc kubenswrapper[4756]: I1205 17:56:58.066107 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c55b5567-zxk59"] Dec 05 17:56:58 crc kubenswrapper[4756]: I1205 17:56:58.080070 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84c55b5567-zxk59"] Dec 05 17:57:00 crc kubenswrapper[4756]: I1205 17:57:00.092135 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f523eda-94d0-4016-b592-009948e19b77" path="/var/lib/kubelet/pods/7f523eda-94d0-4016-b592-009948e19b77/volumes" Dec 05 17:57:03 crc kubenswrapper[4756]: I1205 17:57:03.368605 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 17:57:03 crc kubenswrapper[4756]: I1205 17:57:03.369316 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 17:57:04 crc kubenswrapper[4756]: I1205 17:57:04.384668 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.95:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 17:57:04 crc kubenswrapper[4756]: I1205 17:57:04.385072 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.95:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 17:57:09 crc kubenswrapper[4756]: I1205 17:57:09.049662 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:57:09 crc kubenswrapper[4756]: E1205 17:57:09.050753 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:57:13 crc kubenswrapper[4756]: I1205 17:57:13.378304 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 17:57:13 crc kubenswrapper[4756]: I1205 17:57:13.379922 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 17:57:13 crc kubenswrapper[4756]: I1205 17:57:13.385597 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 17:57:13 crc kubenswrapper[4756]: I1205 17:57:13.389820 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 17:57:14 crc kubenswrapper[4756]: I1205 17:57:14.201015 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 
17:57:14 crc kubenswrapper[4756]: I1205 17:57:14.206810 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 17:57:21 crc kubenswrapper[4756]: I1205 17:57:21.050513 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:57:21 crc kubenswrapper[4756]: E1205 17:57:21.051278 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:57:21 crc kubenswrapper[4756]: I1205 17:57:21.081774 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hvkpt"] Dec 05 17:57:21 crc kubenswrapper[4756]: I1205 17:57:21.092347 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ab3f-account-create-update-7b27p"] Dec 05 17:57:21 crc kubenswrapper[4756]: I1205 17:57:21.105237 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hvkpt"] Dec 05 17:57:21 crc kubenswrapper[4756]: I1205 17:57:21.125100 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ab3f-account-create-update-7b27p"] Dec 05 17:57:22 crc kubenswrapper[4756]: I1205 17:57:22.063279 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1e4851-886b-4648-8022-d9fb79072e9b" path="/var/lib/kubelet/pods/3d1e4851-886b-4648-8022-d9fb79072e9b/volumes" Dec 05 17:57:22 crc kubenswrapper[4756]: I1205 17:57:22.064856 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c64432f-bdc1-4b21-a99f-4b0892bf16c4" path="/var/lib/kubelet/pods/7c64432f-bdc1-4b21-a99f-4b0892bf16c4/volumes" Dec 05 17:57:28 crc kubenswrapper[4756]: I1205 17:57:28.090337 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2wb2k"] Dec 05 17:57:28 crc kubenswrapper[4756]: I1205 17:57:28.097593 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2wb2k"] Dec 05 17:57:30 crc kubenswrapper[4756]: I1205 17:57:30.070471 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea68c9d-a357-41d9-a5ce-c7b09cad33c2" path="/var/lib/kubelet/pods/cea68c9d-a357-41d9-a5ce-c7b09cad33c2/volumes" Dec 05 17:57:33 crc kubenswrapper[4756]: I1205 17:57:33.050382 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:57:33 crc kubenswrapper[4756]: E1205 17:57:33.051197 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.877068 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7drzk"] Dec 05 17:57:38 crc kubenswrapper[4756]: E1205 17:57:38.877991 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7f523eda-94d0-4016-b592-009948e19b77" containerName="dnsmasq-dns" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.878006 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f523eda-94d0-4016-b592-009948e19b77" containerName="dnsmasq-dns" Dec 05 17:57:38 crc kubenswrapper[4756]: E1205 17:57:38.878015 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f523eda-94d0-4016-b592-009948e19b77" containerName="init" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.878021 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f523eda-94d0-4016-b592-009948e19b77" containerName="init" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.878218 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f523eda-94d0-4016-b592-009948e19b77" containerName="dnsmasq-dns" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.878855 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7drzk" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.881091 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2sz5b" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.882841 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.882908 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.893993 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7drzk"] Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.935198 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8ljss"] Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.937176 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.963041 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8ljss"] Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.992643 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-log-ovn\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.992723 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-run\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.992817 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/484025cd-c3a0-46de-bc81-3508bbfdfd9e-ovn-controller-tls-certs\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.992872 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484025cd-c3a0-46de-bc81-3508bbfdfd9e-combined-ca-bundle\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.992900 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kv2l\" (UniqueName: \"kubernetes.io/projected/484025cd-c3a0-46de-bc81-3508bbfdfd9e-kube-api-access-6kv2l\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.992924 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484025cd-c3a0-46de-bc81-3508bbfdfd9e-scripts\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:38 crc kubenswrapper[4756]: I1205 17:57:38.993024 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-run-ovn\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.094928 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-run\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.094984 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/484025cd-c3a0-46de-bc81-3508bbfdfd9e-ovn-controller-tls-certs\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095025 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484025cd-c3a0-46de-bc81-3508bbfdfd9e-combined-ca-bundle\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095044 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kv2l\" (UniqueName: \"kubernetes.io/projected/484025cd-c3a0-46de-bc81-3508bbfdfd9e-kube-api-access-6kv2l\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095064 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484025cd-c3a0-46de-bc81-3508bbfdfd9e-scripts\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095101 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bda222f-17e6-4b62-b1fe-2de61fa33be3-scripts\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095134 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-etc-ovs\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095164 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-run\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095195 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-run-ovn\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095226 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqfmc\" (UniqueName: \"kubernetes.io/projected/3bda222f-17e6-4b62-b1fe-2de61fa33be3-kube-api-access-tqfmc\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095252 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-log-ovn\") pod \"ovn-controller-7drzk\" (UID: 
\"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095269 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-log\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095291 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-lib\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.095627 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-run\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.096530 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-run-ovn\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.096555 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/484025cd-c3a0-46de-bc81-3508bbfdfd9e-var-log-ovn\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.100363 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484025cd-c3a0-46de-bc81-3508bbfdfd9e-scripts\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.111971 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/484025cd-c3a0-46de-bc81-3508bbfdfd9e-ovn-controller-tls-certs\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.113631 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484025cd-c3a0-46de-bc81-3508bbfdfd9e-combined-ca-bundle\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.119093 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kv2l\" (UniqueName: \"kubernetes.io/projected/484025cd-c3a0-46de-bc81-3508bbfdfd9e-kube-api-access-6kv2l\") pod \"ovn-controller-7drzk\" (UID: \"484025cd-c3a0-46de-bc81-3508bbfdfd9e\") " pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.196662 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/3bda222f-17e6-4b62-b1fe-2de61fa33be3-scripts\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.196725 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-etc-ovs\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.196758 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-run\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.196823 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqfmc\" (UniqueName: \"kubernetes.io/projected/3bda222f-17e6-4b62-b1fe-2de61fa33be3-kube-api-access-tqfmc\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.196861 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-log\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.196887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-lib\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.197344 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-etc-ovs\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.197366 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-run\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.197462 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-log\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.197497 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3bda222f-17e6-4b62-b1fe-2de61fa33be3-var-lib\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc 
kubenswrapper[4756]: I1205 17:57:39.199155 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bda222f-17e6-4b62-b1fe-2de61fa33be3-scripts\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.212489 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7drzk" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.218276 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqfmc\" (UniqueName: \"kubernetes.io/projected/3bda222f-17e6-4b62-b1fe-2de61fa33be3-kube-api-access-tqfmc\") pod \"ovn-controller-ovs-8ljss\" (UID: \"3bda222f-17e6-4b62-b1fe-2de61fa33be3\") " pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.257864 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:39 crc kubenswrapper[4756]: I1205 17:57:39.822282 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7drzk"] Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.206687 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8ljss"] Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.450917 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-l5wr2"] Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.452638 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.459260 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.474655 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l5wr2"] Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.515378 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8ljss" event={"ID":"3bda222f-17e6-4b62-b1fe-2de61fa33be3","Type":"ContainerStarted","Data":"a1cd02b24e84b0d89c17ab4ff96c080c7c059c7f883a6bd6f1963cb33fb310ad"} Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.517272 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk" event={"ID":"484025cd-c3a0-46de-bc81-3508bbfdfd9e","Type":"ContainerStarted","Data":"3971c6affd29dd58874c40c0694daca07925ae00a59ba57fd26cebc74715187f"} Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.517314 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk" event={"ID":"484025cd-c3a0-46de-bc81-3508bbfdfd9e","Type":"ContainerStarted","Data":"bd33e56f0e058e8dc4993e94423e44da8e082e3f0073249d8fc8806660c037a6"} Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.518215 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-7drzk" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.524702 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qwt8\" (UniqueName: \"kubernetes.io/projected/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-kube-api-access-7qwt8\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " 
pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.524876 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-config\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.524987 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-ovn-rundir\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.525087 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-combined-ca-bundle\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.525162 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-ovs-rundir\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.525358 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.538924 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-7drzk" podStartSLOduration=2.538910405 podStartE2EDuration="2.538910405s" podCreationTimestamp="2025-12-05 17:57:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:57:40.533852561 +0000 UTC m=+6120.772526073" watchObservedRunningTime="2025-12-05 17:57:40.538910405 +0000 UTC m=+6120.777583907" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.627667 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.627742 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qwt8\" (UniqueName: \"kubernetes.io/projected/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-kube-api-access-7qwt8\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.627769 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-config\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.627866 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-ovn-rundir\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.627887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-combined-ca-bundle\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.627912 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-ovs-rundir\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.628246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-ovs-rundir\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.628585 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-ovn-rundir\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.629033 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-config\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.641557 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.643600 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-combined-ca-bundle\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.646611 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qwt8\" (UniqueName: 
\"kubernetes.io/projected/d7a1031b-d48b-4397-a5ff-63ee325b6e7a-kube-api-access-7qwt8\") pod \"ovn-controller-metrics-l5wr2\" (UID: \"d7a1031b-d48b-4397-a5ff-63ee325b6e7a\") " pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:40 crc kubenswrapper[4756]: I1205 17:57:40.778116 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l5wr2" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.034483 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t47bv"] Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.049209 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-t47bv"] Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.149475 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-gzb66"] Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.150636 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.170550 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-gzb66"] Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.227728 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l5wr2"] Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.239396 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxcmx\" (UniqueName: \"kubernetes.io/projected/d1b297ed-9607-4bd5-9ec1-20125ee78453-kube-api-access-zxcmx\") pod \"octavia-db-create-gzb66\" (UID: \"d1b297ed-9607-4bd5-9ec1-20125ee78453\") " pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.239753 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b297ed-9607-4bd5-9ec1-20125ee78453-operator-scripts\") pod \"octavia-db-create-gzb66\" (UID: \"d1b297ed-9607-4bd5-9ec1-20125ee78453\") " pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.341679 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxcmx\" (UniqueName: \"kubernetes.io/projected/d1b297ed-9607-4bd5-9ec1-20125ee78453-kube-api-access-zxcmx\") pod \"octavia-db-create-gzb66\" (UID: \"d1b297ed-9607-4bd5-9ec1-20125ee78453\") " pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.341800 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b297ed-9607-4bd5-9ec1-20125ee78453-operator-scripts\") pod \"octavia-db-create-gzb66\" (UID: \"d1b297ed-9607-4bd5-9ec1-20125ee78453\") " pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.342560 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b297ed-9607-4bd5-9ec1-20125ee78453-operator-scripts\") pod \"octavia-db-create-gzb66\" (UID: \"d1b297ed-9607-4bd5-9ec1-20125ee78453\") " pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.374617 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxcmx\" (UniqueName: 
\"kubernetes.io/projected/d1b297ed-9607-4bd5-9ec1-20125ee78453-kube-api-access-zxcmx\") pod \"octavia-db-create-gzb66\" (UID: \"d1b297ed-9607-4bd5-9ec1-20125ee78453\") " pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.469609 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.532759 4756 generic.go:334] "Generic (PLEG): container finished" podID="3bda222f-17e6-4b62-b1fe-2de61fa33be3" containerID="57229de00b8c285ab5d562aaefa0edcafd265ebdcb9cdf2ee0c501b02d851980" exitCode=0 Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.532833 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8ljss" event={"ID":"3bda222f-17e6-4b62-b1fe-2de61fa33be3","Type":"ContainerDied","Data":"57229de00b8c285ab5d562aaefa0edcafd265ebdcb9cdf2ee0c501b02d851980"} Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.534328 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l5wr2" event={"ID":"d7a1031b-d48b-4397-a5ff-63ee325b6e7a","Type":"ContainerStarted","Data":"777d5c657dc9967a5db5774719ba673a44e132bf2bf491c3b56ada0b3f8a8637"} Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.534358 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l5wr2" event={"ID":"d7a1031b-d48b-4397-a5ff-63ee325b6e7a","Type":"ContainerStarted","Data":"5fd48941605636146213ae04314c16da55c1e4c3eb8083d3a773aa58ad1dea7b"} Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.590257 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-l5wr2" podStartSLOduration=1.590234303 podStartE2EDuration="1.590234303s" podCreationTimestamp="2025-12-05 17:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:57:41.58036102 +0000 UTC m=+6121.819034532" watchObservedRunningTime="2025-12-05 17:57:41.590234303 +0000 UTC m=+6121.828907835" Dec 05 17:57:41 crc kubenswrapper[4756]: I1205 17:57:41.946630 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-gzb66"] Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.059289 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1228931b-c0ef-49b3-8f30-c6ffcddca7f6" path="/var/lib/kubelet/pods/1228931b-c0ef-49b3-8f30-c6ffcddca7f6/volumes" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.176849 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-f6ca-account-create-update-jxn5g"] Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.180743 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.182433 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.200096 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-f6ca-account-create-update-jxn5g"] Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.262395 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-operator-scripts\") pod \"octavia-f6ca-account-create-update-jxn5g\" (UID: \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\") " pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.262515 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rh2\" (UniqueName: \"kubernetes.io/projected/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-kube-api-access-z9rh2\") pod \"octavia-f6ca-account-create-update-jxn5g\" (UID: \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\") " pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.368975 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-operator-scripts\") pod \"octavia-f6ca-account-create-update-jxn5g\" (UID: \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\") " pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.369340 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rh2\" (UniqueName: \"kubernetes.io/projected/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-kube-api-access-z9rh2\") pod \"octavia-f6ca-account-create-update-jxn5g\" (UID: \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\") " pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.370298 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-operator-scripts\") pod \"octavia-f6ca-account-create-update-jxn5g\" (UID: \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\") " pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.412069 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rh2\" (UniqueName: \"kubernetes.io/projected/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-kube-api-access-z9rh2\") pod \"octavia-f6ca-account-create-update-jxn5g\" (UID: \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\") " pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.517206 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.554813 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-gzb66" event={"ID":"d1b297ed-9607-4bd5-9ec1-20125ee78453","Type":"ContainerStarted","Data":"b405674568a18fce3eedd450c0c4e743461d865d46ad238926bae7cf000fc9bb"} Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.554853 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-gzb66" event={"ID":"d1b297ed-9607-4bd5-9ec1-20125ee78453","Type":"ContainerStarted","Data":"6dd7e079a1465a31396cf3fc6288346a74461781006b444b39a576b87da6c6e4"} Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.561300 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8ljss" event={"ID":"3bda222f-17e6-4b62-b1fe-2de61fa33be3","Type":"ContainerStarted","Data":"dfc34c3a2daf919cae99faf910c0ccbf757c5bff9cc4eba21f31dcbb4af85176"} Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.561333 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8ljss" event={"ID":"3bda222f-17e6-4b62-b1fe-2de61fa33be3","Type":"ContainerStarted","Data":"55e655fb4375997c89c177793114613f96cc109d75255fc7414ab45995fcd2cd"} Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.561346 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.561366 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.611544 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-create-gzb66" podStartSLOduration=1.611519769 podStartE2EDuration="1.611519769s" podCreationTimestamp="2025-12-05 17:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:57:42.592886363 +0000 UTC m=+6122.831559875" watchObservedRunningTime="2025-12-05 17:57:42.611519769 +0000 UTC m=+6122.850193281" Dec 05 17:57:42 crc kubenswrapper[4756]: I1205 17:57:42.626685 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8ljss" podStartSLOduration=4.626666183 podStartE2EDuration="4.626666183s" podCreationTimestamp="2025-12-05 17:57:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:57:42.611478358 +0000 UTC m=+6122.850151890" watchObservedRunningTime="2025-12-05 17:57:42.626666183 +0000 UTC m=+6122.865339715" Dec 05 17:57:43 crc kubenswrapper[4756]: I1205 17:57:43.001768 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-f6ca-account-create-update-jxn5g"] Dec 05 17:57:43 crc kubenswrapper[4756]: I1205 17:57:43.574784 4756 generic.go:334] "Generic (PLEG): container finished" podID="d1b297ed-9607-4bd5-9ec1-20125ee78453" containerID="b405674568a18fce3eedd450c0c4e743461d865d46ad238926bae7cf000fc9bb" exitCode=0 Dec 05 17:57:43 crc kubenswrapper[4756]: I1205 17:57:43.575079 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-gzb66" event={"ID":"d1b297ed-9607-4bd5-9ec1-20125ee78453","Type":"ContainerDied","Data":"b405674568a18fce3eedd450c0c4e743461d865d46ad238926bae7cf000fc9bb"} Dec 05 
17:57:43 crc kubenswrapper[4756]: I1205 17:57:43.577372 4756 generic.go:334] "Generic (PLEG): container finished" podID="21b9a5ca-5ea6-4080-a7bb-24bcca2ff889" containerID="4a420e8f95023e1b60e4eb8c84c3186f08ba9d5fbb1df363ffccf87a2f9817ee" exitCode=0 Dec 05 17:57:43 crc kubenswrapper[4756]: I1205 17:57:43.577513 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-f6ca-account-create-update-jxn5g" event={"ID":"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889","Type":"ContainerDied","Data":"4a420e8f95023e1b60e4eb8c84c3186f08ba9d5fbb1df363ffccf87a2f9817ee"} Dec 05 17:57:43 crc kubenswrapper[4756]: I1205 17:57:43.577597 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-f6ca-account-create-update-jxn5g" event={"ID":"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889","Type":"ContainerStarted","Data":"20e23aa685fff3da75bad2a8d4df13a2738935bed1f40a4f03d034ea8e4c5a02"} Dec 05 17:57:44 crc kubenswrapper[4756]: I1205 17:57:44.050335 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:57:44 crc kubenswrapper[4756]: E1205 17:57:44.050834 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.086827 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.092578 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.242754 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b297ed-9607-4bd5-9ec1-20125ee78453-operator-scripts\") pod \"d1b297ed-9607-4bd5-9ec1-20125ee78453\" (UID: \"d1b297ed-9607-4bd5-9ec1-20125ee78453\") " Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.242864 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxcmx\" (UniqueName: \"kubernetes.io/projected/d1b297ed-9607-4bd5-9ec1-20125ee78453-kube-api-access-zxcmx\") pod \"d1b297ed-9607-4bd5-9ec1-20125ee78453\" (UID: \"d1b297ed-9607-4bd5-9ec1-20125ee78453\") " Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.242934 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-operator-scripts\") pod \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\" (UID: \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\") " Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.242973 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9rh2\" (UniqueName: \"kubernetes.io/projected/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-kube-api-access-z9rh2\") pod \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\" (UID: \"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889\") " Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.243400 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b297ed-9607-4bd5-9ec1-20125ee78453-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1b297ed-9607-4bd5-9ec1-20125ee78453" (UID: "d1b297ed-9607-4bd5-9ec1-20125ee78453"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.244724 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21b9a5ca-5ea6-4080-a7bb-24bcca2ff889" (UID: "21b9a5ca-5ea6-4080-a7bb-24bcca2ff889"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.248620 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-kube-api-access-z9rh2" (OuterVolumeSpecName: "kube-api-access-z9rh2") pod "21b9a5ca-5ea6-4080-a7bb-24bcca2ff889" (UID: "21b9a5ca-5ea6-4080-a7bb-24bcca2ff889"). InnerVolumeSpecName "kube-api-access-z9rh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.251115 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b297ed-9607-4bd5-9ec1-20125ee78453-kube-api-access-zxcmx" (OuterVolumeSpecName: "kube-api-access-zxcmx") pod "d1b297ed-9607-4bd5-9ec1-20125ee78453" (UID: "d1b297ed-9607-4bd5-9ec1-20125ee78453"). InnerVolumeSpecName "kube-api-access-zxcmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.345592 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxcmx\" (UniqueName: \"kubernetes.io/projected/d1b297ed-9607-4bd5-9ec1-20125ee78453-kube-api-access-zxcmx\") on node \"crc\" DevicePath \"\"" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.346139 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.346238 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9rh2\" (UniqueName: \"kubernetes.io/projected/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889-kube-api-access-z9rh2\") on node \"crc\" DevicePath \"\"" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.346320 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b297ed-9607-4bd5-9ec1-20125ee78453-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.610303 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-f6ca-account-create-update-jxn5g" event={"ID":"21b9a5ca-5ea6-4080-a7bb-24bcca2ff889","Type":"ContainerDied","Data":"20e23aa685fff3da75bad2a8d4df13a2738935bed1f40a4f03d034ea8e4c5a02"} Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.610339 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e23aa685fff3da75bad2a8d4df13a2738935bed1f40a4f03d034ea8e4c5a02" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.610353 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-f6ca-account-create-update-jxn5g" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.613304 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-gzb66" event={"ID":"d1b297ed-9607-4bd5-9ec1-20125ee78453","Type":"ContainerDied","Data":"6dd7e079a1465a31396cf3fc6288346a74461781006b444b39a576b87da6c6e4"} Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.613327 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd7e079a1465a31396cf3fc6288346a74461781006b444b39a576b87da6c6e4" Dec 05 17:57:45 crc kubenswrapper[4756]: I1205 17:57:45.613353 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-gzb66" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.157706 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-9wdht"] Dec 05 17:57:49 crc kubenswrapper[4756]: E1205 17:57:49.158648 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b297ed-9607-4bd5-9ec1-20125ee78453" containerName="mariadb-database-create" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.158669 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b297ed-9607-4bd5-9ec1-20125ee78453" containerName="mariadb-database-create" Dec 05 17:57:49 crc kubenswrapper[4756]: E1205 17:57:49.158720 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b9a5ca-5ea6-4080-a7bb-24bcca2ff889" containerName="mariadb-account-create-update" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.158732 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b9a5ca-5ea6-4080-a7bb-24bcca2ff889" containerName="mariadb-account-create-update" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.159069 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b297ed-9607-4bd5-9ec1-20125ee78453" containerName="mariadb-database-create" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.159110 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b9a5ca-5ea6-4080-a7bb-24bcca2ff889" containerName="mariadb-account-create-update" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.160127 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.170826 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-9wdht"] Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.228643 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-operator-scripts\") pod \"octavia-persistence-db-create-9wdht\" (UID: \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\") " pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.232641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfrmx\" (UniqueName: \"kubernetes.io/projected/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-kube-api-access-sfrmx\") pod \"octavia-persistence-db-create-9wdht\" (UID: \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\") " pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.335929 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-operator-scripts\") pod \"octavia-persistence-db-create-9wdht\" (UID: \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\") " pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.336249 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfrmx\" (UniqueName: \"kubernetes.io/projected/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-kube-api-access-sfrmx\") pod \"octavia-persistence-db-create-9wdht\" (UID: \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\") " pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 
17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.341343 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-operator-scripts\") pod \"octavia-persistence-db-create-9wdht\" (UID: \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\") " pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.355966 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfrmx\" (UniqueName: \"kubernetes.io/projected/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-kube-api-access-sfrmx\") pod \"octavia-persistence-db-create-9wdht\" (UID: \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\") " pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.492121 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:49 crc kubenswrapper[4756]: I1205 17:57:49.985571 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-9wdht"] Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.172690 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-dab0-account-create-update-mb5rs"] Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.174823 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.176938 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.185745 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-dab0-account-create-update-mb5rs"] Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.256282 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-operator-scripts\") pod \"octavia-dab0-account-create-update-mb5rs\" (UID: \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\") " pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.256407 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr257\" (UniqueName: \"kubernetes.io/projected/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-kube-api-access-qr257\") pod \"octavia-dab0-account-create-update-mb5rs\" (UID: \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\") " pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.358967 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr257\" (UniqueName: \"kubernetes.io/projected/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-kube-api-access-qr257\") pod \"octavia-dab0-account-create-update-mb5rs\" (UID: \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\") " pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.359144 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-operator-scripts\") pod \"octavia-dab0-account-create-update-mb5rs\" (UID: \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\") " 
pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.359836 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-operator-scripts\") pod \"octavia-dab0-account-create-update-mb5rs\" (UID: \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\") " pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.378151 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr257\" (UniqueName: \"kubernetes.io/projected/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-kube-api-access-qr257\") pod \"octavia-dab0-account-create-update-mb5rs\" (UID: \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\") " pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.505253 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.669120 4756 generic.go:334] "Generic (PLEG): container finished" podID="2e86958a-c21c-450c-ab3d-9c1c6ab884c0" containerID="b972f8c9ac3047a70673317b0b8920e6ec7a4e3b5fda4aa995531b36874923fb" exitCode=0 Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.669162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-9wdht" event={"ID":"2e86958a-c21c-450c-ab3d-9c1c6ab884c0","Type":"ContainerDied","Data":"b972f8c9ac3047a70673317b0b8920e6ec7a4e3b5fda4aa995531b36874923fb"} Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.669190 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-9wdht" event={"ID":"2e86958a-c21c-450c-ab3d-9c1c6ab884c0","Type":"ContainerStarted","Data":"2678786fae4844cc44cd3f310e512cc354459d292bbfb7f4afb97c4d353a496a"} Dec 05 17:57:50 crc kubenswrapper[4756]: I1205 17:57:50.802637 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-dab0-account-create-update-mb5rs"] Dec 05 17:57:50 crc kubenswrapper[4756]: W1205 17:57:50.808374 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbd70a73_ff81_45bc_848a_97f2c4b83c6c.slice/crio-e41490c99677d4b8c6ca7842a69b2df9dae992cc8e6fe444bf77437ff46630a9 WatchSource:0}: Error finding container e41490c99677d4b8c6ca7842a69b2df9dae992cc8e6fe444bf77437ff46630a9: Status 404 returned error can't find the container with id e41490c99677d4b8c6ca7842a69b2df9dae992cc8e6fe444bf77437ff46630a9 Dec 05 17:57:51 crc kubenswrapper[4756]: I1205 17:57:51.681827 4756 generic.go:334] "Generic (PLEG): container finished" podID="fbd70a73-ff81-45bc-848a-97f2c4b83c6c" containerID="1af54360917122aebce3a0e235a19758c38c8ac755c1af3946ee5439e66049ff" exitCode=0 Dec 05 17:57:51 crc kubenswrapper[4756]: I1205 17:57:51.681883 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-dab0-account-create-update-mb5rs" event={"ID":"fbd70a73-ff81-45bc-848a-97f2c4b83c6c","Type":"ContainerDied","Data":"1af54360917122aebce3a0e235a19758c38c8ac755c1af3946ee5439e66049ff"} Dec 05 17:57:51 crc kubenswrapper[4756]: I1205 17:57:51.681955 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-dab0-account-create-update-mb5rs" 
event={"ID":"fbd70a73-ff81-45bc-848a-97f2c4b83c6c","Type":"ContainerStarted","Data":"e41490c99677d4b8c6ca7842a69b2df9dae992cc8e6fe444bf77437ff46630a9"} Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.106047 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.201932 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfrmx\" (UniqueName: \"kubernetes.io/projected/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-kube-api-access-sfrmx\") pod \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\" (UID: \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\") " Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.201980 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-operator-scripts\") pod \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\" (UID: \"2e86958a-c21c-450c-ab3d-9c1c6ab884c0\") " Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.202797 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e86958a-c21c-450c-ab3d-9c1c6ab884c0" (UID: "2e86958a-c21c-450c-ab3d-9c1c6ab884c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.214162 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-kube-api-access-sfrmx" (OuterVolumeSpecName: "kube-api-access-sfrmx") pod "2e86958a-c21c-450c-ab3d-9c1c6ab884c0" (UID: "2e86958a-c21c-450c-ab3d-9c1c6ab884c0"). InnerVolumeSpecName "kube-api-access-sfrmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.305060 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfrmx\" (UniqueName: \"kubernetes.io/projected/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-kube-api-access-sfrmx\") on node \"crc\" DevicePath \"\"" Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.305110 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e86958a-c21c-450c-ab3d-9c1c6ab884c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.692331 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-9wdht" Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.692323 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-9wdht" event={"ID":"2e86958a-c21c-450c-ab3d-9c1c6ab884c0","Type":"ContainerDied","Data":"2678786fae4844cc44cd3f310e512cc354459d292bbfb7f4afb97c4d353a496a"} Dec 05 17:57:52 crc kubenswrapper[4756]: I1205 17:57:52.692837 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2678786fae4844cc44cd3f310e512cc354459d292bbfb7f4afb97c4d353a496a" Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.034962 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.124589 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr257\" (UniqueName: \"kubernetes.io/projected/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-kube-api-access-qr257\") pod \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\" (UID: \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\") " Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.124796 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-operator-scripts\") pod \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\" (UID: \"fbd70a73-ff81-45bc-848a-97f2c4b83c6c\") " Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.126289 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbd70a73-ff81-45bc-848a-97f2c4b83c6c" (UID: "fbd70a73-ff81-45bc-848a-97f2c4b83c6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.132060 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-kube-api-access-qr257" (OuterVolumeSpecName: "kube-api-access-qr257") pod "fbd70a73-ff81-45bc-848a-97f2c4b83c6c" (UID: "fbd70a73-ff81-45bc-848a-97f2c4b83c6c"). InnerVolumeSpecName "kube-api-access-qr257". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.227643 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr257\" (UniqueName: \"kubernetes.io/projected/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-kube-api-access-qr257\") on node \"crc\" DevicePath \"\"" Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.227884 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd70a73-ff81-45bc-848a-97f2c4b83c6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.705229 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-dab0-account-create-update-mb5rs" event={"ID":"fbd70a73-ff81-45bc-848a-97f2c4b83c6c","Type":"ContainerDied","Data":"e41490c99677d4b8c6ca7842a69b2df9dae992cc8e6fe444bf77437ff46630a9"} Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.705304 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e41490c99677d4b8c6ca7842a69b2df9dae992cc8e6fe444bf77437ff46630a9" Dec 05 17:57:53 crc kubenswrapper[4756]: I1205 17:57:53.705354 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-dab0-account-create-update-mb5rs" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.195439 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-67ccf57c7d-mfzfc"] Dec 05 17:57:56 crc kubenswrapper[4756]: E1205 17:57:56.196031 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e86958a-c21c-450c-ab3d-9c1c6ab884c0" containerName="mariadb-database-create" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.196049 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e86958a-c21c-450c-ab3d-9c1c6ab884c0" containerName="mariadb-database-create" Dec 05 17:57:56 crc kubenswrapper[4756]: E1205 17:57:56.196081 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd70a73-ff81-45bc-848a-97f2c4b83c6c" containerName="mariadb-account-create-update" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.196090 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd70a73-ff81-45bc-848a-97f2c4b83c6c" containerName="mariadb-account-create-update" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.196426 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e86958a-c21c-450c-ab3d-9c1c6ab884c0" containerName="mariadb-database-create" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.196446 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbd70a73-ff81-45bc-848a-97f2c4b83c6c" containerName="mariadb-account-create-update" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.199060 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.207359 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.207648 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.207839 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.207963 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-mbcl8" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.213604 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-67ccf57c7d-mfzfc"] Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.304125 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-scripts\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.304184 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-octavia-run\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.304363 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.304404 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-combined-ca-bundle\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.304504 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-ovndb-tls-certs\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.304558 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data-merged\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.405855 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-scripts\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.405908 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-octavia-run\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.405980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.405999 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-combined-ca-bundle\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.406037 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-ovndb-tls-certs\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.406072 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data-merged\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.406588 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-octavia-run\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.406629 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data-merged\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.412302 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.412955 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-scripts\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.413917 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-ovndb-tls-certs\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.422240 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-combined-ca-bundle\") pod \"octavia-api-67ccf57c7d-mfzfc\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:56 crc kubenswrapper[4756]: I1205 17:57:56.526103 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:57:57 crc kubenswrapper[4756]: I1205 17:57:57.018786 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-67ccf57c7d-mfzfc"] Dec 05 17:57:57 crc kubenswrapper[4756]: W1205 17:57:57.026683 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d5a0d90_0c0e_4589_8a15_7016159a2609.slice/crio-8dd5d93bb4f59b88f25ac5ea50c0cc4b86732d18eeb7e08ebc8c60afd3c7f2ba WatchSource:0}: Error finding container 8dd5d93bb4f59b88f25ac5ea50c0cc4b86732d18eeb7e08ebc8c60afd3c7f2ba: Status 404 returned error can't find the container with id 8dd5d93bb4f59b88f25ac5ea50c0cc4b86732d18eeb7e08ebc8c60afd3c7f2ba Dec 05 17:57:57 crc kubenswrapper[4756]: I1205 17:57:57.746344 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67ccf57c7d-mfzfc" event={"ID":"0d5a0d90-0c0e-4589-8a15-7016159a2609","Type":"ContainerStarted","Data":"8dd5d93bb4f59b88f25ac5ea50c0cc4b86732d18eeb7e08ebc8c60afd3c7f2ba"} Dec 05 17:57:58 crc kubenswrapper[4756]: I1205 17:57:58.060197 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 17:57:58 crc kubenswrapper[4756]: I1205 17:57:58.755922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"32339976c849fb944e24e4a905b3c0b31eadc92ee27ed11d626318af61e21b3a"} Dec 05 17:58:00 crc kubenswrapper[4756]: I1205 17:58:00.368839 4756 scope.go:117] "RemoveContainer" containerID="de76082a3823a246cae35a684e13925d4e81561572c9edb6da76ca2b05e5211a" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.398345 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ngfnl"] Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.402132 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.407353 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngfnl"] Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.509066 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-utilities\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.509126 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-catalog-content\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.509561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9rw\" (UniqueName: \"kubernetes.io/projected/5cfcfa90-a657-426b-996d-f54055582128-kube-api-access-dv9rw\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.611070 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv9rw\" (UniqueName: \"kubernetes.io/projected/5cfcfa90-a657-426b-996d-f54055582128-kube-api-access-dv9rw\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.611165 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-utilities\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.611206 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-catalog-content\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.611712 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-utilities\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.611756 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-catalog-content\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.633904 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dv9rw\" (UniqueName: \"kubernetes.io/projected/5cfcfa90-a657-426b-996d-f54055582128-kube-api-access-dv9rw\") pod \"certified-operators-ngfnl\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") " pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:01 crc kubenswrapper[4756]: I1205 17:58:01.722812 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:07 crc kubenswrapper[4756]: I1205 17:58:07.840889 4756 scope.go:117] "RemoveContainer" containerID="853eeb4f1011e1cefe738208cf579562c74e0c2ca576a85fe990c9a40b5830d3" Dec 05 17:58:07 crc kubenswrapper[4756]: I1205 17:58:07.936834 4756 scope.go:117] "RemoveContainer" containerID="82c4701c7ba6529ca2be2c911b39bdd1295f70c4c76d041fd53cca44edee2a57" Dec 05 17:58:08 crc kubenswrapper[4756]: I1205 17:58:08.076738 4756 scope.go:117] "RemoveContainer" containerID="5794c48ac0e2702d5d1a40ca7fb5be48807def8eb8d332e8543afd872c8d2193" Dec 05 17:58:08 crc kubenswrapper[4756]: I1205 17:58:08.318527 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngfnl"] Dec 05 17:58:08 crc kubenswrapper[4756]: W1205 17:58:08.324740 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cfcfa90_a657_426b_996d_f54055582128.slice/crio-fb0873a091698b5d5d5807f6f0ae8ba7778decc452f33d54b9c59248a3598810 WatchSource:0}: Error finding container fb0873a091698b5d5d5807f6f0ae8ba7778decc452f33d54b9c59248a3598810: Status 404 returned error can't find the container with id fb0873a091698b5d5d5807f6f0ae8ba7778decc452f33d54b9c59248a3598810 Dec 05 17:58:08 crc kubenswrapper[4756]: I1205 17:58:08.864800 4756 generic.go:334] "Generic (PLEG): container finished" podID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerID="ed711b623bf3eb026b08df7d1fab2d50af615ce027b7a4639c4a2e8057e2e0d4" exitCode=0 Dec 05 17:58:08 crc kubenswrapper[4756]: I1205 17:58:08.864970 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67ccf57c7d-mfzfc" event={"ID":"0d5a0d90-0c0e-4589-8a15-7016159a2609","Type":"ContainerDied","Data":"ed711b623bf3eb026b08df7d1fab2d50af615ce027b7a4639c4a2e8057e2e0d4"} Dec 05 17:58:08 crc kubenswrapper[4756]: I1205 17:58:08.869751 4756 generic.go:334] "Generic (PLEG): container finished" podID="5cfcfa90-a657-426b-996d-f54055582128" containerID="0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6" exitCode=0 Dec 05 17:58:08 crc kubenswrapper[4756]: I1205 17:58:08.869811 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfnl" event={"ID":"5cfcfa90-a657-426b-996d-f54055582128","Type":"ContainerDied","Data":"0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6"} Dec 05 17:58:08 crc kubenswrapper[4756]: I1205 17:58:08.869842 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfnl" event={"ID":"5cfcfa90-a657-426b-996d-f54055582128","Type":"ContainerStarted","Data":"fb0873a091698b5d5d5807f6f0ae8ba7778decc452f33d54b9c59248a3598810"} Dec 05 17:58:08 crc kubenswrapper[4756]: I1205 17:58:08.872399 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 17:58:09 crc kubenswrapper[4756]: I1205 17:58:09.886963 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67ccf57c7d-mfzfc" 
event={"ID":"0d5a0d90-0c0e-4589-8a15-7016159a2609","Type":"ContainerStarted","Data":"60aaffba6dba3cdef43b2cf3840b46e18884b34f52a1cb3fe4bd70bff8d71404"} Dec 05 17:58:09 crc kubenswrapper[4756]: I1205 17:58:09.887238 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67ccf57c7d-mfzfc" event={"ID":"0d5a0d90-0c0e-4589-8a15-7016159a2609","Type":"ContainerStarted","Data":"f24fec2f91ffa901e9812455abf3525d47fdfe27a302a5cf8c05d29c2f4ee50d"} Dec 05 17:58:09 crc kubenswrapper[4756]: I1205 17:58:09.887333 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:58:09 crc kubenswrapper[4756]: I1205 17:58:09.888481 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:58:09 crc kubenswrapper[4756]: I1205 17:58:09.921768 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-67ccf57c7d-mfzfc" podStartSLOduration=3.003144713 podStartE2EDuration="13.921734534s" podCreationTimestamp="2025-12-05 17:57:56 +0000 UTC" firstStartedPulling="2025-12-05 17:57:57.028546478 +0000 UTC m=+6137.267219980" lastFinishedPulling="2025-12-05 17:58:07.947136289 +0000 UTC m=+6148.185809801" observedRunningTime="2025-12-05 17:58:09.920343167 +0000 UTC m=+6150.159016719" watchObservedRunningTime="2025-12-05 17:58:09.921734534 +0000 UTC m=+6150.160408046" Dec 05 17:58:10 crc kubenswrapper[4756]: I1205 17:58:10.902059 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfnl" event={"ID":"5cfcfa90-a657-426b-996d-f54055582128","Type":"ContainerStarted","Data":"d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02"} Dec 05 17:58:11 crc kubenswrapper[4756]: I1205 17:58:11.918648 4756 generic.go:334] "Generic (PLEG): container finished" podID="5cfcfa90-a657-426b-996d-f54055582128" containerID="d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02" exitCode=0 Dec 05 17:58:11 crc kubenswrapper[4756]: I1205 17:58:11.918706 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfnl" event={"ID":"5cfcfa90-a657-426b-996d-f54055582128","Type":"ContainerDied","Data":"d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02"} Dec 05 17:58:12 crc kubenswrapper[4756]: I1205 17:58:12.933813 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfnl" event={"ID":"5cfcfa90-a657-426b-996d-f54055582128","Type":"ContainerStarted","Data":"76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248"} Dec 05 17:58:12 crc kubenswrapper[4756]: I1205 17:58:12.974875 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ngfnl" podStartSLOduration=8.459977216 podStartE2EDuration="11.974858843s" podCreationTimestamp="2025-12-05 17:58:01 +0000 UTC" firstStartedPulling="2025-12-05 17:58:08.872144981 +0000 UTC m=+6149.110818493" lastFinishedPulling="2025-12-05 17:58:12.387026598 +0000 UTC m=+6152.625700120" observedRunningTime="2025-12-05 17:58:12.966378277 +0000 UTC m=+6153.205051799" watchObservedRunningTime="2025-12-05 17:58:12.974858843 +0000 UTC m=+6153.213532355" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.258587 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-7drzk" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.323732 4756 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.334681 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8ljss" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.469910 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7drzk-config-gltwn"] Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.471180 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.472667 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.482977 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7drzk-config-gltwn"] Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.605503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-scripts\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.605607 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrw6s\" (UniqueName: \"kubernetes.io/projected/efba0851-638d-4942-85cc-661c1e97f607-kube-api-access-zrw6s\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.605632 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run-ovn\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.605678 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-additional-scripts\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.605711 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-log-ovn\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.605740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.707561 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.707675 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-scripts\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.707782 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrw6s\" (UniqueName: \"kubernetes.io/projected/efba0851-638d-4942-85cc-661c1e97f607-kube-api-access-zrw6s\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.707812 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run-ovn\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.707862 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-additional-scripts\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.707897 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.707909 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-log-ovn\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.707950 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run-ovn\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.708209 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-log-ovn\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.708657 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-additional-scripts\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.710140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-scripts\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.727951 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrw6s\" (UniqueName: \"kubernetes.io/projected/efba0851-638d-4942-85cc-661c1e97f607-kube-api-access-zrw6s\") pod \"ovn-controller-7drzk-config-gltwn\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:14 crc kubenswrapper[4756]: I1205 17:58:14.791824 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:15 crc kubenswrapper[4756]: I1205 17:58:15.254082 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7drzk-config-gltwn"] Dec 05 17:58:15 crc kubenswrapper[4756]: I1205 17:58:15.988201 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk-config-gltwn" event={"ID":"efba0851-638d-4942-85cc-661c1e97f607","Type":"ContainerStarted","Data":"8563fba2b165b0edb22d1ae88b9c43eb36f9194ef479f98c7b0ae181a6e2902d"} Dec 05 17:58:15 crc kubenswrapper[4756]: I1205 17:58:15.988513 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk-config-gltwn" event={"ID":"efba0851-638d-4942-85cc-661c1e97f607","Type":"ContainerStarted","Data":"2adf50af61bd390a137abc56592eaafe31d72deb9be18d2b9297837d01b9816f"} Dec 05 17:58:16 crc kubenswrapper[4756]: I1205 17:58:16.014523 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-7drzk-config-gltwn" podStartSLOduration=2.014507564 podStartE2EDuration="2.014507564s" podCreationTimestamp="2025-12-05 17:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:58:16.005973876 +0000 UTC m=+6156.244647398" watchObservedRunningTime="2025-12-05 17:58:16.014507564 +0000 UTC m=+6156.253181076" Dec 05 17:58:16 crc kubenswrapper[4756]: I1205 17:58:16.998719 4756 generic.go:334] "Generic (PLEG): container finished" podID="efba0851-638d-4942-85cc-661c1e97f607" containerID="8563fba2b165b0edb22d1ae88b9c43eb36f9194ef479f98c7b0ae181a6e2902d" exitCode=0 Dec 05 17:58:16 crc kubenswrapper[4756]: I1205 17:58:16.998759 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk-config-gltwn" event={"ID":"efba0851-638d-4942-85cc-661c1e97f607","Type":"ContainerDied","Data":"8563fba2b165b0edb22d1ae88b9c43eb36f9194ef479f98c7b0ae181a6e2902d"} Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.523005 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.696798 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-scripts\") pod \"efba0851-638d-4942-85cc-661c1e97f607\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.697124 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run-ovn\") pod \"efba0851-638d-4942-85cc-661c1e97f607\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.697218 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-additional-scripts\") pod \"efba0851-638d-4942-85cc-661c1e97f607\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.697323 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-log-ovn\") pod \"efba0851-638d-4942-85cc-661c1e97f607\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.697314 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "efba0851-638d-4942-85cc-661c1e97f607" (UID: "efba0851-638d-4942-85cc-661c1e97f607"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.697453 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrw6s\" (UniqueName: \"kubernetes.io/projected/efba0851-638d-4942-85cc-661c1e97f607-kube-api-access-zrw6s\") pod \"efba0851-638d-4942-85cc-661c1e97f607\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.697454 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "efba0851-638d-4942-85cc-661c1e97f607" (UID: "efba0851-638d-4942-85cc-661c1e97f607"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.697636 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run\") pod \"efba0851-638d-4942-85cc-661c1e97f607\" (UID: \"efba0851-638d-4942-85cc-661c1e97f607\") " Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.697746 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run" (OuterVolumeSpecName: "var-run") pod "efba0851-638d-4942-85cc-661c1e97f607" (UID: "efba0851-638d-4942-85cc-661c1e97f607"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.698299 4756 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.698314 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "efba0851-638d-4942-85cc-661c1e97f607" (UID: "efba0851-638d-4942-85cc-661c1e97f607"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.698331 4756 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.698443 4756 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/efba0851-638d-4942-85cc-661c1e97f607-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.698508 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-scripts" (OuterVolumeSpecName: "scripts") pod "efba0851-638d-4942-85cc-661c1e97f607" (UID: "efba0851-638d-4942-85cc-661c1e97f607"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.705821 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efba0851-638d-4942-85cc-661c1e97f607-kube-api-access-zrw6s" (OuterVolumeSpecName: "kube-api-access-zrw6s") pod "efba0851-638d-4942-85cc-661c1e97f607" (UID: "efba0851-638d-4942-85cc-661c1e97f607"). InnerVolumeSpecName "kube-api-access-zrw6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.801874 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.801959 4756 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/efba0851-638d-4942-85cc-661c1e97f607-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:18 crc kubenswrapper[4756]: I1205 17:58:18.801985 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrw6s\" (UniqueName: \"kubernetes.io/projected/efba0851-638d-4942-85cc-661c1e97f607-kube-api-access-zrw6s\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.026985 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk-config-gltwn" event={"ID":"efba0851-638d-4942-85cc-661c1e97f607","Type":"ContainerDied","Data":"2adf50af61bd390a137abc56592eaafe31d72deb9be18d2b9297837d01b9816f"} Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.027035 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2adf50af61bd390a137abc56592eaafe31d72deb9be18d2b9297837d01b9816f" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.027061 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7drzk-config-gltwn" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.152255 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7drzk-config-gltwn"] Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.167505 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7drzk-config-gltwn"] Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.277470 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7drzk-config-g9k7q"] Dec 05 17:58:19 crc kubenswrapper[4756]: E1205 17:58:19.278002 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efba0851-638d-4942-85cc-661c1e97f607" containerName="ovn-config" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.278029 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="efba0851-638d-4942-85cc-661c1e97f607" containerName="ovn-config" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.278273 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="efba0851-638d-4942-85cc-661c1e97f607" containerName="ovn-config" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.279061 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.286853 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.294085 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7drzk-config-g9k7q"] Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.416225 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run-ovn\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.416476 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-log-ovn\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.416605 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-scripts\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.416681 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.416768 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-additional-scripts\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.416819 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb6x9\" (UniqueName: \"kubernetes.io/projected/ea24c142-a8f7-471a-9309-c252800da524-kube-api-access-bb6x9\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518355 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-additional-scripts\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518407 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb6x9\" (UniqueName: 
\"kubernetes.io/projected/ea24c142-a8f7-471a-9309-c252800da524-kube-api-access-bb6x9\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518471 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run-ovn\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518545 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-log-ovn\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518584 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-scripts\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518657 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518850 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518855 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run-ovn\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.518903 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-log-ovn\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.519921 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-additional-scripts\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.520777 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-scripts\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.536330 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb6x9\" (UniqueName: \"kubernetes.io/projected/ea24c142-a8f7-471a-9309-c252800da524-kube-api-access-bb6x9\") pod \"ovn-controller-7drzk-config-g9k7q\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:19 crc kubenswrapper[4756]: I1205 17:58:19.605409 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:20 crc kubenswrapper[4756]: I1205 17:58:20.063729 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efba0851-638d-4942-85cc-661c1e97f607" path="/var/lib/kubelet/pods/efba0851-638d-4942-85cc-661c1e97f607/volumes" Dec 05 17:58:20 crc kubenswrapper[4756]: I1205 17:58:20.079720 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7drzk-config-g9k7q"] Dec 05 17:58:20 crc kubenswrapper[4756]: W1205 17:58:20.087044 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea24c142_a8f7_471a_9309_c252800da524.slice/crio-91e0000f20319107f0297163934597a20b41ced7b90988b820524383c4112098 WatchSource:0}: Error finding container 91e0000f20319107f0297163934597a20b41ced7b90988b820524383c4112098: Status 404 returned error can't find the container with id 91e0000f20319107f0297163934597a20b41ced7b90988b820524383c4112098 Dec 05 17:58:21 crc kubenswrapper[4756]: I1205 17:58:21.057840 4756 generic.go:334] "Generic (PLEG): container finished" podID="ea24c142-a8f7-471a-9309-c252800da524" containerID="d863f579dc79bf3abf3955d7a31847ce960064c361e86884a86bb8e159169706" exitCode=0 Dec 05 17:58:21 crc kubenswrapper[4756]: I1205 17:58:21.058221 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk-config-g9k7q" event={"ID":"ea24c142-a8f7-471a-9309-c252800da524","Type":"ContainerDied","Data":"d863f579dc79bf3abf3955d7a31847ce960064c361e86884a86bb8e159169706"} Dec 05 17:58:21 crc kubenswrapper[4756]: I1205 17:58:21.058259 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk-config-g9k7q" event={"ID":"ea24c142-a8f7-471a-9309-c252800da524","Type":"ContainerStarted","Data":"91e0000f20319107f0297163934597a20b41ced7b90988b820524383c4112098"} Dec 05 17:58:21 crc kubenswrapper[4756]: I1205 17:58:21.723061 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:21 crc kubenswrapper[4756]: I1205 17:58:21.723589 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:21 crc kubenswrapper[4756]: I1205 17:58:21.804609 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.159356 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.226347 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-ngfnl"] Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.508998 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687298 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-scripts\") pod \"ea24c142-a8f7-471a-9309-c252800da524\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687381 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run\") pod \"ea24c142-a8f7-471a-9309-c252800da524\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687481 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-additional-scripts\") pod \"ea24c142-a8f7-471a-9309-c252800da524\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687520 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run" (OuterVolumeSpecName: "var-run") pod "ea24c142-a8f7-471a-9309-c252800da524" (UID: "ea24c142-a8f7-471a-9309-c252800da524"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687529 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-log-ovn\") pod \"ea24c142-a8f7-471a-9309-c252800da524\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687611 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run-ovn\") pod \"ea24c142-a8f7-471a-9309-c252800da524\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687560 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ea24c142-a8f7-471a-9309-c252800da524" (UID: "ea24c142-a8f7-471a-9309-c252800da524"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687665 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb6x9\" (UniqueName: \"kubernetes.io/projected/ea24c142-a8f7-471a-9309-c252800da524-kube-api-access-bb6x9\") pod \"ea24c142-a8f7-471a-9309-c252800da524\" (UID: \"ea24c142-a8f7-471a-9309-c252800da524\") " Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.687726 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ea24c142-a8f7-471a-9309-c252800da524" (UID: "ea24c142-a8f7-471a-9309-c252800da524"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.688073 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ea24c142-a8f7-471a-9309-c252800da524" (UID: "ea24c142-a8f7-471a-9309-c252800da524"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.688274 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-scripts" (OuterVolumeSpecName: "scripts") pod "ea24c142-a8f7-471a-9309-c252800da524" (UID: "ea24c142-a8f7-471a-9309-c252800da524"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.688915 4756 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.688935 4756 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.688945 4756 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.688954 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea24c142-a8f7-471a-9309-c252800da524-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.688963 4756 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea24c142-a8f7-471a-9309-c252800da524-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.692382 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea24c142-a8f7-471a-9309-c252800da524-kube-api-access-bb6x9" (OuterVolumeSpecName: "kube-api-access-bb6x9") pod "ea24c142-a8f7-471a-9309-c252800da524" (UID: "ea24c142-a8f7-471a-9309-c252800da524"). InnerVolumeSpecName "kube-api-access-bb6x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:58:22 crc kubenswrapper[4756]: I1205 17:58:22.790237 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb6x9\" (UniqueName: \"kubernetes.io/projected/ea24c142-a8f7-471a-9309-c252800da524-kube-api-access-bb6x9\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:23 crc kubenswrapper[4756]: I1205 17:58:23.081368 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7drzk-config-g9k7q" Dec 05 17:58:23 crc kubenswrapper[4756]: I1205 17:58:23.081377 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7drzk-config-g9k7q" event={"ID":"ea24c142-a8f7-471a-9309-c252800da524","Type":"ContainerDied","Data":"91e0000f20319107f0297163934597a20b41ced7b90988b820524383c4112098"} Dec 05 17:58:23 crc kubenswrapper[4756]: I1205 17:58:23.081486 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91e0000f20319107f0297163934597a20b41ced7b90988b820524383c4112098" Dec 05 17:58:23 crc kubenswrapper[4756]: I1205 17:58:23.613080 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7drzk-config-g9k7q"] Dec 05 17:58:23 crc kubenswrapper[4756]: I1205 17:58:23.622045 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7drzk-config-g9k7q"] Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.060928 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea24c142-a8f7-471a-9309-c252800da524" path="/var/lib/kubelet/pods/ea24c142-a8f7-471a-9309-c252800da524/volumes" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.089164 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ngfnl" podUID="5cfcfa90-a657-426b-996d-f54055582128" containerName="registry-server" containerID="cri-o://76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248" gracePeriod=2 Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.671323 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.691838 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-nwvnw"] Dec 05 17:58:24 crc kubenswrapper[4756]: E1205 17:58:24.692337 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea24c142-a8f7-471a-9309-c252800da524" containerName="ovn-config" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.692362 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea24c142-a8f7-471a-9309-c252800da524" containerName="ovn-config" Dec 05 17:58:24 crc kubenswrapper[4756]: E1205 17:58:24.692383 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfcfa90-a657-426b-996d-f54055582128" containerName="registry-server" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.692391 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfcfa90-a657-426b-996d-f54055582128" containerName="registry-server" Dec 05 17:58:24 crc kubenswrapper[4756]: E1205 17:58:24.692410 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfcfa90-a657-426b-996d-f54055582128" containerName="extract-utilities" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.692435 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfcfa90-a657-426b-996d-f54055582128" containerName="extract-utilities" Dec 05 17:58:24 crc kubenswrapper[4756]: E1205 17:58:24.692458 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfcfa90-a657-426b-996d-f54055582128" containerName="extract-content" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.692466 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfcfa90-a657-426b-996d-f54055582128" containerName="extract-content" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.692741 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfcfa90-a657-426b-996d-f54055582128" containerName="registry-server"
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.693850 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-nwvnw"
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.696727 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data"
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.698317 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map"
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.699010 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts"
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.730597 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-nwvnw"]
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.833087 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv9rw\" (UniqueName: \"kubernetes.io/projected/5cfcfa90-a657-426b-996d-f54055582128-kube-api-access-dv9rw\") pod \"5cfcfa90-a657-426b-996d-f54055582128\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") "
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.833183 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-catalog-content\") pod \"5cfcfa90-a657-426b-996d-f54055582128\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") "
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.836551 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-utilities\") pod \"5cfcfa90-a657-426b-996d-f54055582128\" (UID: \"5cfcfa90-a657-426b-996d-f54055582128\") "
Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.837808 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-utilities" (OuterVolumeSpecName: "utilities") pod "5cfcfa90-a657-426b-996d-f54055582128" (UID: "5cfcfa90-a657-426b-996d-f54055582128"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.838107 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc062ab-b968-48e4-b201-b320bf7af1c2-config-data\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.838717 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc062ab-b968-48e4-b201-b320bf7af1c2-scripts\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.838896 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4bc062ab-b968-48e4-b201-b320bf7af1c2-config-data-merged\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.839083 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4bc062ab-b968-48e4-b201-b320bf7af1c2-hm-ports\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.839360 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.847883 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfcfa90-a657-426b-996d-f54055582128-kube-api-access-dv9rw" (OuterVolumeSpecName: "kube-api-access-dv9rw") pod "5cfcfa90-a657-426b-996d-f54055582128" (UID: "5cfcfa90-a657-426b-996d-f54055582128"). InnerVolumeSpecName "kube-api-access-dv9rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.890162 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cfcfa90-a657-426b-996d-f54055582128" (UID: "5cfcfa90-a657-426b-996d-f54055582128"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.940855 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4bc062ab-b968-48e4-b201-b320bf7af1c2-config-data-merged\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.940939 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4bc062ab-b968-48e4-b201-b320bf7af1c2-hm-ports\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.940980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc062ab-b968-48e4-b201-b320bf7af1c2-config-data\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.941080 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc062ab-b968-48e4-b201-b320bf7af1c2-scripts\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.941164 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfcfa90-a657-426b-996d-f54055582128-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.941176 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv9rw\" (UniqueName: \"kubernetes.io/projected/5cfcfa90-a657-426b-996d-f54055582128-kube-api-access-dv9rw\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.941573 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4bc062ab-b968-48e4-b201-b320bf7af1c2-config-data-merged\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.942344 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4bc062ab-b968-48e4-b201-b320bf7af1c2-hm-ports\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.945426 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc062ab-b968-48e4-b201-b320bf7af1c2-scripts\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:24 crc kubenswrapper[4756]: I1205 17:58:24.953401 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc062ab-b968-48e4-b201-b320bf7af1c2-config-data\") pod \"octavia-rsyslog-nwvnw\" (UID: \"4bc062ab-b968-48e4-b201-b320bf7af1c2\") " pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:25 crc 
kubenswrapper[4756]: I1205 17:58:25.010363 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.102061 4756 generic.go:334] "Generic (PLEG): container finished" podID="5cfcfa90-a657-426b-996d-f54055582128" containerID="76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248" exitCode=0 Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.102109 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngfnl" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.102119 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfnl" event={"ID":"5cfcfa90-a657-426b-996d-f54055582128","Type":"ContainerDied","Data":"76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248"} Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.102172 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngfnl" event={"ID":"5cfcfa90-a657-426b-996d-f54055582128","Type":"ContainerDied","Data":"fb0873a091698b5d5d5807f6f0ae8ba7778decc452f33d54b9c59248a3598810"} Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.102207 4756 scope.go:117] "RemoveContainer" containerID="76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.145295 4756 scope.go:117] "RemoveContainer" containerID="d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.151455 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngfnl"] Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.161727 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ngfnl"] Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.179355 4756 scope.go:117] "RemoveContainer" containerID="0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.213610 4756 scope.go:117] "RemoveContainer" containerID="76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248" Dec 05 17:58:25 crc kubenswrapper[4756]: E1205 17:58:25.214577 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248\": container with ID starting with 76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248 not found: ID does not exist" containerID="76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.214623 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248"} err="failed to get container status \"76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248\": rpc error: code = NotFound desc = could not find container \"76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248\": container with ID starting with 76fa21401c5962b20873cfbc68921e6d4e1a576bc8eeed278363931a02da2248 not found: ID does not exist" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.214651 4756 scope.go:117] "RemoveContainer" containerID="d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02" 
Dec 05 17:58:25 crc kubenswrapper[4756]: E1205 17:58:25.215877 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02\": container with ID starting with d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02 not found: ID does not exist" containerID="d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.215902 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02"} err="failed to get container status \"d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02\": rpc error: code = NotFound desc = could not find container \"d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02\": container with ID starting with d68b32bdc6c2807af94ab0e15933c1f776281fab051998a4a97e63de0f1d2f02 not found: ID does not exist" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.215917 4756 scope.go:117] "RemoveContainer" containerID="0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6" Dec 05 17:58:25 crc kubenswrapper[4756]: E1205 17:58:25.216657 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6\": container with ID starting with 0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6 not found: ID does not exist" containerID="0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.216681 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6"} err="failed to get container status \"0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6\": rpc error: code = NotFound desc = could not find container \"0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6\": container with ID starting with 0573b37b2fe17aab085b116764675a847d1e4e2a1409c1c595c97ae383873fc6 not found: ID does not exist" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.464728 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-44z2z"] Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.466294 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.468662 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.474201 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-44z2z"] Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.602513 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-nwvnw"] Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.656201 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/415eb61a-0592-4e80-91f3-3564eee72d78-amphora-image\") pod \"octavia-image-upload-56c9f55b99-44z2z\" (UID: \"415eb61a-0592-4e80-91f3-3564eee72d78\") " pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.656320 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/415eb61a-0592-4e80-91f3-3564eee72d78-httpd-config\") pod \"octavia-image-upload-56c9f55b99-44z2z\" (UID: \"415eb61a-0592-4e80-91f3-3564eee72d78\") " pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.758301 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/415eb61a-0592-4e80-91f3-3564eee72d78-amphora-image\") pod \"octavia-image-upload-56c9f55b99-44z2z\" (UID: \"415eb61a-0592-4e80-91f3-3564eee72d78\") " pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.758400 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/415eb61a-0592-4e80-91f3-3564eee72d78-httpd-config\") pod \"octavia-image-upload-56c9f55b99-44z2z\" (UID: \"415eb61a-0592-4e80-91f3-3564eee72d78\") " pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.759091 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/415eb61a-0592-4e80-91f3-3564eee72d78-amphora-image\") pod \"octavia-image-upload-56c9f55b99-44z2z\" (UID: \"415eb61a-0592-4e80-91f3-3564eee72d78\") " pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.770255 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/415eb61a-0592-4e80-91f3-3564eee72d78-httpd-config\") pod \"octavia-image-upload-56c9f55b99-44z2z\" (UID: \"415eb61a-0592-4e80-91f3-3564eee72d78\") " pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:58:25 crc kubenswrapper[4756]: I1205 17:58:25.786037 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.060355 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfcfa90-a657-426b-996d-f54055582128" path="/var/lib/kubelet/pods/5cfcfa90-a657-426b-996d-f54055582128/volumes" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.111989 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nwvnw" event={"ID":"4bc062ab-b968-48e4-b201-b320bf7af1c2","Type":"ContainerStarted","Data":"c97d0bb57d3d81d32e5fdb19f4aa58f87983ac35b62d32241b529170ab7bb2e1"} Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.292639 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-44z2z"] Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.639689 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-854956d6cd-q45tl"] Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.643275 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.651969 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.653209 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.686661 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-854956d6cd-q45tl"] Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.785410 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-internal-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.785534 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-config-data\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.785571 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bbec8eb6-ea30-43de-b333-4a650978d5dd-config-data-merged\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.785596 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-ovndb-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.785623 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-public-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.785670 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-scripts\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.785745 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-combined-ca-bundle\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.785771 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/bbec8eb6-ea30-43de-b333-4a650978d5dd-octavia-run\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.887803 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-scripts\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.888001 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-combined-ca-bundle\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.888096 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/bbec8eb6-ea30-43de-b333-4a650978d5dd-octavia-run\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.888221 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-internal-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.888328 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-config-data\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.888815 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/bbec8eb6-ea30-43de-b333-4a650978d5dd-config-data-merged\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.888894 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-ovndb-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.888963 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-public-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.889231 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bbec8eb6-ea30-43de-b333-4a650978d5dd-config-data-merged\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.888780 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/bbec8eb6-ea30-43de-b333-4a650978d5dd-octavia-run\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.893914 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-public-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.894891 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-ovndb-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.894890 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-combined-ca-bundle\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.896474 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-scripts\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.897301 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-config-data\") pod \"octavia-api-854956d6cd-q45tl\" (UID: 
\"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.897637 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbec8eb6-ea30-43de-b333-4a650978d5dd-internal-tls-certs\") pod \"octavia-api-854956d6cd-q45tl\" (UID: \"bbec8eb6-ea30-43de-b333-4a650978d5dd\") " pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:26 crc kubenswrapper[4756]: I1205 17:58:26.982665 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:27 crc kubenswrapper[4756]: I1205 17:58:27.122913 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" event={"ID":"415eb61a-0592-4e80-91f3-3564eee72d78","Type":"ContainerStarted","Data":"ec114634a19717c4065070aaaaa1d4c2d505da2d2ffa68a8b59d03b91fbb1551"} Dec 05 17:58:27 crc kubenswrapper[4756]: I1205 17:58:27.918910 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-854956d6cd-q45tl"] Dec 05 17:58:28 crc kubenswrapper[4756]: W1205 17:58:28.207065 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbec8eb6_ea30_43de_b333_4a650978d5dd.slice/crio-543e64a77edeb0bf1cf02e78ebeb06b23378d183c4e59a4817a0b0fd8e4eaffa WatchSource:0}: Error finding container 543e64a77edeb0bf1cf02e78ebeb06b23378d183c4e59a4817a0b0fd8e4eaffa: Status 404 returned error can't find the container with id 543e64a77edeb0bf1cf02e78ebeb06b23378d183c4e59a4817a0b0fd8e4eaffa Dec 05 17:58:29 crc kubenswrapper[4756]: I1205 17:58:29.151714 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854956d6cd-q45tl" event={"ID":"bbec8eb6-ea30-43de-b333-4a650978d5dd","Type":"ContainerStarted","Data":"543e64a77edeb0bf1cf02e78ebeb06b23378d183c4e59a4817a0b0fd8e4eaffa"} Dec 05 17:58:30 crc kubenswrapper[4756]: I1205 17:58:30.165128 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854956d6cd-q45tl" event={"ID":"bbec8eb6-ea30-43de-b333-4a650978d5dd","Type":"ContainerStarted","Data":"ffb1deb39ad90f2ed7126e84d52e726ea30852f2de8da4f2ee055c108ff53fd2"} Dec 05 17:58:30 crc kubenswrapper[4756]: I1205 17:58:30.171854 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nwvnw" event={"ID":"4bc062ab-b968-48e4-b201-b320bf7af1c2","Type":"ContainerStarted","Data":"1bf0d15facfe83d7b2bf4390511f0afb3a808d43a82a901009d390859b874287"} Dec 05 17:58:31 crc kubenswrapper[4756]: I1205 17:58:31.127448 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:58:31 crc kubenswrapper[4756]: I1205 17:58:31.160523 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:58:31 crc kubenswrapper[4756]: I1205 17:58:31.199613 4756 generic.go:334] "Generic (PLEG): container finished" podID="bbec8eb6-ea30-43de-b333-4a650978d5dd" containerID="ffb1deb39ad90f2ed7126e84d52e726ea30852f2de8da4f2ee055c108ff53fd2" exitCode=0 Dec 05 17:58:31 crc kubenswrapper[4756]: I1205 17:58:31.199712 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854956d6cd-q45tl" 
event={"ID":"bbec8eb6-ea30-43de-b333-4a650978d5dd","Type":"ContainerDied","Data":"ffb1deb39ad90f2ed7126e84d52e726ea30852f2de8da4f2ee055c108ff53fd2"} Dec 05 17:58:32 crc kubenswrapper[4756]: I1205 17:58:32.217808 4756 generic.go:334] "Generic (PLEG): container finished" podID="4bc062ab-b968-48e4-b201-b320bf7af1c2" containerID="1bf0d15facfe83d7b2bf4390511f0afb3a808d43a82a901009d390859b874287" exitCode=0 Dec 05 17:58:32 crc kubenswrapper[4756]: I1205 17:58:32.217965 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nwvnw" event={"ID":"4bc062ab-b968-48e4-b201-b320bf7af1c2","Type":"ContainerDied","Data":"1bf0d15facfe83d7b2bf4390511f0afb3a808d43a82a901009d390859b874287"} Dec 05 17:58:32 crc kubenswrapper[4756]: I1205 17:58:32.227555 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854956d6cd-q45tl" event={"ID":"bbec8eb6-ea30-43de-b333-4a650978d5dd","Type":"ContainerStarted","Data":"cf848e0c1d783d3fb4ac369bfc2f7a7de8ceb4180205d360d15ce496b7d2e9ac"} Dec 05 17:58:32 crc kubenswrapper[4756]: I1205 17:58:32.227638 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-854956d6cd-q45tl" event={"ID":"bbec8eb6-ea30-43de-b333-4a650978d5dd","Type":"ContainerStarted","Data":"854cd9b09de17db76015063768ca1637ec944adc9887c35f9536f9136f32d747"} Dec 05 17:58:32 crc kubenswrapper[4756]: I1205 17:58:32.229470 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:32 crc kubenswrapper[4756]: I1205 17:58:32.229563 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:32 crc kubenswrapper[4756]: I1205 17:58:32.277662 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-854956d6cd-q45tl" podStartSLOduration=6.277635457 podStartE2EDuration="6.277635457s" podCreationTimestamp="2025-12-05 17:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:58:32.265003821 +0000 UTC m=+6172.503677353" watchObservedRunningTime="2025-12-05 17:58:32.277635457 +0000 UTC m=+6172.516308999" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.178437 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-cbqdw"] Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.181101 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.183586 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.186371 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-cbqdw"] Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.325868 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-combined-ca-bundle\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.325960 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-config-data\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.327072 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afb86990-349d-45b0-83a0-866c6332f6b0-config-data-merged\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.327184 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-scripts\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.428188 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-config-data\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.428531 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afb86990-349d-45b0-83a0-866c6332f6b0-config-data-merged\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.428590 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-scripts\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.428668 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-combined-ca-bundle\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.429242 4756 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afb86990-349d-45b0-83a0-866c6332f6b0-config-data-merged\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.433341 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-combined-ca-bundle\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.444389 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-config-data\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.445346 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-scripts\") pod \"octavia-db-sync-cbqdw\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:37 crc kubenswrapper[4756]: I1205 17:58:37.506434 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:40 crc kubenswrapper[4756]: I1205 17:58:40.228517 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-cbqdw"] Dec 05 17:58:40 crc kubenswrapper[4756]: I1205 17:58:40.322128 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-cbqdw" event={"ID":"afb86990-349d-45b0-83a0-866c6332f6b0","Type":"ContainerStarted","Data":"9d905ef8acffb651454bb2d1b281f31a2552cdd9d125ba35ac627093f659ceff"} Dec 05 17:58:40 crc kubenswrapper[4756]: I1205 17:58:40.324380 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nwvnw" event={"ID":"4bc062ab-b968-48e4-b201-b320bf7af1c2","Type":"ContainerStarted","Data":"cb44d3ba01001c4376a3fb5ef3fb11497f5102859e9f65a0f2b03a9678b223bd"} Dec 05 17:58:40 crc kubenswrapper[4756]: I1205 17:58:40.324965 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:58:40 crc kubenswrapper[4756]: I1205 17:58:40.327961 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" event={"ID":"415eb61a-0592-4e80-91f3-3564eee72d78","Type":"ContainerStarted","Data":"9c554b92351ab4f2b3edf692ba55affc575d5bc6cb1888551b5702ee14704bee"} Dec 05 17:58:40 crc kubenswrapper[4756]: I1205 17:58:40.372959 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-nwvnw" podStartSLOduration=2.256885683 podStartE2EDuration="16.372932498s" podCreationTimestamp="2025-12-05 17:58:24 +0000 UTC" firstStartedPulling="2025-12-05 17:58:25.60751828 +0000 UTC m=+6165.846191792" lastFinishedPulling="2025-12-05 17:58:39.723565075 +0000 UTC m=+6179.962238607" observedRunningTime="2025-12-05 17:58:40.350172682 +0000 UTC m=+6180.588846204" watchObservedRunningTime="2025-12-05 17:58:40.372932498 +0000 UTC m=+6180.611606020" Dec 05 17:58:43 crc kubenswrapper[4756]: I1205 17:58:43.384358 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="afb86990-349d-45b0-83a0-866c6332f6b0" containerID="9b785dab74a5334616b6f862f1c6192d2b90225ada895ee551e113ca02e6374c" exitCode=0 Dec 05 17:58:43 crc kubenswrapper[4756]: I1205 17:58:43.384406 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-cbqdw" event={"ID":"afb86990-349d-45b0-83a0-866c6332f6b0","Type":"ContainerDied","Data":"9b785dab74a5334616b6f862f1c6192d2b90225ada895ee551e113ca02e6374c"} Dec 05 17:58:44 crc kubenswrapper[4756]: I1205 17:58:44.396569 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-cbqdw" event={"ID":"afb86990-349d-45b0-83a0-866c6332f6b0","Type":"ContainerStarted","Data":"2d951ae761902714016151d66a905a1495077017502c3430f0831f7b18dfe92c"} Dec 05 17:58:44 crc kubenswrapper[4756]: I1205 17:58:44.435362 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-cbqdw" podStartSLOduration=7.435339366 podStartE2EDuration="7.435339366s" podCreationTimestamp="2025-12-05 17:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:58:44.420411079 +0000 UTC m=+6184.659084581" watchObservedRunningTime="2025-12-05 17:58:44.435339366 +0000 UTC m=+6184.674012878" Dec 05 17:58:45 crc kubenswrapper[4756]: I1205 17:58:45.410199 4756 generic.go:334] "Generic (PLEG): container finished" podID="415eb61a-0592-4e80-91f3-3564eee72d78" containerID="9c554b92351ab4f2b3edf692ba55affc575d5bc6cb1888551b5702ee14704bee" exitCode=0 Dec 05 17:58:45 crc kubenswrapper[4756]: I1205 17:58:45.410260 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" event={"ID":"415eb61a-0592-4e80-91f3-3564eee72d78","Type":"ContainerDied","Data":"9c554b92351ab4f2b3edf692ba55affc575d5bc6cb1888551b5702ee14704bee"} Dec 05 17:58:45 crc kubenswrapper[4756]: I1205 17:58:45.771179 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:45 crc kubenswrapper[4756]: I1205 17:58:45.862037 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-854956d6cd-q45tl" Dec 05 17:58:45 crc kubenswrapper[4756]: I1205 17:58:45.960480 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-67ccf57c7d-mfzfc"] Dec 05 17:58:45 crc kubenswrapper[4756]: I1205 17:58:45.960748 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-67ccf57c7d-mfzfc" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="octavia-api" containerID="cri-o://f24fec2f91ffa901e9812455abf3525d47fdfe27a302a5cf8c05d29c2f4ee50d" gracePeriod=30 Dec 05 17:58:45 crc kubenswrapper[4756]: I1205 17:58:45.960877 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-67ccf57c7d-mfzfc" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="octavia-api-provider-agent" containerID="cri-o://60aaffba6dba3cdef43b2cf3840b46e18884b34f52a1cb3fe4bd70bff8d71404" gracePeriod=30 Dec 05 17:58:46 crc kubenswrapper[4756]: I1205 17:58:46.424216 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" event={"ID":"415eb61a-0592-4e80-91f3-3564eee72d78","Type":"ContainerStarted","Data":"0da5d3bf22c2db9d97173b74dff5b59c93ac8555b85eb8c8f363fc5497d268a1"} Dec 05 17:58:46 crc kubenswrapper[4756]: I1205 17:58:46.455663 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" podStartSLOduration=7.925742307 podStartE2EDuration="21.455641301s" podCreationTimestamp="2025-12-05 17:58:25 +0000 UTC" firstStartedPulling="2025-12-05 17:58:26.309988418 +0000 UTC m=+6166.548661930" lastFinishedPulling="2025-12-05 17:58:39.839887402 +0000 UTC m=+6180.078560924" observedRunningTime="2025-12-05 17:58:46.439717747 +0000 UTC m=+6186.678391259" watchObservedRunningTime="2025-12-05 17:58:46.455641301 +0000 UTC m=+6186.694314823" Dec 05 17:58:47 crc kubenswrapper[4756]: I1205 17:58:47.439583 4756 generic.go:334] "Generic (PLEG): container finished" podID="afb86990-349d-45b0-83a0-866c6332f6b0" containerID="2d951ae761902714016151d66a905a1495077017502c3430f0831f7b18dfe92c" exitCode=0 Dec 05 17:58:47 crc kubenswrapper[4756]: I1205 17:58:47.439696 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-cbqdw" event={"ID":"afb86990-349d-45b0-83a0-866c6332f6b0","Type":"ContainerDied","Data":"2d951ae761902714016151d66a905a1495077017502c3430f0831f7b18dfe92c"} Dec 05 17:58:47 crc kubenswrapper[4756]: I1205 17:58:47.442799 4756 generic.go:334] "Generic (PLEG): container finished" podID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerID="60aaffba6dba3cdef43b2cf3840b46e18884b34f52a1cb3fe4bd70bff8d71404" exitCode=0 Dec 05 17:58:47 crc kubenswrapper[4756]: I1205 17:58:47.442844 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67ccf57c7d-mfzfc" event={"ID":"0d5a0d90-0c0e-4589-8a15-7016159a2609","Type":"ContainerDied","Data":"60aaffba6dba3cdef43b2cf3840b46e18884b34f52a1cb3fe4bd70bff8d71404"} Dec 05 17:58:48 crc kubenswrapper[4756]: I1205 17:58:48.868342 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:48 crc kubenswrapper[4756]: I1205 17:58:48.969127 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-combined-ca-bundle\") pod \"afb86990-349d-45b0-83a0-866c6332f6b0\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " Dec 05 17:58:48 crc kubenswrapper[4756]: I1205 17:58:48.969282 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afb86990-349d-45b0-83a0-866c6332f6b0-config-data-merged\") pod \"afb86990-349d-45b0-83a0-866c6332f6b0\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " Dec 05 17:58:48 crc kubenswrapper[4756]: I1205 17:58:48.969376 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-config-data\") pod \"afb86990-349d-45b0-83a0-866c6332f6b0\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " Dec 05 17:58:48 crc kubenswrapper[4756]: I1205 17:58:48.969409 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-scripts\") pod \"afb86990-349d-45b0-83a0-866c6332f6b0\" (UID: \"afb86990-349d-45b0-83a0-866c6332f6b0\") " Dec 05 17:58:48 crc kubenswrapper[4756]: I1205 17:58:48.991700 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-scripts" (OuterVolumeSpecName: "scripts") pod "afb86990-349d-45b0-83a0-866c6332f6b0" (UID: "afb86990-349d-45b0-83a0-866c6332f6b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:58:48 crc kubenswrapper[4756]: I1205 17:58:48.992100 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-config-data" (OuterVolumeSpecName: "config-data") pod "afb86990-349d-45b0-83a0-866c6332f6b0" (UID: "afb86990-349d-45b0-83a0-866c6332f6b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.018522 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afb86990-349d-45b0-83a0-866c6332f6b0" (UID: "afb86990-349d-45b0-83a0-866c6332f6b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.023995 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb86990-349d-45b0-83a0-866c6332f6b0-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "afb86990-349d-45b0-83a0-866c6332f6b0" (UID: "afb86990-349d-45b0-83a0-866c6332f6b0"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.072219 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afb86990-349d-45b0-83a0-866c6332f6b0-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.072251 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.072259 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.072268 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb86990-349d-45b0-83a0-866c6332f6b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.471556 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-cbqdw" event={"ID":"afb86990-349d-45b0-83a0-866c6332f6b0","Type":"ContainerDied","Data":"9d905ef8acffb651454bb2d1b281f31a2552cdd9d125ba35ac627093f659ceff"} Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.471599 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d905ef8acffb651454bb2d1b281f31a2552cdd9d125ba35ac627093f659ceff" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.471638 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-cbqdw" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.476475 4756 generic.go:334] "Generic (PLEG): container finished" podID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerID="f24fec2f91ffa901e9812455abf3525d47fdfe27a302a5cf8c05d29c2f4ee50d" exitCode=0 Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.476508 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67ccf57c7d-mfzfc" event={"ID":"0d5a0d90-0c0e-4589-8a15-7016159a2609","Type":"ContainerDied","Data":"f24fec2f91ffa901e9812455abf3525d47fdfe27a302a5cf8c05d29c2f4ee50d"} Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.499913 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.579852 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-scripts\") pod \"0d5a0d90-0c0e-4589-8a15-7016159a2609\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.579898 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-octavia-run\") pod \"0d5a0d90-0c0e-4589-8a15-7016159a2609\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.579940 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-ovndb-tls-certs\") pod \"0d5a0d90-0c0e-4589-8a15-7016159a2609\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.579990 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data\") pod \"0d5a0d90-0c0e-4589-8a15-7016159a2609\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.580081 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-combined-ca-bundle\") pod \"0d5a0d90-0c0e-4589-8a15-7016159a2609\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.580165 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data-merged\") pod \"0d5a0d90-0c0e-4589-8a15-7016159a2609\" (UID: \"0d5a0d90-0c0e-4589-8a15-7016159a2609\") " Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.581537 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "0d5a0d90-0c0e-4589-8a15-7016159a2609" (UID: "0d5a0d90-0c0e-4589-8a15-7016159a2609"). InnerVolumeSpecName "octavia-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.582478 4756 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-octavia-run\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.583600 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-scripts" (OuterVolumeSpecName: "scripts") pod "0d5a0d90-0c0e-4589-8a15-7016159a2609" (UID: "0d5a0d90-0c0e-4589-8a15-7016159a2609"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.594238 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data" (OuterVolumeSpecName: "config-data") pod "0d5a0d90-0c0e-4589-8a15-7016159a2609" (UID: "0d5a0d90-0c0e-4589-8a15-7016159a2609"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.625950 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "0d5a0d90-0c0e-4589-8a15-7016159a2609" (UID: "0d5a0d90-0c0e-4589-8a15-7016159a2609"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.629222 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d5a0d90-0c0e-4589-8a15-7016159a2609" (UID: "0d5a0d90-0c0e-4589-8a15-7016159a2609"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.684607 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.684648 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.684660 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d5a0d90-0c0e-4589-8a15-7016159a2609-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.684670 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.725862 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0d5a0d90-0c0e-4589-8a15-7016159a2609" (UID: "0d5a0d90-0c0e-4589-8a15-7016159a2609"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:58:49 crc kubenswrapper[4756]: I1205 17:58:49.786663 4756 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a0d90-0c0e-4589-8a15-7016159a2609-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 17:58:50 crc kubenswrapper[4756]: I1205 17:58:50.495887 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-67ccf57c7d-mfzfc" event={"ID":"0d5a0d90-0c0e-4589-8a15-7016159a2609","Type":"ContainerDied","Data":"8dd5d93bb4f59b88f25ac5ea50c0cc4b86732d18eeb7e08ebc8c60afd3c7f2ba"} Dec 05 17:58:50 crc kubenswrapper[4756]: I1205 17:58:50.495938 4756 scope.go:117] "RemoveContainer" containerID="60aaffba6dba3cdef43b2cf3840b46e18884b34f52a1cb3fe4bd70bff8d71404" Dec 05 17:58:50 crc kubenswrapper[4756]: I1205 17:58:50.496162 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-67ccf57c7d-mfzfc" Dec 05 17:58:50 crc kubenswrapper[4756]: I1205 17:58:50.547000 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-67ccf57c7d-mfzfc"] Dec 05 17:58:50 crc kubenswrapper[4756]: I1205 17:58:50.551188 4756 scope.go:117] "RemoveContainer" containerID="f24fec2f91ffa901e9812455abf3525d47fdfe27a302a5cf8c05d29c2f4ee50d" Dec 05 17:58:50 crc kubenswrapper[4756]: I1205 17:58:50.559707 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-67ccf57c7d-mfzfc"] Dec 05 17:58:50 crc kubenswrapper[4756]: I1205 17:58:50.598749 4756 scope.go:117] "RemoveContainer" containerID="ed711b623bf3eb026b08df7d1fab2d50af615ce027b7a4639c4a2e8057e2e0d4" Dec 05 17:58:52 crc kubenswrapper[4756]: I1205 17:58:52.070245 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" path="/var/lib/kubelet/pods/0d5a0d90-0c0e-4589-8a15-7016159a2609/volumes" Dec 05 17:58:55 crc kubenswrapper[4756]: I1205 17:58:55.061375 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-nwvnw" Dec 05 17:59:08 crc kubenswrapper[4756]: I1205 17:59:08.234097 4756 scope.go:117] "RemoveContainer" containerID="cb910ea1a1bf91dfc270a15cd5f56e8d86acaf9a494277eae33584814685e3bc" Dec 05 17:59:08 crc kubenswrapper[4756]: I1205 17:59:08.265251 4756 scope.go:117] "RemoveContainer" containerID="67951d347a590fac48578a9eb05497c4547a373150f59afb5208b7de12207c22" Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.171494 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-44z2z"] Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.171993 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" podUID="415eb61a-0592-4e80-91f3-3564eee72d78" containerName="octavia-amphora-httpd" containerID="cri-o://0da5d3bf22c2db9d97173b74dff5b59c93ac8555b85eb8c8f363fc5497d268a1" gracePeriod=30 Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.718140 4756 generic.go:334] "Generic (PLEG): container finished" podID="415eb61a-0592-4e80-91f3-3564eee72d78" containerID="0da5d3bf22c2db9d97173b74dff5b59c93ac8555b85eb8c8f363fc5497d268a1" exitCode=0 Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.718434 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" 
event={"ID":"415eb61a-0592-4e80-91f3-3564eee72d78","Type":"ContainerDied","Data":"0da5d3bf22c2db9d97173b74dff5b59c93ac8555b85eb8c8f363fc5497d268a1"} Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.718460 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" event={"ID":"415eb61a-0592-4e80-91f3-3564eee72d78","Type":"ContainerDied","Data":"ec114634a19717c4065070aaaaa1d4c2d505da2d2ffa68a8b59d03b91fbb1551"} Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.718472 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec114634a19717c4065070aaaaa1d4c2d505da2d2ffa68a8b59d03b91fbb1551" Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.751725 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.843201 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/415eb61a-0592-4e80-91f3-3564eee72d78-httpd-config\") pod \"415eb61a-0592-4e80-91f3-3564eee72d78\" (UID: \"415eb61a-0592-4e80-91f3-3564eee72d78\") " Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.843250 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/415eb61a-0592-4e80-91f3-3564eee72d78-amphora-image\") pod \"415eb61a-0592-4e80-91f3-3564eee72d78\" (UID: \"415eb61a-0592-4e80-91f3-3564eee72d78\") " Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.883613 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415eb61a-0592-4e80-91f3-3564eee72d78-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "415eb61a-0592-4e80-91f3-3564eee72d78" (UID: "415eb61a-0592-4e80-91f3-3564eee72d78"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.945550 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/415eb61a-0592-4e80-91f3-3564eee72d78-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 17:59:10 crc kubenswrapper[4756]: I1205 17:59:10.948567 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/415eb61a-0592-4e80-91f3-3564eee72d78-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "415eb61a-0592-4e80-91f3-3564eee72d78" (UID: "415eb61a-0592-4e80-91f3-3564eee72d78"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 17:59:11 crc kubenswrapper[4756]: I1205 17:59:11.047671 4756 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/415eb61a-0592-4e80-91f3-3564eee72d78-amphora-image\") on node \"crc\" DevicePath \"\"" Dec 05 17:59:11 crc kubenswrapper[4756]: I1205 17:59:11.727506 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-44z2z" Dec 05 17:59:11 crc kubenswrapper[4756]: I1205 17:59:11.806962 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-44z2z"] Dec 05 17:59:11 crc kubenswrapper[4756]: I1205 17:59:11.827910 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-44z2z"] Dec 05 17:59:12 crc kubenswrapper[4756]: I1205 17:59:12.066595 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415eb61a-0592-4e80-91f3-3564eee72d78" path="/var/lib/kubelet/pods/415eb61a-0592-4e80-91f3-3564eee72d78/volumes" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.589100 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-l4hvk"] Dec 05 17:59:15 crc kubenswrapper[4756]: E1205 17:59:15.589914 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb86990-349d-45b0-83a0-866c6332f6b0" containerName="init" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.589925 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb86990-349d-45b0-83a0-866c6332f6b0" containerName="init" Dec 05 17:59:15 crc kubenswrapper[4756]: E1205 17:59:15.589937 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="octavia-api-provider-agent" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.589942 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="octavia-api-provider-agent" Dec 05 17:59:15 crc kubenswrapper[4756]: E1205 17:59:15.589954 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="octavia-api" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.589960 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="octavia-api" Dec 05 17:59:15 crc kubenswrapper[4756]: E1205 17:59:15.589971 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb86990-349d-45b0-83a0-866c6332f6b0" containerName="octavia-db-sync" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.589976 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb86990-349d-45b0-83a0-866c6332f6b0" containerName="octavia-db-sync" Dec 05 17:59:15 crc kubenswrapper[4756]: E1205 17:59:15.589992 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415eb61a-0592-4e80-91f3-3564eee72d78" containerName="init" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.589998 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="415eb61a-0592-4e80-91f3-3564eee72d78" containerName="init" Dec 05 17:59:15 crc kubenswrapper[4756]: E1205 17:59:15.590008 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="init" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.590014 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="init" Dec 05 17:59:15 crc kubenswrapper[4756]: E1205 17:59:15.590022 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415eb61a-0592-4e80-91f3-3564eee72d78" containerName="octavia-amphora-httpd" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.590027 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="415eb61a-0592-4e80-91f3-3564eee72d78" 
containerName="octavia-amphora-httpd" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.590197 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb86990-349d-45b0-83a0-866c6332f6b0" containerName="octavia-db-sync" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.590213 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="octavia-api" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.590223 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5a0d90-0c0e-4589-8a15-7016159a2609" containerName="octavia-api-provider-agent" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.590230 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="415eb61a-0592-4e80-91f3-3564eee72d78" containerName="octavia-amphora-httpd" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.591186 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.598560 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.613232 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-l4hvk"] Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.695158 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dbdeb570-6b89-46ff-9f82-9c70fb24bc44-httpd-config\") pod \"octavia-image-upload-56c9f55b99-l4hvk\" (UID: \"dbdeb570-6b89-46ff-9f82-9c70fb24bc44\") " pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.695326 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/dbdeb570-6b89-46ff-9f82-9c70fb24bc44-amphora-image\") pod \"octavia-image-upload-56c9f55b99-l4hvk\" (UID: \"dbdeb570-6b89-46ff-9f82-9c70fb24bc44\") " pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.797829 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dbdeb570-6b89-46ff-9f82-9c70fb24bc44-httpd-config\") pod \"octavia-image-upload-56c9f55b99-l4hvk\" (UID: \"dbdeb570-6b89-46ff-9f82-9c70fb24bc44\") " pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.797932 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/dbdeb570-6b89-46ff-9f82-9c70fb24bc44-amphora-image\") pod \"octavia-image-upload-56c9f55b99-l4hvk\" (UID: \"dbdeb570-6b89-46ff-9f82-9c70fb24bc44\") " pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.798540 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/dbdeb570-6b89-46ff-9f82-9c70fb24bc44-amphora-image\") pod \"octavia-image-upload-56c9f55b99-l4hvk\" (UID: \"dbdeb570-6b89-46ff-9f82-9c70fb24bc44\") " pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.806773 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dbdeb570-6b89-46ff-9f82-9c70fb24bc44-httpd-config\") pod \"octavia-image-upload-56c9f55b99-l4hvk\" (UID: \"dbdeb570-6b89-46ff-9f82-9c70fb24bc44\") " pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" Dec 05 17:59:15 crc kubenswrapper[4756]: I1205 17:59:15.916760 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" Dec 05 17:59:16 crc kubenswrapper[4756]: I1205 17:59:16.419892 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-l4hvk"] Dec 05 17:59:16 crc kubenswrapper[4756]: I1205 17:59:16.782520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" event={"ID":"dbdeb570-6b89-46ff-9f82-9c70fb24bc44","Type":"ContainerStarted","Data":"70f541ed9e38a0150ae4c4500ac568ec02c39dd2f8896c0f16815aaa736595f5"} Dec 05 17:59:18 crc kubenswrapper[4756]: I1205 17:59:18.802605 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" event={"ID":"dbdeb570-6b89-46ff-9f82-9c70fb24bc44","Type":"ContainerStarted","Data":"615f36c8c374528337c4569db67a7aa14d7d60e523681fd6fcc72d9a5278ad54"} Dec 05 17:59:19 crc kubenswrapper[4756]: I1205 17:59:19.820250 4756 generic.go:334] "Generic (PLEG): container finished" podID="dbdeb570-6b89-46ff-9f82-9c70fb24bc44" containerID="615f36c8c374528337c4569db67a7aa14d7d60e523681fd6fcc72d9a5278ad54" exitCode=0 Dec 05 17:59:19 crc kubenswrapper[4756]: I1205 17:59:19.820315 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" event={"ID":"dbdeb570-6b89-46ff-9f82-9c70fb24bc44","Type":"ContainerDied","Data":"615f36c8c374528337c4569db67a7aa14d7d60e523681fd6fcc72d9a5278ad54"} Dec 05 17:59:20 crc kubenswrapper[4756]: I1205 17:59:20.837583 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" event={"ID":"dbdeb570-6b89-46ff-9f82-9c70fb24bc44","Type":"ContainerStarted","Data":"b1de3674833e955d383786a134fbf9b6d13bdfc462e610f4ad4e6cdf638af946"} Dec 05 17:59:20 crc kubenswrapper[4756]: I1205 17:59:20.868292 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-56c9f55b99-l4hvk" podStartSLOduration=5.374538073 podStartE2EDuration="5.868260222s" podCreationTimestamp="2025-12-05 17:59:15 +0000 UTC" firstStartedPulling="2025-12-05 17:59:16.427993211 +0000 UTC m=+6216.666666733" lastFinishedPulling="2025-12-05 17:59:16.92171536 +0000 UTC m=+6217.160388882" observedRunningTime="2025-12-05 17:59:20.861758469 +0000 UTC m=+6221.100432051" watchObservedRunningTime="2025-12-05 17:59:20.868260222 +0000 UTC m=+6221.106933744" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.617484 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-jjj9n"] Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.621628 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.628234 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.628620 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.628738 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.646462 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jjj9n"] Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.787192 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-config-data-merged\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.787396 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-combined-ca-bundle\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.787563 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-amphora-certs\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.787597 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-config-data\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.787671 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-hm-ports\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.787704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-scripts\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.890199 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-config-data-merged\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 
05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.890350 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-combined-ca-bundle\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.890469 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-amphora-certs\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.890503 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-config-data\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.890577 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-hm-ports\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.890619 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-scripts\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.891350 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-config-data-merged\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.892574 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-hm-ports\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.903220 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-scripts\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.903220 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-combined-ca-bundle\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.905255 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-config-data\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.906174 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/57852886-8c8f-4e90-b7bd-0bc1c7ffc235-amphora-certs\") pod \"octavia-healthmanager-jjj9n\" (UID: \"57852886-8c8f-4e90-b7bd-0bc1c7ffc235\") " pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:32 crc kubenswrapper[4756]: I1205 17:59:32.974212 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:33 crc kubenswrapper[4756]: W1205 17:59:33.553768 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57852886_8c8f_4e90_b7bd_0bc1c7ffc235.slice/crio-9358673ca4528625a4701564fb6c13a6fde5e3600b5ec97af922994a3967779b WatchSource:0}: Error finding container 9358673ca4528625a4701564fb6c13a6fde5e3600b5ec97af922994a3967779b: Status 404 returned error can't find the container with id 9358673ca4528625a4701564fb6c13a6fde5e3600b5ec97af922994a3967779b Dec 05 17:59:33 crc kubenswrapper[4756]: I1205 17:59:33.571303 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jjj9n"] Dec 05 17:59:33 crc kubenswrapper[4756]: I1205 17:59:33.984670 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jjj9n" event={"ID":"57852886-8c8f-4e90-b7bd-0bc1c7ffc235","Type":"ContainerStarted","Data":"9358673ca4528625a4701564fb6c13a6fde5e3600b5ec97af922994a3967779b"} Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.759945 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-9rjkh"] Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.768078 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.773075 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-9rjkh"] Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.778068 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.778175 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.937549 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-combined-ca-bundle\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.937622 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-hm-ports\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.937643 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-amphora-certs\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.937846 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-scripts\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.938008 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-config-data\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.938031 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-config-data-merged\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:34 crc kubenswrapper[4756]: I1205 17:59:34.995130 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jjj9n" event={"ID":"57852886-8c8f-4e90-b7bd-0bc1c7ffc235","Type":"ContainerStarted","Data":"8966e5147434f65705cc65fb3ef904e2f26d1f652cc88fb450e645f6fdf0d271"} Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.040341 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-config-data\") pod 
\"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.040388 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-config-data-merged\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.040481 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-combined-ca-bundle\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.040522 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-hm-ports\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.040544 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-amphora-certs\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.040592 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-scripts\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.042469 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-hm-ports\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.042812 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-config-data-merged\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.051213 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-combined-ca-bundle\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.055121 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-config-data\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc 
kubenswrapper[4756]: I1205 17:59:35.056793 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-scripts\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.063361 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4c3a5437-0d84-49fd-99ba-2c3d14d15b40-amphora-certs\") pod \"octavia-housekeeping-9rjkh\" (UID: \"4c3a5437-0d84-49fd-99ba-2c3d14d15b40\") " pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.119565 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:35 crc kubenswrapper[4756]: I1205 17:59:35.642701 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-9rjkh"] Dec 05 17:59:35 crc kubenswrapper[4756]: E1205 17:59:35.846368 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57852886_8c8f_4e90_b7bd_0bc1c7ffc235.slice/crio-8966e5147434f65705cc65fb3ef904e2f26d1f652cc88fb450e645f6fdf0d271.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57852886_8c8f_4e90_b7bd_0bc1c7ffc235.slice/crio-conmon-8966e5147434f65705cc65fb3ef904e2f26d1f652cc88fb450e645f6fdf0d271.scope\": RecentStats: unable to find data in memory cache]" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.013800 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-9rjkh" event={"ID":"4c3a5437-0d84-49fd-99ba-2c3d14d15b40","Type":"ContainerStarted","Data":"ea7a88e7b289eb11ffdf8ddf29a9c2445b3221a0ca28a853d0a9f76c8e78ec6f"} Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.015874 4756 generic.go:334] "Generic (PLEG): container finished" podID="57852886-8c8f-4e90-b7bd-0bc1c7ffc235" containerID="8966e5147434f65705cc65fb3ef904e2f26d1f652cc88fb450e645f6fdf0d271" exitCode=0 Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.015939 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jjj9n" event={"ID":"57852886-8c8f-4e90-b7bd-0bc1c7ffc235","Type":"ContainerDied","Data":"8966e5147434f65705cc65fb3ef904e2f26d1f652cc88fb450e645f6fdf0d271"} Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.423600 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-tl26l"] Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.427775 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.430475 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.430476 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.434943 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-tl26l"] Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.585886 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-amphora-certs\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.585974 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-scripts\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.586125 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-hm-ports\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.586156 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-config-data\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.586227 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-config-data-merged\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.586245 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-combined-ca-bundle\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.687450 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-config-data\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.687503 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-config-data-merged\") pod \"octavia-worker-tl26l\" (UID: 
\"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.687521 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-combined-ca-bundle\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.687612 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-amphora-certs\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.687670 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-scripts\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.687712 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-hm-ports\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.688891 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-config-data-merged\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.689794 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-hm-ports\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.692628 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-amphora-certs\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.693293 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-combined-ca-bundle\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.703153 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-scripts\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.703533 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f1a1e01d-72dd-43eb-912c-dab63cca1cfb-config-data\") pod \"octavia-worker-tl26l\" (UID: \"f1a1e01d-72dd-43eb-912c-dab63cca1cfb\") " pod="openstack/octavia-worker-tl26l" Dec 05 17:59:36 crc kubenswrapper[4756]: I1205 17:59:36.796775 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-tl26l" Dec 05 17:59:37 crc kubenswrapper[4756]: I1205 17:59:37.028934 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jjj9n" event={"ID":"57852886-8c8f-4e90-b7bd-0bc1c7ffc235","Type":"ContainerStarted","Data":"809652b5282ad289606238a13d571255a5b52251029a722a96a6c566546b27a3"} Dec 05 17:59:37 crc kubenswrapper[4756]: I1205 17:59:37.030214 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:38 crc kubenswrapper[4756]: I1205 17:59:38.746394 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-jjj9n" podStartSLOduration=6.746371654 podStartE2EDuration="6.746371654s" podCreationTimestamp="2025-12-05 17:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 17:59:37.055234136 +0000 UTC m=+6237.293907688" watchObservedRunningTime="2025-12-05 17:59:38.746371654 +0000 UTC m=+6238.985045166" Dec 05 17:59:38 crc kubenswrapper[4756]: I1205 17:59:38.747573 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-tl26l"] Dec 05 17:59:39 crc kubenswrapper[4756]: I1205 17:59:39.055837 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-9rjkh" event={"ID":"4c3a5437-0d84-49fd-99ba-2c3d14d15b40","Type":"ContainerStarted","Data":"fb0c3644f694f394b1764a319d301ccbd0643d491508eae64e0af69c44f3452e"} Dec 05 17:59:39 crc kubenswrapper[4756]: I1205 17:59:39.059674 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-tl26l" event={"ID":"f1a1e01d-72dd-43eb-912c-dab63cca1cfb","Type":"ContainerStarted","Data":"b18b5132d9243472844c7b881aa224248eceb564b8557b5f380fe99ef18fc9a4"} Dec 05 17:59:40 crc kubenswrapper[4756]: I1205 17:59:40.074139 4756 generic.go:334] "Generic (PLEG): container finished" podID="4c3a5437-0d84-49fd-99ba-2c3d14d15b40" containerID="fb0c3644f694f394b1764a319d301ccbd0643d491508eae64e0af69c44f3452e" exitCode=0 Dec 05 17:59:40 crc kubenswrapper[4756]: I1205 17:59:40.074192 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-9rjkh" event={"ID":"4c3a5437-0d84-49fd-99ba-2c3d14d15b40","Type":"ContainerDied","Data":"fb0c3644f694f394b1764a319d301ccbd0643d491508eae64e0af69c44f3452e"} Dec 05 17:59:45 crc kubenswrapper[4756]: I1205 17:59:45.729637 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-sfl86" podUID="c9f8a111-e6d4-440a-a3d4-4f3010304ba9" containerName="nmstate-handler" probeResult="failure" output="command timed out" Dec 05 17:59:48 crc kubenswrapper[4756]: I1205 17:59:48.035043 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-jjj9n" Dec 05 17:59:48 crc kubenswrapper[4756]: I1205 17:59:48.190002 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-9rjkh" event={"ID":"4c3a5437-0d84-49fd-99ba-2c3d14d15b40","Type":"ContainerStarted","Data":"32c2089926c6da54d6ddb958569b95317a3e0dd7a2e26c412575dcba9c7f1933"} 
Dec 05 17:59:49 crc kubenswrapper[4756]: I1205 17:59:49.201496 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-9rjkh" Dec 05 17:59:49 crc kubenswrapper[4756]: I1205 17:59:49.233390 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-9rjkh" podStartSLOduration=12.678389598 podStartE2EDuration="15.23337098s" podCreationTimestamp="2025-12-05 17:59:34 +0000 UTC" firstStartedPulling="2025-12-05 17:59:35.710542506 +0000 UTC m=+6235.949216018" lastFinishedPulling="2025-12-05 17:59:38.265523878 +0000 UTC m=+6238.504197400" observedRunningTime="2025-12-05 17:59:49.224011951 +0000 UTC m=+6249.462685503" watchObservedRunningTime="2025-12-05 17:59:49.23337098 +0000 UTC m=+6249.472044492" Dec 05 17:59:50 crc kubenswrapper[4756]: I1205 17:59:50.215311 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-tl26l" event={"ID":"f1a1e01d-72dd-43eb-912c-dab63cca1cfb","Type":"ContainerStarted","Data":"01db2a11917b2371876baec4c8c4ffca43aad03fc606552f6fc4b6f04142c359"} Dec 05 17:59:56 crc kubenswrapper[4756]: I1205 17:59:56.340007 4756 generic.go:334] "Generic (PLEG): container finished" podID="f1a1e01d-72dd-43eb-912c-dab63cca1cfb" containerID="01db2a11917b2371876baec4c8c4ffca43aad03fc606552f6fc4b6f04142c359" exitCode=0 Dec 05 17:59:56 crc kubenswrapper[4756]: I1205 17:59:56.340108 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-tl26l" event={"ID":"f1a1e01d-72dd-43eb-912c-dab63cca1cfb","Type":"ContainerDied","Data":"01db2a11917b2371876baec4c8c4ffca43aad03fc606552f6fc4b6f04142c359"} Dec 05 17:59:59 crc kubenswrapper[4756]: I1205 17:59:58.363658 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-tl26l" event={"ID":"f1a1e01d-72dd-43eb-912c-dab63cca1cfb","Type":"ContainerStarted","Data":"af73740e6934d9d3a0cf8c1e8c9dea0f30b9533ab1490e8114f25a46d821523c"} Dec 05 17:59:59 crc kubenswrapper[4756]: I1205 17:59:58.364553 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-tl26l" Dec 05 17:59:59 crc kubenswrapper[4756]: I1205 17:59:58.394066 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-tl26l" podStartSLOduration=12.189561491 podStartE2EDuration="22.394027063s" podCreationTimestamp="2025-12-05 17:59:36 +0000 UTC" firstStartedPulling="2025-12-05 17:59:38.779565268 +0000 UTC m=+6239.018238790" lastFinishedPulling="2025-12-05 17:59:48.98403085 +0000 UTC m=+6249.222704362" observedRunningTime="2025-12-05 17:59:58.383549454 +0000 UTC m=+6258.622222966" watchObservedRunningTime="2025-12-05 17:59:58.394027063 +0000 UTC m=+6258.632700575" Dec 05 17:59:59 crc kubenswrapper[4756]: I1205 17:59:59.078846 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h84pl"] Dec 05 17:59:59 crc kubenswrapper[4756]: I1205 17:59:59.091063 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bc89-account-create-update-8z7jt"] Dec 05 17:59:59 crc kubenswrapper[4756]: I1205 17:59:59.098926 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h84pl"] Dec 05 17:59:59 crc kubenswrapper[4756]: I1205 17:59:59.109238 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bc89-account-create-update-8z7jt"] Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.075685 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="5f212d8d-36fd-4499-9cd4-08c17b5e39e9" path="/var/lib/kubelet/pods/5f212d8d-36fd-4499-9cd4-08c17b5e39e9/volumes" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.077263 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea60fe5-2e89-4d87-966b-7be9872f939f" path="/var/lib/kubelet/pods/7ea60fe5-2e89-4d87-966b-7be9872f939f/volumes" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.181069 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4"] Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.183563 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.186982 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.187454 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.197263 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4"] Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.328443 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99dffd73-862d-44e3-ac15-ed5c4cb994c8-config-volume\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.328525 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgsnp\" (UniqueName: \"kubernetes.io/projected/99dffd73-862d-44e3-ac15-ed5c4cb994c8-kube-api-access-jgsnp\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.328876 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99dffd73-862d-44e3-ac15-ed5c4cb994c8-secret-volume\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.431256 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgsnp\" (UniqueName: \"kubernetes.io/projected/99dffd73-862d-44e3-ac15-ed5c4cb994c8-kube-api-access-jgsnp\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.431751 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99dffd73-862d-44e3-ac15-ed5c4cb994c8-secret-volume\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.432052 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99dffd73-862d-44e3-ac15-ed5c4cb994c8-config-volume\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.435023 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99dffd73-862d-44e3-ac15-ed5c4cb994c8-config-volume\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.452718 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99dffd73-862d-44e3-ac15-ed5c4cb994c8-secret-volume\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.461809 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgsnp\" (UniqueName: \"kubernetes.io/projected/99dffd73-862d-44e3-ac15-ed5c4cb994c8-kube-api-access-jgsnp\") pod \"collect-profiles-29415960-svbg4\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:00 crc kubenswrapper[4756]: I1205 18:00:00.517950 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:01 crc kubenswrapper[4756]: W1205 18:00:01.039114 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99dffd73_862d_44e3_ac15_ed5c4cb994c8.slice/crio-68486ca08f4123913838314044fd7c6f6fec0d734659b040b0bd957bb7a73e59 WatchSource:0}: Error finding container 68486ca08f4123913838314044fd7c6f6fec0d734659b040b0bd957bb7a73e59: Status 404 returned error can't find the container with id 68486ca08f4123913838314044fd7c6f6fec0d734659b040b0bd957bb7a73e59 Dec 05 18:00:01 crc kubenswrapper[4756]: I1205 18:00:01.043741 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4"] Dec 05 18:00:01 crc kubenswrapper[4756]: I1205 18:00:01.399919 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" event={"ID":"99dffd73-862d-44e3-ac15-ed5c4cb994c8","Type":"ContainerStarted","Data":"68486ca08f4123913838314044fd7c6f6fec0d734659b040b0bd957bb7a73e59"} Dec 05 18:00:02 crc kubenswrapper[4756]: I1205 18:00:02.412566 4756 generic.go:334] "Generic (PLEG): container finished" podID="99dffd73-862d-44e3-ac15-ed5c4cb994c8" containerID="60e10a65feeaa43aa3a9aa7b101d7a2e3941380e134280e1be30c0775a99dd9c" exitCode=0 Dec 05 18:00:02 crc kubenswrapper[4756]: I1205 18:00:02.412665 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" event={"ID":"99dffd73-862d-44e3-ac15-ed5c4cb994c8","Type":"ContainerDied","Data":"60e10a65feeaa43aa3a9aa7b101d7a2e3941380e134280e1be30c0775a99dd9c"} Dec 05 18:00:03 crc kubenswrapper[4756]: I1205 18:00:03.852610 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:03 crc kubenswrapper[4756]: I1205 18:00:03.922681 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99dffd73-862d-44e3-ac15-ed5c4cb994c8-secret-volume\") pod \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " Dec 05 18:00:03 crc kubenswrapper[4756]: I1205 18:00:03.922762 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99dffd73-862d-44e3-ac15-ed5c4cb994c8-config-volume\") pod \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " Dec 05 18:00:03 crc kubenswrapper[4756]: I1205 18:00:03.922812 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgsnp\" (UniqueName: \"kubernetes.io/projected/99dffd73-862d-44e3-ac15-ed5c4cb994c8-kube-api-access-jgsnp\") pod \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\" (UID: \"99dffd73-862d-44e3-ac15-ed5c4cb994c8\") " Dec 05 18:00:03 crc kubenswrapper[4756]: I1205 18:00:03.923656 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99dffd73-862d-44e3-ac15-ed5c4cb994c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "99dffd73-862d-44e3-ac15-ed5c4cb994c8" (UID: "99dffd73-862d-44e3-ac15-ed5c4cb994c8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:00:03 crc kubenswrapper[4756]: I1205 18:00:03.929055 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99dffd73-862d-44e3-ac15-ed5c4cb994c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "99dffd73-862d-44e3-ac15-ed5c4cb994c8" (UID: "99dffd73-862d-44e3-ac15-ed5c4cb994c8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:00:03 crc kubenswrapper[4756]: I1205 18:00:03.929901 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99dffd73-862d-44e3-ac15-ed5c4cb994c8-kube-api-access-jgsnp" (OuterVolumeSpecName: "kube-api-access-jgsnp") pod "99dffd73-862d-44e3-ac15-ed5c4cb994c8" (UID: "99dffd73-862d-44e3-ac15-ed5c4cb994c8"). InnerVolumeSpecName "kube-api-access-jgsnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:00:04 crc kubenswrapper[4756]: I1205 18:00:04.024473 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/99dffd73-862d-44e3-ac15-ed5c4cb994c8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 18:00:04 crc kubenswrapper[4756]: I1205 18:00:04.024516 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99dffd73-862d-44e3-ac15-ed5c4cb994c8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 18:00:04 crc kubenswrapper[4756]: I1205 18:00:04.024531 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgsnp\" (UniqueName: \"kubernetes.io/projected/99dffd73-862d-44e3-ac15-ed5c4cb994c8-kube-api-access-jgsnp\") on node \"crc\" DevicePath \"\"" Dec 05 18:00:04 crc kubenswrapper[4756]: I1205 18:00:04.433691 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" event={"ID":"99dffd73-862d-44e3-ac15-ed5c4cb994c8","Type":"ContainerDied","Data":"68486ca08f4123913838314044fd7c6f6fec0d734659b040b0bd957bb7a73e59"} Dec 05 18:00:04 crc kubenswrapper[4756]: I1205 18:00:04.433732 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68486ca08f4123913838314044fd7c6f6fec0d734659b040b0bd957bb7a73e59" Dec 05 18:00:04 crc kubenswrapper[4756]: I1205 18:00:04.433758 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4" Dec 05 18:00:04 crc kubenswrapper[4756]: I1205 18:00:04.936293 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p"] Dec 05 18:00:04 crc kubenswrapper[4756]: I1205 18:00:04.952621 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415915-6kx7p"] Dec 05 18:00:05 crc kubenswrapper[4756]: I1205 18:00:05.178408 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-9rjkh" Dec 05 18:00:06 crc kubenswrapper[4756]: I1205 18:00:06.048541 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-x4nlc"] Dec 05 18:00:06 crc kubenswrapper[4756]: I1205 18:00:06.072689 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f37263-1348-4de5-924d-13e6b4d4c921" path="/var/lib/kubelet/pods/38f37263-1348-4de5-924d-13e6b4d4c921/volumes" Dec 05 18:00:06 crc kubenswrapper[4756]: I1205 18:00:06.073690 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-x4nlc"] Dec 05 18:00:06 crc kubenswrapper[4756]: I1205 18:00:06.829002 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-tl26l" Dec 05 18:00:08 crc kubenswrapper[4756]: I1205 18:00:08.071823 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef51efd-9b84-4528-9f7e-4703c6bc31dd" path="/var/lib/kubelet/pods/aef51efd-9b84-4528-9f7e-4703c6bc31dd/volumes" Dec 05 18:00:08 crc kubenswrapper[4756]: I1205 18:00:08.405899 4756 scope.go:117] "RemoveContainer" containerID="5452f98c7a9727302f3cc6894c9ba60c35f54cd1d76d818abb93644227e53b66" Dec 05 18:00:08 crc kubenswrapper[4756]: I1205 18:00:08.442682 4756 scope.go:117] "RemoveContainer" containerID="2b8a3d5580211892e58592ee0b5e5166aadb7b984aeacbda154d78deef1e08d0" Dec 05 18:00:08 crc kubenswrapper[4756]: I1205 18:00:08.501158 4756 scope.go:117] "RemoveContainer" containerID="f75fde1eb5ed3887d9da5236335adfcaec29bb57e53fc9f01cbc3cdb3751942a" Dec 05 18:00:08 crc kubenswrapper[4756]: I1205 18:00:08.538024 4756 scope.go:117] "RemoveContainer" containerID="6fbdb3100de011c82dc4e734e4b5e80fbdb5413f975d8e8f31cb81095c6bf752" Dec 05 18:00:22 crc kubenswrapper[4756]: I1205 18:00:22.295821 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:00:22 crc kubenswrapper[4756]: I1205 18:00:22.296641 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:00:52 crc kubenswrapper[4756]: I1205 18:00:52.295960 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:00:52 crc kubenswrapper[4756]: I1205 18:00:52.296694 4756 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:00:58 crc kubenswrapper[4756]: I1205 18:00:58.062549 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-29w9v"] Dec 05 18:00:58 crc kubenswrapper[4756]: I1205 18:00:58.064574 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7466-account-create-update-xqvp2"] Dec 05 18:00:58 crc kubenswrapper[4756]: I1205 18:00:58.076037 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-29w9v"] Dec 05 18:00:58 crc kubenswrapper[4756]: I1205 18:00:58.087343 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7466-account-create-update-xqvp2"] Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.065950 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4fd436-4abe-4ba5-b62e-4c107f564686" path="/var/lib/kubelet/pods/2a4fd436-4abe-4ba5-b62e-4c107f564686/volumes" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.067160 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f8accb-ed03-468b-ba0d-7251eec5a484" path="/var/lib/kubelet/pods/e5f8accb-ed03-468b-ba0d-7251eec5a484/volumes" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.152699 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29415961-hgbt4"] Dec 05 18:01:00 crc kubenswrapper[4756]: E1205 18:01:00.153064 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99dffd73-862d-44e3-ac15-ed5c4cb994c8" containerName="collect-profiles" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.153080 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="99dffd73-862d-44e3-ac15-ed5c4cb994c8" containerName="collect-profiles" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.153266 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="99dffd73-862d-44e3-ac15-ed5c4cb994c8" containerName="collect-profiles" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.153886 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.165967 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415961-hgbt4"] Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.311773 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-combined-ca-bundle\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.311862 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-fernet-keys\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.311923 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74f95\" (UniqueName: \"kubernetes.io/projected/3faa564c-7313-40f9-9cad-ed666212dca9-kube-api-access-74f95\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.311954 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-config-data\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.414132 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-combined-ca-bundle\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.414248 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-fernet-keys\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.414316 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74f95\" (UniqueName: \"kubernetes.io/projected/3faa564c-7313-40f9-9cad-ed666212dca9-kube-api-access-74f95\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.414353 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-config-data\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.424678 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-combined-ca-bundle\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.426295 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-fernet-keys\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.432324 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-config-data\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.439348 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74f95\" (UniqueName: \"kubernetes.io/projected/3faa564c-7313-40f9-9cad-ed666212dca9-kube-api-access-74f95\") pod \"keystone-cron-29415961-hgbt4\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.481982 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:00 crc kubenswrapper[4756]: I1205 18:01:00.996770 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415961-hgbt4"] Dec 05 18:01:01 crc kubenswrapper[4756]: I1205 18:01:01.122658 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415961-hgbt4" event={"ID":"3faa564c-7313-40f9-9cad-ed666212dca9","Type":"ContainerStarted","Data":"e44a2d896f46a8deda2ea392ecebf59c3e170bca3f77f9e24804a702d1dcad5c"} Dec 05 18:01:01 crc kubenswrapper[4756]: I1205 18:01:01.968615 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76854b6f89-9g6bc"] Dec 05 18:01:01 crc kubenswrapper[4756]: I1205 18:01:01.971079 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:01 crc kubenswrapper[4756]: I1205 18:01:01.978876 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 18:01:01 crc kubenswrapper[4756]: I1205 18:01:01.979115 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 05 18:01:01 crc kubenswrapper[4756]: I1205 18:01:01.979280 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 18:01:01 crc kubenswrapper[4756]: I1205 18:01:01.979355 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-x8kx6" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.037250 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76854b6f89-9g6bc"] Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.046940 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e8e9465-2f77-4b84-b96d-a3c2768d0548-logs\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.046992 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e8e9465-2f77-4b84-b96d-a3c2768d0548-horizon-secret-key\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.047023 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxxdj\" (UniqueName: \"kubernetes.io/projected/5e8e9465-2f77-4b84-b96d-a3c2768d0548-kube-api-access-rxxdj\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.047180 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-scripts\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.047280 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-config-data\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.075534 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.075776 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerName="glance-log" containerID="cri-o://4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306" gracePeriod=30 Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.075966 4756 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerName="glance-httpd" containerID="cri-o://784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a" gracePeriod=30 Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.112864 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.113396 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerName="glance-log" containerID="cri-o://e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663" gracePeriod=30 Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.113711 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerName="glance-httpd" containerID="cri-o://70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6" gracePeriod=30 Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.137491 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68cd4d6b77-zlxrg"] Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.139179 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.144394 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415961-hgbt4" event={"ID":"3faa564c-7313-40f9-9cad-ed666212dca9","Type":"ContainerStarted","Data":"527fbdfd9061c625a6bed9d3a07e11ea8a05e6713628ea4fd86ce110d1fdf205"} Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.148372 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68cd4d6b77-zlxrg"] Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.148878 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e8e9465-2f77-4b84-b96d-a3c2768d0548-logs\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.148927 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e8e9465-2f77-4b84-b96d-a3c2768d0548-horizon-secret-key\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.148952 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxxdj\" (UniqueName: \"kubernetes.io/projected/5e8e9465-2f77-4b84-b96d-a3c2768d0548-kube-api-access-rxxdj\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.148990 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-scripts\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.149029 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-config-data\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.160140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-scripts\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.161718 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e8e9465-2f77-4b84-b96d-a3c2768d0548-logs\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.163388 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-config-data\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.172895 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e8e9465-2f77-4b84-b96d-a3c2768d0548-horizon-secret-key\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.185990 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxxdj\" (UniqueName: \"kubernetes.io/projected/5e8e9465-2f77-4b84-b96d-a3c2768d0548-kube-api-access-rxxdj\") pod \"horizon-76854b6f89-9g6bc\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") " pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.209395 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29415961-hgbt4" podStartSLOduration=2.209374244 podStartE2EDuration="2.209374244s" podCreationTimestamp="2025-12-05 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:01:02.191762705 +0000 UTC m=+6322.430436217" watchObservedRunningTime="2025-12-05 18:01:02.209374244 +0000 UTC m=+6322.448047756" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.250536 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-horizon-secret-key\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.250609 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-scripts\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.250672 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hk8b\" (UniqueName: \"kubernetes.io/projected/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-kube-api-access-9hk8b\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.250749 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-logs\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.250803 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-config-data\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.301736 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.352219 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hk8b\" (UniqueName: \"kubernetes.io/projected/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-kube-api-access-9hk8b\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.352314 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-logs\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.352366 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-config-data\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.352402 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-horizon-secret-key\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.352458 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-scripts\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.352830 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-logs\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: 
I1205 18:01:02.353535 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-scripts\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.354051 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-config-data\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.363988 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-horizon-secret-key\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.379568 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hk8b\" (UniqueName: \"kubernetes.io/projected/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-kube-api-access-9hk8b\") pod \"horizon-68cd4d6b77-zlxrg\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") " pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.466506 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:02 crc kubenswrapper[4756]: I1205 18:01:02.806199 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76854b6f89-9g6bc"] Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.016120 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68cd4d6b77-zlxrg"] Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.162069 4756 generic.go:334] "Generic (PLEG): container finished" podID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerID="4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306" exitCode=143 Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.162188 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"82e4ff41-9e2e-4da3-9431-8e3706464313","Type":"ContainerDied","Data":"4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306"} Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.167285 4756 generic.go:334] "Generic (PLEG): container finished" podID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerID="e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663" exitCode=143 Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.167344 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec4ed1b-f3fe-4135-be28-524ecef6a200","Type":"ContainerDied","Data":"e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663"} Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.168975 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76854b6f89-9g6bc" event={"ID":"5e8e9465-2f77-4b84-b96d-a3c2768d0548","Type":"ContainerStarted","Data":"df464487a233f54c7be703593251bcc7e538f75f4ff0ab5d43e3656388d6abd7"} Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.170615 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-68cd4d6b77-zlxrg" event={"ID":"1d4dcfe0-9012-4762-a563-fb1ae5a47b46","Type":"ContainerStarted","Data":"3601131a9d52f06db0afdfd2165c83df179982d088d0c199e57c2309b64f34db"} Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.924125 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76854b6f89-9g6bc"] Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.983791 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5497588c66-pwmcj"] Dec 05 18:01:03 crc kubenswrapper[4756]: I1205 18:01:03.985971 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.000922 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.025272 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5497588c66-pwmcj"] Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.112542 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68cd4d6b77-zlxrg"] Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.121782 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmg4v\" (UniqueName: \"kubernetes.io/projected/b47eed2d-c027-4235-893e-a92f7ad54d17-kube-api-access-rmg4v\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.124600 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-secret-key\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.124850 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-tls-certs\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.124939 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-scripts\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.125024 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-config-data\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.125115 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-combined-ca-bundle\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " 
pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.125409 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b47eed2d-c027-4235-893e-a92f7ad54d17-logs\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.212928 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cf757d7d6-ltbqn"] Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.236268 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.243100 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-combined-ca-bundle\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.243327 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b47eed2d-c027-4235-893e-a92f7ad54d17-logs\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.245355 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmg4v\" (UniqueName: \"kubernetes.io/projected/b47eed2d-c027-4235-893e-a92f7ad54d17-kube-api-access-rmg4v\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.245435 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-secret-key\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.245623 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-tls-certs\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.245686 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-scripts\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.245756 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-config-data\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.246676 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b47eed2d-c027-4235-893e-a92f7ad54d17-logs\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.256091 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-scripts\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.256402 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-config-data\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.261709 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-tls-certs\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.266770 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-secret-key\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.283136 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-combined-ca-bundle\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.284660 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415961-hgbt4" event={"ID":"3faa564c-7313-40f9-9cad-ed666212dca9","Type":"ContainerDied","Data":"527fbdfd9061c625a6bed9d3a07e11ea8a05e6713628ea4fd86ce110d1fdf205"} Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.284487 4756 generic.go:334] "Generic (PLEG): container finished" podID="3faa564c-7313-40f9-9cad-ed666212dca9" containerID="527fbdfd9061c625a6bed9d3a07e11ea8a05e6713628ea4fd86ce110d1fdf205" exitCode=0 Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.286846 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmg4v\" (UniqueName: \"kubernetes.io/projected/b47eed2d-c027-4235-893e-a92f7ad54d17-kube-api-access-rmg4v\") pod \"horizon-5497588c66-pwmcj\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") " pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.291784 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cf757d7d6-ltbqn"] Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.340989 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.347561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-combined-ca-bundle\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.347618 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-tls-certs\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.347659 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-config-data\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.347695 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-secret-key\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.347724 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2prv2\" (UniqueName: \"kubernetes.io/projected/1acc47c0-70db-432a-b627-44a07ac96488-kube-api-access-2prv2\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.347762 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1acc47c0-70db-432a-b627-44a07ac96488-logs\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.347829 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-scripts\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.448995 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2prv2\" (UniqueName: \"kubernetes.io/projected/1acc47c0-70db-432a-b627-44a07ac96488-kube-api-access-2prv2\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.449054 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1acc47c0-70db-432a-b627-44a07ac96488-logs\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: 
\"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.449126 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-scripts\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.449204 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-combined-ca-bundle\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.449226 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-tls-certs\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.449261 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-config-data\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.449288 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-secret-key\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.450220 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-scripts\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.450267 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1acc47c0-70db-432a-b627-44a07ac96488-logs\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.450534 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-config-data\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.453519 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-combined-ca-bundle\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.453848 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-tls-certs\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.460127 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-secret-key\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.466230 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2prv2\" (UniqueName: \"kubernetes.io/projected/1acc47c0-70db-432a-b627-44a07ac96488-kube-api-access-2prv2\") pod \"horizon-5cf757d7d6-ltbqn\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.687755 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:01:04 crc kubenswrapper[4756]: I1205 18:01:04.848194 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5497588c66-pwmcj"] Dec 05 18:01:04 crc kubenswrapper[4756]: W1205 18:01:04.863543 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb47eed2d_c027_4235_893e_a92f7ad54d17.slice/crio-e939b5e8c144546217e5027c3e2485e2be99988fcca0042f630f9277e28f63af WatchSource:0}: Error finding container e939b5e8c144546217e5027c3e2485e2be99988fcca0042f630f9277e28f63af: Status 404 returned error can't find the container with id e939b5e8c144546217e5027c3e2485e2be99988fcca0042f630f9277e28f63af Dec 05 18:01:05 crc kubenswrapper[4756]: I1205 18:01:05.167531 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cf757d7d6-ltbqn"] Dec 05 18:01:05 crc kubenswrapper[4756]: W1205 18:01:05.176036 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1acc47c0_70db_432a_b627_44a07ac96488.slice/crio-8c33c45d4920406d42e95e539d380eae7d6e71a9d657752dd09a2ed0a5e3bdae WatchSource:0}: Error finding container 8c33c45d4920406d42e95e539d380eae7d6e71a9d657752dd09a2ed0a5e3bdae: Status 404 returned error can't find the container with id 8c33c45d4920406d42e95e539d380eae7d6e71a9d657752dd09a2ed0a5e3bdae Dec 05 18:01:05 crc kubenswrapper[4756]: I1205 18:01:05.297052 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5497588c66-pwmcj" event={"ID":"b47eed2d-c027-4235-893e-a92f7ad54d17","Type":"ContainerStarted","Data":"e939b5e8c144546217e5027c3e2485e2be99988fcca0042f630f9277e28f63af"} Dec 05 18:01:05 crc kubenswrapper[4756]: I1205 18:01:05.299431 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf757d7d6-ltbqn" event={"ID":"1acc47c0-70db-432a-b627-44a07ac96488","Type":"ContainerStarted","Data":"8c33c45d4920406d42e95e539d380eae7d6e71a9d657752dd09a2ed0a5e3bdae"} Dec 05 18:01:05 crc kubenswrapper[4756]: I1205 18:01:05.856018 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:05 crc kubenswrapper[4756]: I1205 18:01:05.869434 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:05 crc kubenswrapper[4756]: I1205 18:01:05.945339 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.015933 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-config-data\") pod \"82e4ff41-9e2e-4da3-9431-8e3706464313\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016042 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-combined-ca-bundle\") pod \"3faa564c-7313-40f9-9cad-ed666212dca9\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016078 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-config-data\") pod \"dec4ed1b-f3fe-4135-be28-524ecef6a200\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016100 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-scripts\") pod \"dec4ed1b-f3fe-4135-be28-524ecef6a200\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016121 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-combined-ca-bundle\") pod \"82e4ff41-9e2e-4da3-9431-8e3706464313\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016170 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-logs\") pod \"dec4ed1b-f3fe-4135-be28-524ecef6a200\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016209 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-combined-ca-bundle\") pod \"dec4ed1b-f3fe-4135-be28-524ecef6a200\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016231 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-httpd-run\") pod \"82e4ff41-9e2e-4da3-9431-8e3706464313\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016258 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-httpd-run\") pod \"dec4ed1b-f3fe-4135-be28-524ecef6a200\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016293 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-698z2\" (UniqueName: \"kubernetes.io/projected/82e4ff41-9e2e-4da3-9431-8e3706464313-kube-api-access-698z2\") pod \"82e4ff41-9e2e-4da3-9431-8e3706464313\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016322 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-logs\") pod \"82e4ff41-9e2e-4da3-9431-8e3706464313\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016342 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-public-tls-certs\") pod \"dec4ed1b-f3fe-4135-be28-524ecef6a200\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016363 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-scripts\") pod \"82e4ff41-9e2e-4da3-9431-8e3706464313\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016396 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xk7f\" (UniqueName: \"kubernetes.io/projected/dec4ed1b-f3fe-4135-be28-524ecef6a200-kube-api-access-4xk7f\") pod \"dec4ed1b-f3fe-4135-be28-524ecef6a200\" (UID: \"dec4ed1b-f3fe-4135-be28-524ecef6a200\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016862 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-config-data\") pod \"3faa564c-7313-40f9-9cad-ed666212dca9\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016896 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-internal-tls-certs\") pod \"82e4ff41-9e2e-4da3-9431-8e3706464313\" (UID: \"82e4ff41-9e2e-4da3-9431-8e3706464313\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016923 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74f95\" (UniqueName: \"kubernetes.io/projected/3faa564c-7313-40f9-9cad-ed666212dca9-kube-api-access-74f95\") pod \"3faa564c-7313-40f9-9cad-ed666212dca9\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.016966 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-fernet-keys\") pod \"3faa564c-7313-40f9-9cad-ed666212dca9\" (UID: \"3faa564c-7313-40f9-9cad-ed666212dca9\") " Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.024456 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-logs" (OuterVolumeSpecName: "logs") pod "82e4ff41-9e2e-4da3-9431-8e3706464313" (UID: "82e4ff41-9e2e-4da3-9431-8e3706464313"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.025133 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "82e4ff41-9e2e-4da3-9431-8e3706464313" (UID: "82e4ff41-9e2e-4da3-9431-8e3706464313"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.025341 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-scripts" (OuterVolumeSpecName: "scripts") pod "dec4ed1b-f3fe-4135-be28-524ecef6a200" (UID: "dec4ed1b-f3fe-4135-be28-524ecef6a200"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.025458 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec4ed1b-f3fe-4135-be28-524ecef6a200-kube-api-access-4xk7f" (OuterVolumeSpecName: "kube-api-access-4xk7f") pod "dec4ed1b-f3fe-4135-be28-524ecef6a200" (UID: "dec4ed1b-f3fe-4135-be28-524ecef6a200"). InnerVolumeSpecName "kube-api-access-4xk7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.028618 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e4ff41-9e2e-4da3-9431-8e3706464313-kube-api-access-698z2" (OuterVolumeSpecName: "kube-api-access-698z2") pod "82e4ff41-9e2e-4da3-9431-8e3706464313" (UID: "82e4ff41-9e2e-4da3-9431-8e3706464313"). InnerVolumeSpecName "kube-api-access-698z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.028703 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3faa564c-7313-40f9-9cad-ed666212dca9" (UID: "3faa564c-7313-40f9-9cad-ed666212dca9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.028963 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dec4ed1b-f3fe-4135-be28-524ecef6a200" (UID: "dec4ed1b-f3fe-4135-be28-524ecef6a200"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.036014 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-logs" (OuterVolumeSpecName: "logs") pod "dec4ed1b-f3fe-4135-be28-524ecef6a200" (UID: "dec4ed1b-f3fe-4135-be28-524ecef6a200"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.043576 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-scripts" (OuterVolumeSpecName: "scripts") pod "82e4ff41-9e2e-4da3-9431-8e3706464313" (UID: "82e4ff41-9e2e-4da3-9431-8e3706464313"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.045700 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3faa564c-7313-40f9-9cad-ed666212dca9-kube-api-access-74f95" (OuterVolumeSpecName: "kube-api-access-74f95") pod "3faa564c-7313-40f9-9cad-ed666212dca9" (UID: "3faa564c-7313-40f9-9cad-ed666212dca9"). InnerVolumeSpecName "kube-api-access-74f95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.060242 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dec4ed1b-f3fe-4135-be28-524ecef6a200" (UID: "dec4ed1b-f3fe-4135-be28-524ecef6a200"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.069337 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3faa564c-7313-40f9-9cad-ed666212dca9" (UID: "3faa564c-7313-40f9-9cad-ed666212dca9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.096250 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82e4ff41-9e2e-4da3-9431-8e3706464313" (UID: "82e4ff41-9e2e-4da3-9431-8e3706464313"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.108294 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-config-data" (OuterVolumeSpecName: "config-data") pod "3faa564c-7313-40f9-9cad-ed666212dca9" (UID: "3faa564c-7313-40f9-9cad-ed666212dca9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119133 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698z2\" (UniqueName: \"kubernetes.io/projected/82e4ff41-9e2e-4da3-9431-8e3706464313-kube-api-access-698z2\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119156 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-logs\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119168 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119176 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xk7f\" (UniqueName: \"kubernetes.io/projected/dec4ed1b-f3fe-4135-be28-524ecef6a200-kube-api-access-4xk7f\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119185 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119195 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74f95\" (UniqueName: \"kubernetes.io/projected/3faa564c-7313-40f9-9cad-ed666212dca9-kube-api-access-74f95\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119203 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119211 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3faa564c-7313-40f9-9cad-ed666212dca9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119219 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119227 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119236 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-logs\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119245 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119252 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82e4ff41-9e2e-4da3-9431-8e3706464313-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.119260 
4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dec4ed1b-f3fe-4135-be28-524ecef6a200-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.136231 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-config-data" (OuterVolumeSpecName: "config-data") pod "82e4ff41-9e2e-4da3-9431-8e3706464313" (UID: "82e4ff41-9e2e-4da3-9431-8e3706464313"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.146877 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dec4ed1b-f3fe-4135-be28-524ecef6a200" (UID: "dec4ed1b-f3fe-4135-be28-524ecef6a200"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.148517 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "82e4ff41-9e2e-4da3-9431-8e3706464313" (UID: "82e4ff41-9e2e-4da3-9431-8e3706464313"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.149337 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-config-data" (OuterVolumeSpecName: "config-data") pod "dec4ed1b-f3fe-4135-be28-524ecef6a200" (UID: "dec4ed1b-f3fe-4135-be28-524ecef6a200"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.221974 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.222002 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e4ff41-9e2e-4da3-9431-8e3706464313-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.222013 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.222022 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec4ed1b-f3fe-4135-be28-524ecef6a200-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.331435 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415961-hgbt4" event={"ID":"3faa564c-7313-40f9-9cad-ed666212dca9","Type":"ContainerDied","Data":"e44a2d896f46a8deda2ea392ecebf59c3e170bca3f77f9e24804a702d1dcad5c"} Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.331476 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44a2d896f46a8deda2ea392ecebf59c3e170bca3f77f9e24804a702d1dcad5c" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.331524 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415961-hgbt4" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.362224 4756 generic.go:334] "Generic (PLEG): container finished" podID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerID="784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a" exitCode=0 Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.362278 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"82e4ff41-9e2e-4da3-9431-8e3706464313","Type":"ContainerDied","Data":"784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a"} Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.362305 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"82e4ff41-9e2e-4da3-9431-8e3706464313","Type":"ContainerDied","Data":"0d4ad61608f280984a4e57770c78ce3bb441324eb5c258c0258f5cd6c696a16c"} Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.362321 4756 scope.go:117] "RemoveContainer" containerID="784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.362449 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.366030 4756 generic.go:334] "Generic (PLEG): container finished" podID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerID="70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6" exitCode=0 Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.366070 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec4ed1b-f3fe-4135-be28-524ecef6a200","Type":"ContainerDied","Data":"70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6"} Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.366098 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dec4ed1b-f3fe-4135-be28-524ecef6a200","Type":"ContainerDied","Data":"b25a5086d8243e5c11679acb72f97fd21bf750cf21b8eaf3f94112d35d4591f2"} Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.366104 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.451266 4756 scope.go:117] "RemoveContainer" containerID="4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.463829 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.477470 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.535696 4756 scope.go:117] "RemoveContainer" containerID="784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a" Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.536382 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a\": container with ID starting with 784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a not found: ID does not exist" containerID="784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.536414 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a"} err="failed to get container status \"784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a\": rpc error: code = NotFound desc = could not find container \"784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a\": container with ID starting with 784a3d49e60574db2ac4a8d299d338aaf66ed5262ee8a8fa02905d04a265b25a not found: ID does not exist" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.536597 4756 scope.go:117] "RemoveContainer" containerID="4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306" Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.536998 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306\": container with ID starting with 4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306 not found: ID does not exist" containerID="4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306" Dec 05 18:01:06 crc 
kubenswrapper[4756]: I1205 18:01:06.537020 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306"} err="failed to get container status \"4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306\": rpc error: code = NotFound desc = could not find container \"4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306\": container with ID starting with 4c5bbd482e18edf23a5fceec1625fdf0f583bca42dee6b8cab74f7fa5d2b2306 not found: ID does not exist" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.537034 4756 scope.go:117] "RemoveContainer" containerID="70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.548364 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.548783 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerName="glance-httpd" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.548807 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerName="glance-httpd" Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.548837 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3faa564c-7313-40f9-9cad-ed666212dca9" containerName="keystone-cron" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.548844 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3faa564c-7313-40f9-9cad-ed666212dca9" containerName="keystone-cron" Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.548863 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerName="glance-log" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.548871 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerName="glance-log" Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.548886 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerName="glance-log" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.548891 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerName="glance-log" Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.548901 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerName="glance-httpd" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.548907 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerName="glance-httpd" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.549085 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerName="glance-log" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.549105 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3faa564c-7313-40f9-9cad-ed666212dca9" containerName="keystone-cron" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.549122 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerName="glance-log" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.549133 4756 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" containerName="glance-httpd" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.549145 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" containerName="glance-httpd" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.553001 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.559707 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.560515 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.560643 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6clvv" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.560761 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.561465 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.570455 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.597706 4756 scope.go:117] "RemoveContainer" containerID="e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.597863 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.603767 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.605619 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.608162 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.610469 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.613493 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.645319 4756 scope.go:117] "RemoveContainer" containerID="70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6" Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.646914 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6\": container with ID starting with 70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6 not found: ID does not exist" containerID="70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.646946 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6"} err="failed to get container status \"70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6\": rpc error: code = NotFound desc = could not find container \"70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6\": container with ID starting with 70c1492faaff25046b0aed79e6f66de280b03cd15014e4e60dcc768b33ddd5a6 not found: ID does not exist" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.646968 4756 scope.go:117] "RemoveContainer" containerID="e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663" Dec 05 18:01:06 crc kubenswrapper[4756]: E1205 18:01:06.647230 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663\": container with ID starting with e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663 not found: ID does not exist" containerID="e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.647313 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663"} err="failed to get container status \"e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663\": rpc error: code = NotFound desc = could not find container \"e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663\": container with ID starting with e973bba57429ed6b80ce822780a242faea92ba92c81f938c3cc46a1d04822663 not found: ID does not exist" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736369 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736432 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-scripts\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736455 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c997de36-5bb1-4e49-860f-b1e90480b111-logs\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-config-data\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736688 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60360fa1-9e1f-4cb0-b41d-1f4701bef286-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736820 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736866 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c997de36-5bb1-4e49-860f-b1e90480b111-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736896 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.736949 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.737072 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdc4\" (UniqueName: \"kubernetes.io/projected/60360fa1-9e1f-4cb0-b41d-1f4701bef286-kube-api-access-2qdc4\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " 
pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.737122 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60360fa1-9e1f-4cb0-b41d-1f4701bef286-logs\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.737163 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdb5g\" (UniqueName: \"kubernetes.io/projected/c997de36-5bb1-4e49-860f-b1e90480b111-kube-api-access-xdb5g\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.737195 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.737212 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.847898 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-scripts\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848190 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c997de36-5bb1-4e49-860f-b1e90480b111-logs\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848248 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-config-data\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848358 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60360fa1-9e1f-4cb0-b41d-1f4701bef286-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848502 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848547 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c997de36-5bb1-4e49-860f-b1e90480b111-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848578 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848601 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848707 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdc4\" (UniqueName: \"kubernetes.io/projected/60360fa1-9e1f-4cb0-b41d-1f4701bef286-kube-api-access-2qdc4\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848760 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60360fa1-9e1f-4cb0-b41d-1f4701bef286-logs\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848807 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdb5g\" (UniqueName: \"kubernetes.io/projected/c997de36-5bb1-4e49-860f-b1e90480b111-kube-api-access-xdb5g\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848839 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848859 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.848935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.853175 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.857670 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-scripts\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.858907 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c997de36-5bb1-4e49-860f-b1e90480b111-logs\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.863504 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-config-data\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.863824 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60360fa1-9e1f-4cb0-b41d-1f4701bef286-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.867718 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.867954 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c997de36-5bb1-4e49-860f-b1e90480b111-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.871458 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.873804 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60360fa1-9e1f-4cb0-b41d-1f4701bef286-logs\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.879801 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.882452 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c997de36-5bb1-4e49-860f-b1e90480b111-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.893842 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdb5g\" (UniqueName: \"kubernetes.io/projected/c997de36-5bb1-4e49-860f-b1e90480b111-kube-api-access-xdb5g\") pod \"glance-default-internal-api-0\" (UID: \"c997de36-5bb1-4e49-860f-b1e90480b111\") " pod="openstack/glance-default-internal-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.912411 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60360fa1-9e1f-4cb0-b41d-1f4701bef286-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.926964 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdc4\" (UniqueName: \"kubernetes.io/projected/60360fa1-9e1f-4cb0-b41d-1f4701bef286-kube-api-access-2qdc4\") pod \"glance-default-external-api-0\" (UID: \"60360fa1-9e1f-4cb0-b41d-1f4701bef286\") " pod="openstack/glance-default-external-api-0" Dec 05 18:01:06 crc kubenswrapper[4756]: I1205 18:01:06.932134 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.010051 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nw2x8"] Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.014178 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.032075 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nw2x8"] Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.044148 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-tsnnf"] Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.072569 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-tsnnf"] Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.156365 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-catalog-content\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.156535 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-utilities\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.156563 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwp6\" (UniqueName: \"kubernetes.io/projected/a26952e7-5644-449d-96b8-182cf1046e2e-kube-api-access-bmwp6\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.201001 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.258324 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-utilities\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.258781 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwp6\" (UniqueName: \"kubernetes.io/projected/a26952e7-5644-449d-96b8-182cf1046e2e-kube-api-access-bmwp6\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.258878 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-catalog-content\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.259506 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-catalog-content\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.259847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-utilities\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.283206 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwp6\" (UniqueName: \"kubernetes.io/projected/a26952e7-5644-449d-96b8-182cf1046e2e-kube-api-access-bmwp6\") pod \"redhat-operators-nw2x8\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.373533 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:07 crc kubenswrapper[4756]: I1205 18:01:07.742370 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 18:01:08 crc kubenswrapper[4756]: I1205 18:01:08.083249 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e" path="/var/lib/kubelet/pods/2efc9f9e-f63b-4a57-a61e-de7e66ebbf6e/volumes" Dec 05 18:01:08 crc kubenswrapper[4756]: I1205 18:01:08.085975 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82e4ff41-9e2e-4da3-9431-8e3706464313" path="/var/lib/kubelet/pods/82e4ff41-9e2e-4da3-9431-8e3706464313/volumes" Dec 05 18:01:08 crc kubenswrapper[4756]: I1205 18:01:08.087287 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec4ed1b-f3fe-4135-be28-524ecef6a200" path="/var/lib/kubelet/pods/dec4ed1b-f3fe-4135-be28-524ecef6a200/volumes" Dec 05 18:01:08 crc kubenswrapper[4756]: I1205 18:01:08.088114 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 18:01:08 crc kubenswrapper[4756]: I1205 18:01:08.167056 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nw2x8"] Dec 05 18:01:08 crc kubenswrapper[4756]: I1205 18:01:08.447651 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c997de36-5bb1-4e49-860f-b1e90480b111","Type":"ContainerStarted","Data":"786189e43a9b8cce75c34a3e887980c011c56cf45d98a44855183d5188263852"} Dec 05 18:01:08 crc kubenswrapper[4756]: I1205 18:01:08.694687 4756 scope.go:117] "RemoveContainer" containerID="49b25f6f134f5b8f594e82a24f36dec2f2d71abf99696f10b5ca8a0deb086007" Dec 05 18:01:09 crc kubenswrapper[4756]: I1205 18:01:09.473330 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c997de36-5bb1-4e49-860f-b1e90480b111","Type":"ContainerStarted","Data":"70581b92a8f66ef1e191ce79e30e9ee22483b8928a328a5e4aaea9cbb6e2e771"} Dec 05 18:01:13 crc kubenswrapper[4756]: W1205 18:01:13.970036 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26952e7_5644_449d_96b8_182cf1046e2e.slice/crio-6058046cad88654253a9df6efc991d41f2d076252de3e44df77e493d01811a8c WatchSource:0}: Error finding container 6058046cad88654253a9df6efc991d41f2d076252de3e44df77e493d01811a8c: Status 404 returned error can't find the container with id 6058046cad88654253a9df6efc991d41f2d076252de3e44df77e493d01811a8c Dec 05 18:01:14 crc kubenswrapper[4756]: I1205 18:01:14.002639 4756 scope.go:117] "RemoveContainer" containerID="5eddb4965271073be520495158566283f0528c821f2d8f8cc0c95ff1c70d69de" Dec 05 18:01:14 crc kubenswrapper[4756]: I1205 18:01:14.152770 4756 scope.go:117] "RemoveContainer" containerID="adb928f4fefb1da8a69a78e41afe558619d6cf18bcfb9113d8586ff8f77a2051" Dec 05 18:01:14 crc kubenswrapper[4756]: I1205 18:01:14.551507 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"60360fa1-9e1f-4cb0-b41d-1f4701bef286","Type":"ContainerStarted","Data":"5e8cb5dcfef25462df9d2deffba8d5e1777846e0cf90809351ebd559961af8bf"} Dec 05 18:01:14 crc kubenswrapper[4756]: I1205 18:01:14.552975 4756 generic.go:334] "Generic (PLEG): container finished" podID="a26952e7-5644-449d-96b8-182cf1046e2e" 
containerID="b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba" exitCode=0 Dec 05 18:01:14 crc kubenswrapper[4756]: I1205 18:01:14.553003 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2x8" event={"ID":"a26952e7-5644-449d-96b8-182cf1046e2e","Type":"ContainerDied","Data":"b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba"} Dec 05 18:01:14 crc kubenswrapper[4756]: I1205 18:01:14.553020 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2x8" event={"ID":"a26952e7-5644-449d-96b8-182cf1046e2e","Type":"ContainerStarted","Data":"6058046cad88654253a9df6efc991d41f2d076252de3e44df77e493d01811a8c"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.564951 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd4d6b77-zlxrg" event={"ID":"1d4dcfe0-9012-4762-a563-fb1ae5a47b46","Type":"ContainerStarted","Data":"3a5665d1aac79af6daccd4f16ee9d09b7b0aecae2ab2031dee52922331df7daa"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.565391 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd4d6b77-zlxrg" event={"ID":"1d4dcfe0-9012-4762-a563-fb1ae5a47b46","Type":"ContainerStarted","Data":"4f1bf781c73e8d66b88876c380894e290fb75ee472fe541a56ad13ff3809c291"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.565050 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68cd4d6b77-zlxrg" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerName="horizon-log" containerID="cri-o://4f1bf781c73e8d66b88876c380894e290fb75ee472fe541a56ad13ff3809c291" gracePeriod=30 Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.565718 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68cd4d6b77-zlxrg" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerName="horizon" containerID="cri-o://3a5665d1aac79af6daccd4f16ee9d09b7b0aecae2ab2031dee52922331df7daa" gracePeriod=30 Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.569178 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"60360fa1-9e1f-4cb0-b41d-1f4701bef286","Type":"ContainerStarted","Data":"a0a88c50f7705da6a1361dc3aec755fe42417b63ae66135de584512e0b8ef415"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.569309 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"60360fa1-9e1f-4cb0-b41d-1f4701bef286","Type":"ContainerStarted","Data":"4d2c0ae1a7fc77acf25f4664ac294b73b7997bdd4b3efe443f69686036839289"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.573374 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf757d7d6-ltbqn" event={"ID":"1acc47c0-70db-432a-b627-44a07ac96488","Type":"ContainerStarted","Data":"2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.573503 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf757d7d6-ltbqn" event={"ID":"1acc47c0-70db-432a-b627-44a07ac96488","Type":"ContainerStarted","Data":"043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.577268 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2x8" 
event={"ID":"a26952e7-5644-449d-96b8-182cf1046e2e","Type":"ContainerStarted","Data":"0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.579334 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c997de36-5bb1-4e49-860f-b1e90480b111","Type":"ContainerStarted","Data":"106f90edf1b089dddfe436bb9435ea4f6ddde4d2920d2efecca811b6462e2e9f"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.581108 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5497588c66-pwmcj" event={"ID":"b47eed2d-c027-4235-893e-a92f7ad54d17","Type":"ContainerStarted","Data":"a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.581139 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5497588c66-pwmcj" event={"ID":"b47eed2d-c027-4235-893e-a92f7ad54d17","Type":"ContainerStarted","Data":"f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.582795 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76854b6f89-9g6bc" event={"ID":"5e8e9465-2f77-4b84-b96d-a3c2768d0548","Type":"ContainerStarted","Data":"e3e1bab23b2450d1f9d2e8e682eb7be8c4b192425c3c7cc1b7d0aeed288dbd9c"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.582822 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76854b6f89-9g6bc" event={"ID":"5e8e9465-2f77-4b84-b96d-a3c2768d0548","Type":"ContainerStarted","Data":"f92f5fbbf49b4f907d05a3a5795a2d8997ac0a07abbcf823583d35c7473153bb"} Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.582908 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76854b6f89-9g6bc" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerName="horizon-log" containerID="cri-o://f92f5fbbf49b4f907d05a3a5795a2d8997ac0a07abbcf823583d35c7473153bb" gracePeriod=30 Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.582994 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76854b6f89-9g6bc" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerName="horizon" containerID="cri-o://e3e1bab23b2450d1f9d2e8e682eb7be8c4b192425c3c7cc1b7d0aeed288dbd9c" gracePeriod=30 Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.593003 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-68cd4d6b77-zlxrg" podStartSLOduration=2.398466833 podStartE2EDuration="13.59298155s" podCreationTimestamp="2025-12-05 18:01:02 +0000 UTC" firstStartedPulling="2025-12-05 18:01:03.026545306 +0000 UTC m=+6323.265218818" lastFinishedPulling="2025-12-05 18:01:14.221060023 +0000 UTC m=+6334.459733535" observedRunningTime="2025-12-05 18:01:15.585983103 +0000 UTC m=+6335.824656615" watchObservedRunningTime="2025-12-05 18:01:15.59298155 +0000 UTC m=+6335.831655062" Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.628214 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5497588c66-pwmcj" podStartSLOduration=3.187502327 podStartE2EDuration="12.628197007s" podCreationTimestamp="2025-12-05 18:01:03 +0000 UTC" firstStartedPulling="2025-12-05 18:01:04.874309875 +0000 UTC m=+6325.112983387" lastFinishedPulling="2025-12-05 18:01:14.315004555 +0000 UTC m=+6334.553678067" observedRunningTime="2025-12-05 18:01:15.618212971 +0000 
UTC m=+6335.856886483" watchObservedRunningTime="2025-12-05 18:01:15.628197007 +0000 UTC m=+6335.866870519" Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.664656 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.664632598 podStartE2EDuration="9.664632598s" podCreationTimestamp="2025-12-05 18:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:01:15.655000781 +0000 UTC m=+6335.893674303" watchObservedRunningTime="2025-12-05 18:01:15.664632598 +0000 UTC m=+6335.903306110" Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.708804 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.708784424 podStartE2EDuration="9.708784424s" podCreationTimestamp="2025-12-05 18:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:01:15.699107716 +0000 UTC m=+6335.937781238" watchObservedRunningTime="2025-12-05 18:01:15.708784424 +0000 UTC m=+6335.947457936" Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.807404 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cf757d7d6-ltbqn" podStartSLOduration=2.766459697 podStartE2EDuration="11.80738863s" podCreationTimestamp="2025-12-05 18:01:04 +0000 UTC" firstStartedPulling="2025-12-05 18:01:05.178233129 +0000 UTC m=+6325.416906641" lastFinishedPulling="2025-12-05 18:01:14.219162062 +0000 UTC m=+6334.457835574" observedRunningTime="2025-12-05 18:01:15.768612567 +0000 UTC m=+6336.007286099" watchObservedRunningTime="2025-12-05 18:01:15.80738863 +0000 UTC m=+6336.046062132" Dec 05 18:01:15 crc kubenswrapper[4756]: I1205 18:01:15.840645 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76854b6f89-9g6bc" podStartSLOduration=3.441707906 podStartE2EDuration="14.840627875s" podCreationTimestamp="2025-12-05 18:01:01 +0000 UTC" firstStartedPulling="2025-12-05 18:01:02.808906791 +0000 UTC m=+6323.047580303" lastFinishedPulling="2025-12-05 18:01:14.20782676 +0000 UTC m=+6334.446500272" observedRunningTime="2025-12-05 18:01:15.830401552 +0000 UTC m=+6336.069075064" watchObservedRunningTime="2025-12-05 18:01:15.840627875 +0000 UTC m=+6336.079301377" Dec 05 18:01:16 crc kubenswrapper[4756]: I1205 18:01:16.594577 4756 generic.go:334] "Generic (PLEG): container finished" podID="a26952e7-5644-449d-96b8-182cf1046e2e" containerID="0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273" exitCode=0 Dec 05 18:01:16 crc kubenswrapper[4756]: I1205 18:01:16.594654 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2x8" event={"ID":"a26952e7-5644-449d-96b8-182cf1046e2e","Type":"ContainerDied","Data":"0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273"} Dec 05 18:01:16 crc kubenswrapper[4756]: I1205 18:01:16.932454 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:16 crc kubenswrapper[4756]: I1205 18:01:16.933009 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:16 crc kubenswrapper[4756]: I1205 18:01:16.963136 4756 kubelet.go:2542] "SyncLoop (probe)" 
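
The pod_startup_latency_tracker entries above report two figures: podStartE2EDuration is wall-clock time from pod creation to observed running, while podStartSLOduration excludes the image-pull window (firstStartedPulling to lastFinishedPulling), which is how horizon-68cd4d6b77-zlxrg shows 13.59s end-to-end but only 2.40s against the SLO. A quick check in Python, using only the values logged above (nanoseconds truncated to microseconds, so the result agrees to roughly a microsecond):

```python
from datetime import datetime

# Re-derive podStartSLOduration for horizon-68cd4d6b77-zlxrg from the
# logged timestamps: SLO duration = E2E duration minus the pull window.
fmt = "%Y-%m-%d %H:%M:%S.%f"
first_pull = datetime.strptime("2025-12-05 18:01:03.026545306"[:26], fmt)
last_pull = datetime.strptime("2025-12-05 18:01:14.221060023"[:26], fmt)
e2e = 13.59298155                              # podStartE2EDuration
pull_window = (last_pull - first_pull).total_seconds()
print(e2e - pull_window)                       # ~2.398467, the logged podStartSLOduration
```

The two glance pods report identical SLO and E2E durations because their pull timestamps are the zero value ("0001-01-01 00:00:00"), i.e. no image pull was needed.
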
probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:16 crc kubenswrapper[4756]: I1205 18:01:16.998517 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.201890 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.202190 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.250215 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.250677 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.627804 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2x8" event={"ID":"a26952e7-5644-449d-96b8-182cf1046e2e","Type":"ContainerStarted","Data":"250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857"} Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.627993 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.628997 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.629023 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.629034 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:17 crc kubenswrapper[4756]: I1205 18:01:17.657839 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nw2x8" podStartSLOduration=9.141773822 podStartE2EDuration="11.657818299s" podCreationTimestamp="2025-12-05 18:01:06 +0000 UTC" firstStartedPulling="2025-12-05 18:01:14.555075248 +0000 UTC m=+6334.793748760" lastFinishedPulling="2025-12-05 18:01:17.071119725 +0000 UTC m=+6337.309793237" observedRunningTime="2025-12-05 18:01:17.646321353 +0000 UTC m=+6337.884994865" watchObservedRunningTime="2025-12-05 18:01:17.657818299 +0000 UTC m=+6337.896491821" Dec 05 18:01:19 crc kubenswrapper[4756]: I1205 18:01:19.650631 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 18:01:20 crc kubenswrapper[4756]: I1205 18:01:20.004245 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:20 crc kubenswrapper[4756]: I1205 18:01:20.016887 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.295781 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:01:22 crc 
kubenswrapper[4756]: I1205 18:01:22.296304 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.296400 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.297863 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32339976c849fb944e24e4a905b3c0b31eadc92ee27ed11d626318af61e21b3a"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.297980 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://32339976c849fb944e24e4a905b3c0b31eadc92ee27ed11d626318af61e21b3a" gracePeriod=600 Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.303397 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76854b6f89-9g6bc" Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.468221 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68cd4d6b77-zlxrg" Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.684487 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="32339976c849fb944e24e4a905b3c0b31eadc92ee27ed11d626318af61e21b3a" exitCode=0 Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.684528 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"32339976c849fb944e24e4a905b3c0b31eadc92ee27ed11d626318af61e21b3a"} Dec 05 18:01:22 crc kubenswrapper[4756]: I1205 18:01:22.684958 4756 scope.go:117] "RemoveContainer" containerID="0fec291fb85db762e3a0a3b787ad91e21392cadcb7285d171d966367bc1e032a" Dec 05 18:01:23 crc kubenswrapper[4756]: I1205 18:01:23.696164 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"} Dec 05 18:01:24 crc kubenswrapper[4756]: I1205 18:01:24.341917 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:24 crc kubenswrapper[4756]: I1205 18:01:24.341964 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5497588c66-pwmcj" Dec 05 18:01:24 crc kubenswrapper[4756]: I1205 18:01:24.344252 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5497588c66-pwmcj" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.119:8443/dashboard/auth/login/?next=/dashboard/\": dial 
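
Above, a failed liveness probe (connection refused on 127.0.0.1:8798) makes the kubelet kill machine-config-daemon with its termination grace period (gracePeriod=600) and restart it in place: the old container 32339976... exits 0 and a new one, c96aba89..., starts a second later, while RemoveContainer garbage-collects the earlier dead instance 0fec291f.... A rough model of the HTTP check the prober performed, assuming nothing beyond the URL in the logged output:

```python
import urllib.request, urllib.error

# Approximate the kubelet's HTTP liveness check: a 2xx/3xx response is
# healthy; connection refused (as in the log) counts as a failure.
def probe(url: str, timeout: float = 1.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        return False  # e.g. "connect: connection refused" mid-restart

print(probe("http://127.0.0.1:8798/health"))
```

Note the contrast with the readiness probes around it: a readiness failure only withdraws the pod from endpoints, whereas a liveness failure triggers the restart seen here.
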
Dec 05 18:01:24 crc kubenswrapper[4756]: I1205 18:01:24.687934 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cf757d7d6-ltbqn"
Dec 05 18:01:24 crc kubenswrapper[4756]: I1205 18:01:24.689010 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5cf757d7d6-ltbqn"
Dec 05 18:01:24 crc kubenswrapper[4756]: I1205 18:01:24.690641 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5cf757d7d6-ltbqn" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused"
Dec 05 18:01:27 crc kubenswrapper[4756]: I1205 18:01:27.374688 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nw2x8"
Dec 05 18:01:27 crc kubenswrapper[4756]: I1205 18:01:27.375090 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nw2x8"
Dec 05 18:01:28 crc kubenswrapper[4756]: I1205 18:01:28.425248 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nw2x8" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="registry-server" probeResult="failure" output=<
Dec 05 18:01:28 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s
Dec 05 18:01:28 crc kubenswrapper[4756]: >
Dec 05 18:01:36 crc kubenswrapper[4756]: I1205 18:01:36.194667 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5497588c66-pwmcj"
Dec 05 18:01:36 crc kubenswrapper[4756]: I1205 18:01:36.577797 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5cf757d7d6-ltbqn"
Dec 05 18:01:37 crc kubenswrapper[4756]: I1205 18:01:37.472833 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nw2x8"
Dec 05 18:01:37 crc kubenswrapper[4756]: I1205 18:01:37.554142 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nw2x8"
Dec 05 18:01:37 crc kubenswrapper[4756]: I1205 18:01:37.841842 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5497588c66-pwmcj"
Dec 05 18:01:38 crc kubenswrapper[4756]: I1205 18:01:38.125718 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5cf757d7d6-ltbqn"
Dec 05 18:01:38 crc kubenswrapper[4756]: I1205 18:01:38.176037 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nw2x8"]
Dec 05 18:01:38 crc kubenswrapper[4756]: I1205 18:01:38.189365 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5497588c66-pwmcj"]
Dec 05 18:01:38 crc kubenswrapper[4756]: I1205 18:01:38.189597 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5497588c66-pwmcj" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon-log" containerID="cri-o://f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418" gracePeriod=30
Dec 05 18:01:38 crc kubenswrapper[4756]: I1205 18:01:38.190018 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5497588c66-pwmcj" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" containerID="cri-o://a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c" gracePeriod=30
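
The registry-server probe failure above uses klog's block form (output=< ... >) for a multi-line value, and the message "timeout: failed to connect service \":50051\" within 1s" is the shape produced by a gRPC health-probe style exec check against the catalog port. A connectivity sketch in the same spirit; the pod IP below is hypothetical since the log never shows one, and a plain TCP connect stands in for the real gRPC health check:

```python
import socket

# Can we reach the registry-server gRPC port within 1s? A TCP connect is
# a simplification of the actual gRPC health check, for illustration only.
def can_connect(host: str, port: int, timeout: float = 1.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(can_connect("10.217.1.121", 50051))  # hypothetical pod IP, not from the log
```
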
podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" containerID="cri-o://a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c" gracePeriod=30 Dec 05 18:01:38 crc kubenswrapper[4756]: I1205 18:01:38.880206 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nw2x8" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="registry-server" containerID="cri-o://250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857" gracePeriod=2 Dec 05 18:01:39 crc kubenswrapper[4756]: E1205 18:01:39.063983 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26952e7_5644_449d_96b8_182cf1046e2e.slice/crio-conmon-250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857.scope\": RecentStats: unable to find data in memory cache]" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.196474 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.379218 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.443205 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.572277 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmwp6\" (UniqueName: \"kubernetes.io/projected/a26952e7-5644-449d-96b8-182cf1046e2e-kube-api-access-bmwp6\") pod \"a26952e7-5644-449d-96b8-182cf1046e2e\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.572604 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-catalog-content\") pod \"a26952e7-5644-449d-96b8-182cf1046e2e\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.572666 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-utilities\") pod \"a26952e7-5644-449d-96b8-182cf1046e2e\" (UID: \"a26952e7-5644-449d-96b8-182cf1046e2e\") " Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.573930 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-utilities" (OuterVolumeSpecName: "utilities") pod "a26952e7-5644-449d-96b8-182cf1046e2e" (UID: "a26952e7-5644-449d-96b8-182cf1046e2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.591614 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26952e7-5644-449d-96b8-182cf1046e2e-kube-api-access-bmwp6" (OuterVolumeSpecName: "kube-api-access-bmwp6") pod "a26952e7-5644-449d-96b8-182cf1046e2e" (UID: "a26952e7-5644-449d-96b8-182cf1046e2e"). InnerVolumeSpecName "kube-api-access-bmwp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.674893 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmwp6\" (UniqueName: \"kubernetes.io/projected/a26952e7-5644-449d-96b8-182cf1046e2e-kube-api-access-bmwp6\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.674924 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.731468 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a26952e7-5644-449d-96b8-182cf1046e2e" (UID: "a26952e7-5644-449d-96b8-182cf1046e2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.777177 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a26952e7-5644-449d-96b8-182cf1046e2e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.891598 4756 generic.go:334] "Generic (PLEG): container finished" podID="a26952e7-5644-449d-96b8-182cf1046e2e" containerID="250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857" exitCode=0 Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.891671 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nw2x8" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.891707 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2x8" event={"ID":"a26952e7-5644-449d-96b8-182cf1046e2e","Type":"ContainerDied","Data":"250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857"} Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.891760 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2x8" event={"ID":"a26952e7-5644-449d-96b8-182cf1046e2e","Type":"ContainerDied","Data":"6058046cad88654253a9df6efc991d41f2d076252de3e44df77e493d01811a8c"} Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.891781 4756 scope.go:117] "RemoveContainer" containerID="250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.932445 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nw2x8"] Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.935390 4756 scope.go:117] "RemoveContainer" containerID="0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273" Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.939763 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nw2x8"] Dec 05 18:01:39 crc kubenswrapper[4756]: I1205 18:01:39.960999 4756 scope.go:117] "RemoveContainer" containerID="b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba" Dec 05 18:01:40 crc kubenswrapper[4756]: I1205 18:01:40.046241 4756 scope.go:117] "RemoveContainer" containerID="250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857" Dec 05 18:01:40 crc kubenswrapper[4756]: E1205 18:01:40.047313 4756 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857\": container with ID starting with 250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857 not found: ID does not exist" containerID="250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857" Dec 05 18:01:40 crc kubenswrapper[4756]: I1205 18:01:40.047360 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857"} err="failed to get container status \"250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857\": rpc error: code = NotFound desc = could not find container \"250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857\": container with ID starting with 250da5b529daab83abad7b28ed478cb46ab79e52de9024742870fc2403eff857 not found: ID does not exist" Dec 05 18:01:40 crc kubenswrapper[4756]: I1205 18:01:40.047387 4756 scope.go:117] "RemoveContainer" containerID="0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273" Dec 05 18:01:40 crc kubenswrapper[4756]: E1205 18:01:40.048022 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273\": container with ID starting with 0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273 not found: ID does not exist" containerID="0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273" Dec 05 18:01:40 crc kubenswrapper[4756]: I1205 18:01:40.048065 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273"} err="failed to get container status \"0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273\": rpc error: code = NotFound desc = could not find container \"0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273\": container with ID starting with 0b4881ceca2229b3e35095f7d1140f4da85d6049b83f177a6014225ce2274273 not found: ID does not exist" Dec 05 18:01:40 crc kubenswrapper[4756]: I1205 18:01:40.048099 4756 scope.go:117] "RemoveContainer" containerID="b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba" Dec 05 18:01:40 crc kubenswrapper[4756]: E1205 18:01:40.051516 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba\": container with ID starting with b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba not found: ID does not exist" containerID="b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba" Dec 05 18:01:40 crc kubenswrapper[4756]: I1205 18:01:40.051557 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba"} err="failed to get container status \"b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba\": rpc error: code = NotFound desc = could not find container \"b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba\": container with ID starting with b8e4b9532891d7a2cbe3320ae5a58c840ed35294df2113d77cbc8748c06e87ba not found: ID does not exist" Dec 05 18:01:40 crc kubenswrapper[4756]: I1205 18:01:40.093972 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod 
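
The NotFound errors above are benign: the containers were already removed, so the follow-up ContainerStatus lookup finds nothing and pod_container_deletor just logs and moves on. The earlier cadvisor "RecentStats: unable to find data in memory cache" error is the same kind of teardown race, stats requested for a cgroup that had just vanished. The same treat-absence-as-success idempotency, sketched against the API with the Python kubernetes client (the client call is real; treating 404 as done is the general pattern, not this log's literal operation):

```python
from kubernetes import client, config
from kubernetes.client.rest import ApiException

# Idempotent delete: a 404 means the object is already gone, which is
# exactly the outcome a delete was asked to produce.
config.load_kube_config()
v1 = client.CoreV1Api()
try:
    v1.delete_namespaced_pod("redhat-operators-nw2x8", "openshift-marketplace")
except ApiException as e:
    if e.status != 404:
        raise  # anything other than "already gone" is a real error
```
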
volumes dir" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" path="/var/lib/kubelet/pods/a26952e7-5644-449d-96b8-182cf1046e2e/volumes" Dec 05 18:01:41 crc kubenswrapper[4756]: I1205 18:01:41.921588 4756 generic.go:334] "Generic (PLEG): container finished" podID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerID="a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c" exitCode=0 Dec 05 18:01:41 crc kubenswrapper[4756]: I1205 18:01:41.921829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5497588c66-pwmcj" event={"ID":"b47eed2d-c027-4235-893e-a92f7ad54d17","Type":"ContainerDied","Data":"a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c"} Dec 05 18:01:44 crc kubenswrapper[4756]: I1205 18:01:44.342761 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5497588c66-pwmcj" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.119:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8443: connect: connection refused" Dec 05 18:01:45 crc kubenswrapper[4756]: I1205 18:01:45.981804 4756 generic.go:334] "Generic (PLEG): container finished" podID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerID="e3e1bab23b2450d1f9d2e8e682eb7be8c4b192425c3c7cc1b7d0aeed288dbd9c" exitCode=137 Dec 05 18:01:45 crc kubenswrapper[4756]: I1205 18:01:45.982159 4756 generic.go:334] "Generic (PLEG): container finished" podID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerID="f92f5fbbf49b4f907d05a3a5795a2d8997ac0a07abbcf823583d35c7473153bb" exitCode=137 Dec 05 18:01:45 crc kubenswrapper[4756]: I1205 18:01:45.982063 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76854b6f89-9g6bc" event={"ID":"5e8e9465-2f77-4b84-b96d-a3c2768d0548","Type":"ContainerDied","Data":"e3e1bab23b2450d1f9d2e8e682eb7be8c4b192425c3c7cc1b7d0aeed288dbd9c"} Dec 05 18:01:45 crc kubenswrapper[4756]: I1205 18:01:45.982217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76854b6f89-9g6bc" event={"ID":"5e8e9465-2f77-4b84-b96d-a3c2768d0548","Type":"ContainerDied","Data":"f92f5fbbf49b4f907d05a3a5795a2d8997ac0a07abbcf823583d35c7473153bb"} Dec 05 18:01:45 crc kubenswrapper[4756]: I1205 18:01:45.987125 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerID="3a5665d1aac79af6daccd4f16ee9d09b7b0aecae2ab2031dee52922331df7daa" exitCode=137 Dec 05 18:01:45 crc kubenswrapper[4756]: I1205 18:01:45.987185 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerID="4f1bf781c73e8d66b88876c380894e290fb75ee472fe541a56ad13ff3809c291" exitCode=137 Dec 05 18:01:45 crc kubenswrapper[4756]: I1205 18:01:45.987223 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd4d6b77-zlxrg" event={"ID":"1d4dcfe0-9012-4762-a563-fb1ae5a47b46","Type":"ContainerDied","Data":"3a5665d1aac79af6daccd4f16ee9d09b7b0aecae2ab2031dee52922331df7daa"} Dec 05 18:01:45 crc kubenswrapper[4756]: I1205 18:01:45.987268 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd4d6b77-zlxrg" event={"ID":"1d4dcfe0-9012-4762-a563-fb1ae5a47b46","Type":"ContainerDied","Data":"4f1bf781c73e8d66b88876c380894e290fb75ee472fe541a56ad13ff3809c291"} Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.164655 4756 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.221247 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68cd4d6b77-zlxrg"
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.238556 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-config-data\") pod \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.240627 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxxdj\" (UniqueName: \"kubernetes.io/projected/5e8e9465-2f77-4b84-b96d-a3c2768d0548-kube-api-access-rxxdj\") pod \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.240866 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-scripts\") pod \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.241020 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e8e9465-2f77-4b84-b96d-a3c2768d0548-horizon-secret-key\") pod \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.241163 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e8e9465-2f77-4b84-b96d-a3c2768d0548-logs\") pod \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\" (UID: \"5e8e9465-2f77-4b84-b96d-a3c2768d0548\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.243183 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e8e9465-2f77-4b84-b96d-a3c2768d0548-logs" (OuterVolumeSpecName: "logs") pod "5e8e9465-2f77-4b84-b96d-a3c2768d0548" (UID: "5e8e9465-2f77-4b84-b96d-a3c2768d0548"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.249081 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e8e9465-2f77-4b84-b96d-a3c2768d0548-kube-api-access-rxxdj" (OuterVolumeSpecName: "kube-api-access-rxxdj") pod "5e8e9465-2f77-4b84-b96d-a3c2768d0548" (UID: "5e8e9465-2f77-4b84-b96d-a3c2768d0548"). InnerVolumeSpecName "kube-api-access-rxxdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.249705 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8e9465-2f77-4b84-b96d-a3c2768d0548-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5e8e9465-2f77-4b84-b96d-a3c2768d0548" (UID: "5e8e9465-2f77-4b84-b96d-a3c2768d0548"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.287847 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-scripts" (OuterVolumeSpecName: "scripts") pod "5e8e9465-2f77-4b84-b96d-a3c2768d0548" (UID: "5e8e9465-2f77-4b84-b96d-a3c2768d0548"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.287968 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-config-data" (OuterVolumeSpecName: "config-data") pod "5e8e9465-2f77-4b84-b96d-a3c2768d0548" (UID: "5e8e9465-2f77-4b84-b96d-a3c2768d0548"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.344395 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-config-data\") pod \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.344531 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-scripts\") pod \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.344581 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hk8b\" (UniqueName: \"kubernetes.io/projected/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-kube-api-access-9hk8b\") pod \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.344643 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-logs\") pod \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.344692 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-horizon-secret-key\") pod \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\" (UID: \"1d4dcfe0-9012-4762-a563-fb1ae5a47b46\") "
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.345243 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.345267 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxxdj\" (UniqueName: \"kubernetes.io/projected/5e8e9465-2f77-4b84-b96d-a3c2768d0548-kube-api-access-rxxdj\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.345278 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e8e9465-2f77-4b84-b96d-a3c2768d0548-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.345289 4756 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5e8e9465-2f77-4b84-b96d-a3c2768d0548-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.345301 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e8e9465-2f77-4b84-b96d-a3c2768d0548-logs\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.345993 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-logs" (OuterVolumeSpecName: "logs") pod "1d4dcfe0-9012-4762-a563-fb1ae5a47b46" (UID: "1d4dcfe0-9012-4762-a563-fb1ae5a47b46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.347805 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-kube-api-access-9hk8b" (OuterVolumeSpecName: "kube-api-access-9hk8b") pod "1d4dcfe0-9012-4762-a563-fb1ae5a47b46" (UID: "1d4dcfe0-9012-4762-a563-fb1ae5a47b46"). InnerVolumeSpecName "kube-api-access-9hk8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.349252 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1d4dcfe0-9012-4762-a563-fb1ae5a47b46" (UID: "1d4dcfe0-9012-4762-a563-fb1ae5a47b46"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.373164 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-config-data" (OuterVolumeSpecName: "config-data") pod "1d4dcfe0-9012-4762-a563-fb1ae5a47b46" (UID: "1d4dcfe0-9012-4762-a563-fb1ae5a47b46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.384944 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-scripts" (OuterVolumeSpecName: "scripts") pod "1d4dcfe0-9012-4762-a563-fb1ae5a47b46" (UID: "1d4dcfe0-9012-4762-a563-fb1ae5a47b46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.447263 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.447602 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hk8b\" (UniqueName: \"kubernetes.io/projected/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-kube-api-access-9hk8b\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.447612 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-logs\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.447621 4756 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:46 crc kubenswrapper[4756]: I1205 18:01:46.447630 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d4dcfe0-9012-4762-a563-fb1ae5a47b46-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.000673 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76854b6f89-9g6bc" event={"ID":"5e8e9465-2f77-4b84-b96d-a3c2768d0548","Type":"ContainerDied","Data":"df464487a233f54c7be703593251bcc7e538f75f4ff0ab5d43e3656388d6abd7"}
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.000727 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76854b6f89-9g6bc"
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.002376 4756 scope.go:117] "RemoveContainer" containerID="e3e1bab23b2450d1f9d2e8e682eb7be8c4b192425c3c7cc1b7d0aeed288dbd9c"
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.002864 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd4d6b77-zlxrg" event={"ID":"1d4dcfe0-9012-4762-a563-fb1ae5a47b46","Type":"ContainerDied","Data":"3601131a9d52f06db0afdfd2165c83df179982d088d0c199e57c2309b64f34db"}
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.002958 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68cd4d6b77-zlxrg"
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.050063 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76854b6f89-9g6bc"]
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.057188 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76854b6f89-9g6bc"]
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.082328 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68cd4d6b77-zlxrg"]
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.090809 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68cd4d6b77-zlxrg"]
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.276479 4756 scope.go:117] "RemoveContainer" containerID="f92f5fbbf49b4f907d05a3a5795a2d8997ac0a07abbcf823583d35c7473153bb"
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.316144 4756 scope.go:117] "RemoveContainer" containerID="3a5665d1aac79af6daccd4f16ee9d09b7b0aecae2ab2031dee52922331df7daa"
Dec 05 18:01:47 crc kubenswrapper[4756]: I1205 18:01:47.494172 4756 scope.go:117] "RemoveContainer" containerID="4f1bf781c73e8d66b88876c380894e290fb75ee472fe541a56ad13ff3809c291"
Dec 05 18:01:48 crc kubenswrapper[4756]: I1205 18:01:48.072165 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" path="/var/lib/kubelet/pods/1d4dcfe0-9012-4762-a563-fb1ae5a47b46/volumes"
Dec 05 18:01:48 crc kubenswrapper[4756]: I1205 18:01:48.074183 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" path="/var/lib/kubelet/pods/5e8e9465-2f77-4b84-b96d-a3c2768d0548/volumes"
Dec 05 18:01:54 crc kubenswrapper[4756]: I1205 18:01:54.342258 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5497588c66-pwmcj" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.119:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8443: connect: connection refused"
Dec 05 18:02:04 crc kubenswrapper[4756]: I1205 18:02:04.347053 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5497588c66-pwmcj" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.119:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8443: connect: connection refused"
Dec 05 18:02:04 crc kubenswrapper[4756]: I1205 18:02:04.347606 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5497588c66-pwmcj"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.119339 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5497588c66-pwmcj"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.231342 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-scripts\") pod \"b47eed2d-c027-4235-893e-a92f7ad54d17\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") "
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.231550 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmg4v\" (UniqueName: \"kubernetes.io/projected/b47eed2d-c027-4235-893e-a92f7ad54d17-kube-api-access-rmg4v\") pod \"b47eed2d-c027-4235-893e-a92f7ad54d17\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") "
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.231724 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-combined-ca-bundle\") pod \"b47eed2d-c027-4235-893e-a92f7ad54d17\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") "
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.231866 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-tls-certs\") pod \"b47eed2d-c027-4235-893e-a92f7ad54d17\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") "
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.231919 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b47eed2d-c027-4235-893e-a92f7ad54d17-logs\") pod \"b47eed2d-c027-4235-893e-a92f7ad54d17\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") "
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.231956 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-config-data\") pod \"b47eed2d-c027-4235-893e-a92f7ad54d17\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") "
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.232028 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-secret-key\") pod \"b47eed2d-c027-4235-893e-a92f7ad54d17\" (UID: \"b47eed2d-c027-4235-893e-a92f7ad54d17\") "
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.232571 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b47eed2d-c027-4235-893e-a92f7ad54d17-logs" (OuterVolumeSpecName: "logs") pod "b47eed2d-c027-4235-893e-a92f7ad54d17" (UID: "b47eed2d-c027-4235-893e-a92f7ad54d17"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.233320 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b47eed2d-c027-4235-893e-a92f7ad54d17-logs\") on node \"crc\" DevicePath \"\""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.237213 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b47eed2d-c027-4235-893e-a92f7ad54d17-kube-api-access-rmg4v" (OuterVolumeSpecName: "kube-api-access-rmg4v") pod "b47eed2d-c027-4235-893e-a92f7ad54d17" (UID: "b47eed2d-c027-4235-893e-a92f7ad54d17"). InnerVolumeSpecName "kube-api-access-rmg4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.237379 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b47eed2d-c027-4235-893e-a92f7ad54d17" (UID: "b47eed2d-c027-4235-893e-a92f7ad54d17"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.266035 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b47eed2d-c027-4235-893e-a92f7ad54d17" (UID: "b47eed2d-c027-4235-893e-a92f7ad54d17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.266650 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-scripts" (OuterVolumeSpecName: "scripts") pod "b47eed2d-c027-4235-893e-a92f7ad54d17" (UID: "b47eed2d-c027-4235-893e-a92f7ad54d17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.286120 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-config-data" (OuterVolumeSpecName: "config-data") pod "b47eed2d-c027-4235-893e-a92f7ad54d17" (UID: "b47eed2d-c027-4235-893e-a92f7ad54d17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.287890 4756 generic.go:334] "Generic (PLEG): container finished" podID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerID="f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418" exitCode=137
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.287924 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5497588c66-pwmcj" event={"ID":"b47eed2d-c027-4235-893e-a92f7ad54d17","Type":"ContainerDied","Data":"f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418"}
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.287949 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5497588c66-pwmcj" event={"ID":"b47eed2d-c027-4235-893e-a92f7ad54d17","Type":"ContainerDied","Data":"e939b5e8c144546217e5027c3e2485e2be99988fcca0042f630f9277e28f63af"}
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.287966 4756 scope.go:117] "RemoveContainer" containerID="a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.288095 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5497588c66-pwmcj"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.303051 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b47eed2d-c027-4235-893e-a92f7ad54d17" (UID: "b47eed2d-c027-4235-893e-a92f7ad54d17"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.334906 4756 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.334940 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.334951 4756 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.334960 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47eed2d-c027-4235-893e-a92f7ad54d17-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.334970 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmg4v\" (UniqueName: \"kubernetes.io/projected/b47eed2d-c027-4235-893e-a92f7ad54d17-kube-api-access-rmg4v\") on node \"crc\" DevicePath \"\""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.334978 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47eed2d-c027-4235-893e-a92f7ad54d17-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.534523 4756 scope.go:117] "RemoveContainer" containerID="f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.563284 4756 scope.go:117] "RemoveContainer" containerID="a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c"
Dec 05 18:02:09 crc kubenswrapper[4756]: E1205 18:02:09.565637 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c\": container with ID starting with a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c not found: ID does not exist" containerID="a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.565714 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c"} err="failed to get container status \"a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c\": rpc error: code = NotFound desc = could not find container \"a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c\": container with ID starting with a2bd7649f32753ad00b9f102c91cc94bf993568573d465bbe5e94a917b423b5c not found: ID does not exist"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.565757 4756 scope.go:117] "RemoveContainer" containerID="f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418"
Dec 05 18:02:09 crc kubenswrapper[4756]: E1205 18:02:09.566096 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418\": container with ID starting with f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418 not found: ID does not exist" containerID="f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.566142 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418"} err="failed to get container status \"f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418\": rpc error: code = NotFound desc = could not find container \"f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418\": container with ID starting with f8c3c0932410d807906d983da7557db80706e6f932fdb720e558d81eecc7f418 not found: ID does not exist"
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.630294 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5497588c66-pwmcj"]
Dec 05 18:02:09 crc kubenswrapper[4756]: I1205 18:02:09.639174 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5497588c66-pwmcj"]
Dec 05 18:02:10 crc kubenswrapper[4756]: I1205 18:02:10.063379 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" path="/var/lib/kubelet/pods/b47eed2d-c027-4235-893e-a92f7ad54d17/volumes"
Dec 05 18:02:14 crc kubenswrapper[4756]: I1205 18:02:14.526586 4756 scope.go:117] "RemoveContainer" containerID="964b87217141f7b3648321f485ef6b05b961a0fb16bbfd29e669988b230b2c51"
Dec 05 18:02:14 crc kubenswrapper[4756]: I1205 18:02:14.562638 4756 scope.go:117] "RemoveContainer" containerID="278e3e3c5860511ba42bfca202cd2625abc7b8680977eef6dd13e571b2f8676a"
Dec 05 18:02:14 crc kubenswrapper[4756]: I1205 18:02:14.606308 4756 scope.go:117] "RemoveContainer" containerID="38c377442d861e04edc89a0063dfc6998a257429d9925d09c3ff8c819be63b98"
Dec 05 18:02:14 crc kubenswrapper[4756]: I1205 18:02:14.637319 4756 scope.go:117] "RemoveContainer" containerID="48e6218aaf647256dd474e8038db8b40fd79bf71a44b644b298e94a9e79992ce"
Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.221506 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6599c94766-cv4cn"]
Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222487 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerName="horizon"
Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222502 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerName="horizon"
Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222537 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="extract-content"
Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222546 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="extract-content"
Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222564 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerName="horizon-log"
Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222573 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerName="horizon-log"
Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222594 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon-log"
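
"SyncLoop ADD" source="api" marks the replacement pod horizon-6599c94766-cv4cn arriving from the API server, and its admission triggers the cpu_manager/state_mem/memory_manager stale-state sweep that continues below, dropping per-container CPU and memory assignments left behind by the pods just removed; the E-severity cpu_manager lines appear to be routine housekeeping despite the error level. To observe the same ADDED/MODIFIED/DELETED pod stream the kubelet's sync loop consumes, a sketch with the Python kubernetes client, assuming any kubeconfig with read access to the namespace:

```python
from kubernetes import client, config, watch

# Stream pod lifecycle events for the openstack namespace, mirroring the
# SyncLoop ADD/DELETE entries in this log.
config.load_kube_config()
v1 = client.CoreV1Api()
w = watch.Watch()
for event in w.stream(v1.list_namespaced_pod, namespace="openstack", timeout_seconds=30):
    print(event["type"], event["object"].metadata.name)
```
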
containerName="horizon-log" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222602 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon-log" Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222623 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerName="horizon" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222631 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerName="horizon" Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222647 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="registry-server" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222655 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="registry-server" Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222674 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222683 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222698 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerName="horizon-log" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222706 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerName="horizon-log" Dec 05 18:02:20 crc kubenswrapper[4756]: E1205 18:02:20.222723 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="extract-utilities" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222733 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="extract-utilities" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222947 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26952e7-5644-449d-96b8-182cf1046e2e" containerName="registry-server" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222971 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerName="horizon-log" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.222988 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8e9465-2f77-4b84-b96d-a3c2768d0548" containerName="horizon" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.223002 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon-log" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.223013 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerName="horizon-log" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.223023 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d4dcfe0-9012-4762-a563-fb1ae5a47b46" containerName="horizon" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.223053 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b47eed2d-c027-4235-893e-a92f7ad54d17" containerName="horizon" Dec 
05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.224341 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.337995 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d180dae-962c-4f30-8d6e-49ba2182673e-logs\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.338539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-horizon-tls-certs\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.338676 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-combined-ca-bundle\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.338812 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwn7m\" (UniqueName: \"kubernetes.io/projected/6d180dae-962c-4f30-8d6e-49ba2182673e-kube-api-access-fwn7m\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.338860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d180dae-962c-4f30-8d6e-49ba2182673e-scripts\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.339029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d180dae-962c-4f30-8d6e-49ba2182673e-config-data\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.339341 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-horizon-secret-key\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.441636 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d180dae-962c-4f30-8d6e-49ba2182673e-logs\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.441683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-horizon-tls-certs\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.441720 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-combined-ca-bundle\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.441761 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwn7m\" (UniqueName: \"kubernetes.io/projected/6d180dae-962c-4f30-8d6e-49ba2182673e-kube-api-access-fwn7m\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.441784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d180dae-962c-4f30-8d6e-49ba2182673e-scripts\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.441825 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d180dae-962c-4f30-8d6e-49ba2182673e-config-data\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.441887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-horizon-secret-key\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.442131 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d180dae-962c-4f30-8d6e-49ba2182673e-logs\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.442696 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d180dae-962c-4f30-8d6e-49ba2182673e-scripts\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.443608 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d180dae-962c-4f30-8d6e-49ba2182673e-config-data\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.449465 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-horizon-tls-certs\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " 
pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.461367 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-horizon-secret-key\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.463670 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d180dae-962c-4f30-8d6e-49ba2182673e-combined-ca-bundle\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.464921 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwn7m\" (UniqueName: \"kubernetes.io/projected/6d180dae-962c-4f30-8d6e-49ba2182673e-kube-api-access-fwn7m\") pod \"horizon-6599c94766-cv4cn\" (UID: \"6d180dae-962c-4f30-8d6e-49ba2182673e\") " pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.517703 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6599c94766-cv4cn"] Dec 05 18:02:20 crc kubenswrapper[4756]: I1205 18:02:20.551891 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:21 crc kubenswrapper[4756]: I1205 18:02:21.060835 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6599c94766-cv4cn"] Dec 05 18:02:21 crc kubenswrapper[4756]: I1205 18:02:21.457819 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6599c94766-cv4cn" event={"ID":"6d180dae-962c-4f30-8d6e-49ba2182673e","Type":"ContainerStarted","Data":"a1fd32a54757eb99853856a6ce1db217629eea5222685189e7f4ff256b7db1fa"} Dec 05 18:02:21 crc kubenswrapper[4756]: I1205 18:02:21.457863 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6599c94766-cv4cn" event={"ID":"6d180dae-962c-4f30-8d6e-49ba2182673e","Type":"ContainerStarted","Data":"0314e24ab0b258740fb192c17677ac35a70d254c7cd1462f5f2b237f5c25ddfe"} Dec 05 18:02:21 crc kubenswrapper[4756]: I1205 18:02:21.902639 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-d8tqq"] Dec 05 18:02:21 crc kubenswrapper[4756]: I1205 18:02:21.904539 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:21 crc kubenswrapper[4756]: I1205 18:02:21.912445 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-d8tqq"] Dec 05 18:02:21 crc kubenswrapper[4756]: I1205 18:02:21.972654 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9vp\" (UniqueName: \"kubernetes.io/projected/6f4f52fb-d424-47ef-9320-98036fde3926-kube-api-access-km9vp\") pod \"heat-db-create-d8tqq\" (UID: \"6f4f52fb-d424-47ef-9320-98036fde3926\") " pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:21 crc kubenswrapper[4756]: I1205 18:02:21.972699 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f4f52fb-d424-47ef-9320-98036fde3926-operator-scripts\") pod \"heat-db-create-d8tqq\" (UID: \"6f4f52fb-d424-47ef-9320-98036fde3926\") " pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.007830 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-9c83-account-create-update-zvw9n"] Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.009179 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.011501 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.018440 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9c83-account-create-update-zvw9n"] Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.074024 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c44x\" (UniqueName: \"kubernetes.io/projected/1cb4dac2-605b-4204-b359-96783ae01e91-kube-api-access-2c44x\") pod \"heat-9c83-account-create-update-zvw9n\" (UID: \"1cb4dac2-605b-4204-b359-96783ae01e91\") " pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.074146 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km9vp\" (UniqueName: \"kubernetes.io/projected/6f4f52fb-d424-47ef-9320-98036fde3926-kube-api-access-km9vp\") pod \"heat-db-create-d8tqq\" (UID: \"6f4f52fb-d424-47ef-9320-98036fde3926\") " pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.074181 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f4f52fb-d424-47ef-9320-98036fde3926-operator-scripts\") pod \"heat-db-create-d8tqq\" (UID: \"6f4f52fb-d424-47ef-9320-98036fde3926\") " pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.074359 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb4dac2-605b-4204-b359-96783ae01e91-operator-scripts\") pod \"heat-9c83-account-create-update-zvw9n\" (UID: \"1cb4dac2-605b-4204-b359-96783ae01e91\") " pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.075607 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6f4f52fb-d424-47ef-9320-98036fde3926-operator-scripts\") pod \"heat-db-create-d8tqq\" (UID: \"6f4f52fb-d424-47ef-9320-98036fde3926\") " pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.091540 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km9vp\" (UniqueName: \"kubernetes.io/projected/6f4f52fb-d424-47ef-9320-98036fde3926-kube-api-access-km9vp\") pod \"heat-db-create-d8tqq\" (UID: \"6f4f52fb-d424-47ef-9320-98036fde3926\") " pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.176363 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb4dac2-605b-4204-b359-96783ae01e91-operator-scripts\") pod \"heat-9c83-account-create-update-zvw9n\" (UID: \"1cb4dac2-605b-4204-b359-96783ae01e91\") " pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.176631 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c44x\" (UniqueName: \"kubernetes.io/projected/1cb4dac2-605b-4204-b359-96783ae01e91-kube-api-access-2c44x\") pod \"heat-9c83-account-create-update-zvw9n\" (UID: \"1cb4dac2-605b-4204-b359-96783ae01e91\") " pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.177778 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb4dac2-605b-4204-b359-96783ae01e91-operator-scripts\") pod \"heat-9c83-account-create-update-zvw9n\" (UID: \"1cb4dac2-605b-4204-b359-96783ae01e91\") " pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.197257 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c44x\" (UniqueName: \"kubernetes.io/projected/1cb4dac2-605b-4204-b359-96783ae01e91-kube-api-access-2c44x\") pod \"heat-9c83-account-create-update-zvw9n\" (UID: \"1cb4dac2-605b-4204-b359-96783ae01e91\") " pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.220278 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.329874 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.471540 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6599c94766-cv4cn" event={"ID":"6d180dae-962c-4f30-8d6e-49ba2182673e","Type":"ContainerStarted","Data":"94d418327d5ea2e0ee1c1cbf5dc5527dcad87412199190be1ffdfdec45e4537c"} Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.500474 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6599c94766-cv4cn" podStartSLOduration=2.500448439 podStartE2EDuration="2.500448439s" podCreationTimestamp="2025-12-05 18:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:02:22.486880247 +0000 UTC m=+6402.725553759" watchObservedRunningTime="2025-12-05 18:02:22.500448439 +0000 UTC m=+6402.739121951" Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.703648 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-d8tqq"] Dec 05 18:02:22 crc kubenswrapper[4756]: W1205 18:02:22.889880 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb4dac2_605b_4204_b359_96783ae01e91.slice/crio-927c8bee8847cbf8d3512f0df31d31dc3faa69d90729bd4bdcd434a4082330cf WatchSource:0}: Error finding container 927c8bee8847cbf8d3512f0df31d31dc3faa69d90729bd4bdcd434a4082330cf: Status 404 returned error can't find the container with id 927c8bee8847cbf8d3512f0df31d31dc3faa69d90729bd4bdcd434a4082330cf Dec 05 18:02:22 crc kubenswrapper[4756]: I1205 18:02:22.895128 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9c83-account-create-update-zvw9n"] Dec 05 18:02:23 crc kubenswrapper[4756]: I1205 18:02:23.482528 4756 generic.go:334] "Generic (PLEG): container finished" podID="6f4f52fb-d424-47ef-9320-98036fde3926" containerID="5b89a727354a70c4b0ab2230315f95b7308b65fcd141b4fd732f293ce064bea6" exitCode=0 Dec 05 18:02:23 crc kubenswrapper[4756]: I1205 18:02:23.482568 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-d8tqq" event={"ID":"6f4f52fb-d424-47ef-9320-98036fde3926","Type":"ContainerDied","Data":"5b89a727354a70c4b0ab2230315f95b7308b65fcd141b4fd732f293ce064bea6"} Dec 05 18:02:23 crc kubenswrapper[4756]: I1205 18:02:23.482841 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-d8tqq" event={"ID":"6f4f52fb-d424-47ef-9320-98036fde3926","Type":"ContainerStarted","Data":"be50caf19ec8693ab3158800998156fc92641c03460053d4086faef47d15fe9a"} Dec 05 18:02:23 crc kubenswrapper[4756]: I1205 18:02:23.484429 4756 generic.go:334] "Generic (PLEG): container finished" podID="1cb4dac2-605b-4204-b359-96783ae01e91" containerID="c75b73a0e4088d60d58a2cb4173d7ffedffb47ef484a97ca6341656f45596422" exitCode=0 Dec 05 18:02:23 crc kubenswrapper[4756]: I1205 18:02:23.484470 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9c83-account-create-update-zvw9n" event={"ID":"1cb4dac2-605b-4204-b359-96783ae01e91","Type":"ContainerDied","Data":"c75b73a0e4088d60d58a2cb4173d7ffedffb47ef484a97ca6341656f45596422"} Dec 05 18:02:23 crc kubenswrapper[4756]: I1205 18:02:23.484522 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9c83-account-create-update-zvw9n" 
event={"ID":"1cb4dac2-605b-4204-b359-96783ae01e91","Type":"ContainerStarted","Data":"927c8bee8847cbf8d3512f0df31d31dc3faa69d90729bd4bdcd434a4082330cf"} Dec 05 18:02:24 crc kubenswrapper[4756]: I1205 18:02:24.971459 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:24 crc kubenswrapper[4756]: I1205 18:02:24.977937 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.054739 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f4f52fb-d424-47ef-9320-98036fde3926-operator-scripts\") pod \"6f4f52fb-d424-47ef-9320-98036fde3926\" (UID: \"6f4f52fb-d424-47ef-9320-98036fde3926\") " Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.054875 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb4dac2-605b-4204-b359-96783ae01e91-operator-scripts\") pod \"1cb4dac2-605b-4204-b359-96783ae01e91\" (UID: \"1cb4dac2-605b-4204-b359-96783ae01e91\") " Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.055009 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c44x\" (UniqueName: \"kubernetes.io/projected/1cb4dac2-605b-4204-b359-96783ae01e91-kube-api-access-2c44x\") pod \"1cb4dac2-605b-4204-b359-96783ae01e91\" (UID: \"1cb4dac2-605b-4204-b359-96783ae01e91\") " Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.055762 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4f52fb-d424-47ef-9320-98036fde3926-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f4f52fb-d424-47ef-9320-98036fde3926" (UID: "6f4f52fb-d424-47ef-9320-98036fde3926"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.055992 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km9vp\" (UniqueName: \"kubernetes.io/projected/6f4f52fb-d424-47ef-9320-98036fde3926-kube-api-access-km9vp\") pod \"6f4f52fb-d424-47ef-9320-98036fde3926\" (UID: \"6f4f52fb-d424-47ef-9320-98036fde3926\") " Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.056603 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f4f52fb-d424-47ef-9320-98036fde3926-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.058935 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cb4dac2-605b-4204-b359-96783ae01e91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cb4dac2-605b-4204-b359-96783ae01e91" (UID: "1cb4dac2-605b-4204-b359-96783ae01e91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.064232 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4f52fb-d424-47ef-9320-98036fde3926-kube-api-access-km9vp" (OuterVolumeSpecName: "kube-api-access-km9vp") pod "6f4f52fb-d424-47ef-9320-98036fde3926" (UID: "6f4f52fb-d424-47ef-9320-98036fde3926"). 
InnerVolumeSpecName "kube-api-access-km9vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.077188 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb4dac2-605b-4204-b359-96783ae01e91-kube-api-access-2c44x" (OuterVolumeSpecName: "kube-api-access-2c44x") pod "1cb4dac2-605b-4204-b359-96783ae01e91" (UID: "1cb4dac2-605b-4204-b359-96783ae01e91"). InnerVolumeSpecName "kube-api-access-2c44x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.158217 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb4dac2-605b-4204-b359-96783ae01e91-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.158251 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c44x\" (UniqueName: \"kubernetes.io/projected/1cb4dac2-605b-4204-b359-96783ae01e91-kube-api-access-2c44x\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.158268 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km9vp\" (UniqueName: \"kubernetes.io/projected/6f4f52fb-d424-47ef-9320-98036fde3926-kube-api-access-km9vp\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.512597 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9c83-account-create-update-zvw9n" event={"ID":"1cb4dac2-605b-4204-b359-96783ae01e91","Type":"ContainerDied","Data":"927c8bee8847cbf8d3512f0df31d31dc3faa69d90729bd4bdcd434a4082330cf"} Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.513162 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="927c8bee8847cbf8d3512f0df31d31dc3faa69d90729bd4bdcd434a4082330cf" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.512861 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9c83-account-create-update-zvw9n" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.515814 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-d8tqq" event={"ID":"6f4f52fb-d424-47ef-9320-98036fde3926","Type":"ContainerDied","Data":"be50caf19ec8693ab3158800998156fc92641c03460053d4086faef47d15fe9a"} Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.515848 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be50caf19ec8693ab3158800998156fc92641c03460053d4086faef47d15fe9a" Dec 05 18:02:25 crc kubenswrapper[4756]: I1205 18:02:25.515880 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-d8tqq" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.086759 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-bh7pp"] Dec 05 18:02:27 crc kubenswrapper[4756]: E1205 18:02:27.096449 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4f52fb-d424-47ef-9320-98036fde3926" containerName="mariadb-database-create" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.096472 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4f52fb-d424-47ef-9320-98036fde3926" containerName="mariadb-database-create" Dec 05 18:02:27 crc kubenswrapper[4756]: E1205 18:02:27.096510 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb4dac2-605b-4204-b359-96783ae01e91" containerName="mariadb-account-create-update" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.096518 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb4dac2-605b-4204-b359-96783ae01e91" containerName="mariadb-account-create-update" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.096746 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb4dac2-605b-4204-b359-96783ae01e91" containerName="mariadb-account-create-update" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.096767 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4f52fb-d424-47ef-9320-98036fde3926" containerName="mariadb-database-create" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.097554 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.104197 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.104358 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-q6vzl" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.106091 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-bh7pp"] Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.207020 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-combined-ca-bundle\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.207110 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-config-data\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.207180 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6hvd\" (UniqueName: \"kubernetes.io/projected/94fa0a94-d869-4426-8a74-8a96196d3772-kube-api-access-n6hvd\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.309161 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-combined-ca-bundle\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.309260 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-config-data\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.309323 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6hvd\" (UniqueName: \"kubernetes.io/projected/94fa0a94-d869-4426-8a74-8a96196d3772-kube-api-access-n6hvd\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.323966 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-combined-ca-bundle\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.328521 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-config-data\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.340555 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6hvd\" (UniqueName: \"kubernetes.io/projected/94fa0a94-d869-4426-8a74-8a96196d3772-kube-api-access-n6hvd\") pod \"heat-db-sync-bh7pp\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.429393 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:27 crc kubenswrapper[4756]: W1205 18:02:27.935445 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94fa0a94_d869_4426_8a74_8a96196d3772.slice/crio-3fb595b920bcaecf77811ec64888ca900766611dcf1a16382d88f01fe0dda8df WatchSource:0}: Error finding container 3fb595b920bcaecf77811ec64888ca900766611dcf1a16382d88f01fe0dda8df: Status 404 returned error can't find the container with id 3fb595b920bcaecf77811ec64888ca900766611dcf1a16382d88f01fe0dda8df Dec 05 18:02:27 crc kubenswrapper[4756]: I1205 18:02:27.958722 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-bh7pp"] Dec 05 18:02:28 crc kubenswrapper[4756]: I1205 18:02:28.556632 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bh7pp" event={"ID":"94fa0a94-d869-4426-8a74-8a96196d3772","Type":"ContainerStarted","Data":"3fb595b920bcaecf77811ec64888ca900766611dcf1a16382d88f01fe0dda8df"} Dec 05 18:02:30 crc kubenswrapper[4756]: I1205 18:02:30.552117 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:30 crc kubenswrapper[4756]: I1205 18:02:30.554105 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:33 crc kubenswrapper[4756]: I1205 18:02:33.044300 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sb2xm"] Dec 05 18:02:33 crc kubenswrapper[4756]: I1205 18:02:33.052252 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sb2xm"] Dec 05 18:02:34 crc kubenswrapper[4756]: I1205 18:02:34.048676 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8311-account-create-update-f22zw"] Dec 05 18:02:34 crc kubenswrapper[4756]: I1205 18:02:34.071981 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b48d5aeb-da03-4791-bc18-6cc3ee644863" path="/var/lib/kubelet/pods/b48d5aeb-da03-4791-bc18-6cc3ee644863/volumes" Dec 05 18:02:34 crc kubenswrapper[4756]: I1205 18:02:34.260585 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8311-account-create-update-f22zw"] Dec 05 18:02:36 crc kubenswrapper[4756]: I1205 18:02:36.066065 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7269b014-e537-4908-b6f1-26b1968cf770" path="/var/lib/kubelet/pods/7269b014-e537-4908-b6f1-26b1968cf770/volumes" Dec 05 18:02:36 crc kubenswrapper[4756]: I1205 18:02:36.643061 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bh7pp" event={"ID":"94fa0a94-d869-4426-8a74-8a96196d3772","Type":"ContainerStarted","Data":"928830a8f658bbca2e7acc46db3e6620982abf02692e8228a1a78d4abc63a205"} Dec 05 18:02:36 crc kubenswrapper[4756]: I1205 18:02:36.660303 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-bh7pp" podStartSLOduration=1.275012324 podStartE2EDuration="9.660282477s" podCreationTimestamp="2025-12-05 18:02:27 +0000 UTC" firstStartedPulling="2025-12-05 18:02:27.938221215 +0000 UTC m=+6408.176894727" lastFinishedPulling="2025-12-05 18:02:36.323491358 +0000 UTC m=+6416.562164880" observedRunningTime="2025-12-05 18:02:36.659829015 +0000 UTC m=+6416.898502567" watchObservedRunningTime="2025-12-05 18:02:36.660282477 +0000 UTC m=+6416.898955999" Dec 05 18:02:38 crc kubenswrapper[4756]: I1205 
18:02:38.708140 4756 generic.go:334] "Generic (PLEG): container finished" podID="94fa0a94-d869-4426-8a74-8a96196d3772" containerID="928830a8f658bbca2e7acc46db3e6620982abf02692e8228a1a78d4abc63a205" exitCode=0 Dec 05 18:02:38 crc kubenswrapper[4756]: I1205 18:02:38.708216 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bh7pp" event={"ID":"94fa0a94-d869-4426-8a74-8a96196d3772","Type":"ContainerDied","Data":"928830a8f658bbca2e7acc46db3e6620982abf02692e8228a1a78d4abc63a205"} Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.093454 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.239914 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-config-data\") pod \"94fa0a94-d869-4426-8a74-8a96196d3772\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.240011 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-combined-ca-bundle\") pod \"94fa0a94-d869-4426-8a74-8a96196d3772\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.240081 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6hvd\" (UniqueName: \"kubernetes.io/projected/94fa0a94-d869-4426-8a74-8a96196d3772-kube-api-access-n6hvd\") pod \"94fa0a94-d869-4426-8a74-8a96196d3772\" (UID: \"94fa0a94-d869-4426-8a74-8a96196d3772\") " Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.245919 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94fa0a94-d869-4426-8a74-8a96196d3772-kube-api-access-n6hvd" (OuterVolumeSpecName: "kube-api-access-n6hvd") pod "94fa0a94-d869-4426-8a74-8a96196d3772" (UID: "94fa0a94-d869-4426-8a74-8a96196d3772"). InnerVolumeSpecName "kube-api-access-n6hvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.275873 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94fa0a94-d869-4426-8a74-8a96196d3772" (UID: "94fa0a94-d869-4426-8a74-8a96196d3772"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.337219 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-config-data" (OuterVolumeSpecName: "config-data") pod "94fa0a94-d869-4426-8a74-8a96196d3772" (UID: "94fa0a94-d869-4426-8a74-8a96196d3772"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.344001 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.344389 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94fa0a94-d869-4426-8a74-8a96196d3772-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.344580 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6hvd\" (UniqueName: \"kubernetes.io/projected/94fa0a94-d869-4426-8a74-8a96196d3772-kube-api-access-n6hvd\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.738128 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bh7pp" event={"ID":"94fa0a94-d869-4426-8a74-8a96196d3772","Type":"ContainerDied","Data":"3fb595b920bcaecf77811ec64888ca900766611dcf1a16382d88f01fe0dda8df"} Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.738186 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-bh7pp" Dec 05 18:02:40 crc kubenswrapper[4756]: I1205 18:02:40.738190 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb595b920bcaecf77811ec64888ca900766611dcf1a16382d88f01fe0dda8df" Dec 05 18:02:41 crc kubenswrapper[4756]: I1205 18:02:41.064953 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-smhw8"] Dec 05 18:02:41 crc kubenswrapper[4756]: I1205 18:02:41.088929 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-smhw8"] Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.048053 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-974fd87b7-7rfqs"] Dec 05 18:02:42 crc kubenswrapper[4756]: E1205 18:02:42.064068 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94fa0a94-d869-4426-8a74-8a96196d3772" containerName="heat-db-sync" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.064090 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="94fa0a94-d869-4426-8a74-8a96196d3772" containerName="heat-db-sync" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.064270 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="94fa0a94-d869-4426-8a74-8a96196d3772" containerName="heat-db-sync" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.065408 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.069789 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-q6vzl" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.069963 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.070086 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.096653 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1e1cbf-0283-495e-a34e-eb538ef04665" path="/var/lib/kubelet/pods/ff1e1cbf-0283-495e-a34e-eb538ef04665/volumes" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.097356 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-974fd87b7-7rfqs"] Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.176645 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-869596f5cd-sqsnf"] Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.177862 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.186805 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.188583 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddrvr\" (UniqueName: \"kubernetes.io/projected/b6f2c129-6474-48d8-89ba-285d3c589475-kube-api-access-ddrvr\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.188629 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-combined-ca-bundle\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.188666 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.188745 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data-custom\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.200172 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-869596f5cd-sqsnf"] Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.246788 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-84dd6cffc-vc2c8"] Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.259723 4756 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.260981 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-84dd6cffc-vc2c8"] Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.261941 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.291527 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-combined-ca-bundle\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.291581 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddrvr\" (UniqueName: \"kubernetes.io/projected/b6f2c129-6474-48d8-89ba-285d3c589475-kube-api-access-ddrvr\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.291629 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-combined-ca-bundle\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.291676 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.291743 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data-custom\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.291795 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.291862 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data-custom\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.291914 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5s9z\" (UniqueName: \"kubernetes.io/projected/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-kube-api-access-b5s9z\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " 
pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.307255 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-combined-ca-bundle\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.308531 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data-custom\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.313811 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddrvr\" (UniqueName: \"kubernetes.io/projected/b6f2c129-6474-48d8-89ba-285d3c589475-kube-api-access-ddrvr\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.324689 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data\") pod \"heat-engine-974fd87b7-7rfqs\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.393969 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.394394 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvt6\" (UniqueName: \"kubernetes.io/projected/afa0b803-3a8a-4595-ace3-b4f442c478e4-kube-api-access-7gvt6\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.395081 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.395114 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-combined-ca-bundle\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.395141 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data-custom\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " 
pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.395188 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data-custom\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.395223 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5s9z\" (UniqueName: \"kubernetes.io/projected/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-kube-api-access-b5s9z\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.395262 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-combined-ca-bundle\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.399397 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-combined-ca-bundle\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.400014 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data-custom\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.400436 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.410767 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5s9z\" (UniqueName: \"kubernetes.io/projected/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-kube-api-access-b5s9z\") pod \"heat-api-869596f5cd-sqsnf\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.423785 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.497567 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.497646 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvt6\" (UniqueName: \"kubernetes.io/projected/afa0b803-3a8a-4595-ace3-b4f442c478e4-kube-api-access-7gvt6\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.497683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-combined-ca-bundle\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.497706 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data-custom\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.503546 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-combined-ca-bundle\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.506091 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.510965 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.513291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data-custom\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.538718 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.555581 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvt6\" (UniqueName: \"kubernetes.io/projected/afa0b803-3a8a-4595-ace3-b4f442c478e4-kube-api-access-7gvt6\") pod \"heat-cfnapi-84dd6cffc-vc2c8\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.588099 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.826381 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-869596f5cd-sqsnf"] Dec 05 18:02:42 crc kubenswrapper[4756]: I1205 18:02:42.881976 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-974fd87b7-7rfqs"] Dec 05 18:02:42 crc kubenswrapper[4756]: W1205 18:02:42.889226 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6f2c129_6474_48d8_89ba_285d3c589475.slice/crio-61add351ff6d086d79ea525bb25e9cef51037c67f76343c094ca1195bb22389b WatchSource:0}: Error finding container 61add351ff6d086d79ea525bb25e9cef51037c67f76343c094ca1195bb22389b: Status 404 returned error can't find the container with id 61add351ff6d086d79ea525bb25e9cef51037c67f76343c094ca1195bb22389b Dec 05 18:02:43 crc kubenswrapper[4756]: W1205 18:02:43.130386 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafa0b803_3a8a_4595_ace3_b4f442c478e4.slice/crio-478df789454f4dd21660aa4fd90b3a3fc4afd47be528d3874cf4f632a7221c9c WatchSource:0}: Error finding container 478df789454f4dd21660aa4fd90b3a3fc4afd47be528d3874cf4f632a7221c9c: Status 404 returned error can't find the container with id 478df789454f4dd21660aa4fd90b3a3fc4afd47be528d3874cf4f632a7221c9c Dec 05 18:02:43 crc kubenswrapper[4756]: I1205 18:02:43.135548 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-84dd6cffc-vc2c8"] Dec 05 18:02:43 crc kubenswrapper[4756]: I1205 18:02:43.781231 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-974fd87b7-7rfqs" event={"ID":"b6f2c129-6474-48d8-89ba-285d3c589475","Type":"ContainerStarted","Data":"e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085"} Dec 05 18:02:43 crc kubenswrapper[4756]: I1205 18:02:43.781300 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-974fd87b7-7rfqs" event={"ID":"b6f2c129-6474-48d8-89ba-285d3c589475","Type":"ContainerStarted","Data":"61add351ff6d086d79ea525bb25e9cef51037c67f76343c094ca1195bb22389b"} Dec 05 18:02:43 crc kubenswrapper[4756]: I1205 18:02:43.781356 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:02:43 crc kubenswrapper[4756]: I1205 18:02:43.790319 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" event={"ID":"afa0b803-3a8a-4595-ace3-b4f442c478e4","Type":"ContainerStarted","Data":"478df789454f4dd21660aa4fd90b3a3fc4afd47be528d3874cf4f632a7221c9c"} Dec 05 18:02:43 crc kubenswrapper[4756]: I1205 18:02:43.792404 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869596f5cd-sqsnf" event={"ID":"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb","Type":"ContainerStarted","Data":"0b0015833acf250c26ab0f24320a496234282921dc196c3f2f609cacdfde2c41"} Dec 05 18:02:43 crc kubenswrapper[4756]: I1205 18:02:43.800190 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-974fd87b7-7rfqs" podStartSLOduration=1.800172684 podStartE2EDuration="1.800172684s" podCreationTimestamp="2025-12-05 18:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:02:43.796312731 +0000 UTC m=+6424.034986253" watchObservedRunningTime="2025-12-05 18:02:43.800172684 +0000 UTC m=+6424.038846196" Dec 05 18:02:44 crc kubenswrapper[4756]: I1205 18:02:44.165234 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6599c94766-cv4cn" Dec 05 18:02:44 crc kubenswrapper[4756]: I1205 18:02:44.252262 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cf757d7d6-ltbqn"] Dec 05 18:02:44 crc kubenswrapper[4756]: I1205 18:02:44.252711 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cf757d7d6-ltbqn" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon-log" containerID="cri-o://043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1" gracePeriod=30 Dec 05 18:02:44 crc kubenswrapper[4756]: I1205 18:02:44.253365 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cf757d7d6-ltbqn" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" containerID="cri-o://2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd" gracePeriod=30 Dec 05 18:02:46 crc kubenswrapper[4756]: I1205 18:02:46.824488 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" event={"ID":"afa0b803-3a8a-4595-ace3-b4f442c478e4","Type":"ContainerStarted","Data":"d5eb4e7dc5215e8c9af526ba7cb926ce2858f2a4163229cdc3ef7c18289b07ba"} Dec 05 18:02:46 crc kubenswrapper[4756]: I1205 18:02:46.825080 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:46 crc kubenswrapper[4756]: I1205 18:02:46.826801 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869596f5cd-sqsnf" event={"ID":"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb","Type":"ContainerStarted","Data":"732a6127977818e8fc1943d06bc286e130387418b872b585736710cdb234dc23"} Dec 05 18:02:46 crc kubenswrapper[4756]: I1205 18:02:46.827112 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:46 crc kubenswrapper[4756]: I1205 18:02:46.848379 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" podStartSLOduration=2.303331794 podStartE2EDuration="4.848353331s" podCreationTimestamp="2025-12-05 18:02:42 +0000 
UTC" firstStartedPulling="2025-12-05 18:02:43.133614673 +0000 UTC m=+6423.372288185" lastFinishedPulling="2025-12-05 18:02:45.67863621 +0000 UTC m=+6425.917309722" observedRunningTime="2025-12-05 18:02:46.843283536 +0000 UTC m=+6427.081957038" watchObservedRunningTime="2025-12-05 18:02:46.848353331 +0000 UTC m=+6427.087026853" Dec 05 18:02:46 crc kubenswrapper[4756]: I1205 18:02:46.865671 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-869596f5cd-sqsnf" podStartSLOduration=2.008196613 podStartE2EDuration="4.865640391s" podCreationTimestamp="2025-12-05 18:02:42 +0000 UTC" firstStartedPulling="2025-12-05 18:02:42.817787282 +0000 UTC m=+6423.056460794" lastFinishedPulling="2025-12-05 18:02:45.67523106 +0000 UTC m=+6425.913904572" observedRunningTime="2025-12-05 18:02:46.862793036 +0000 UTC m=+6427.101466588" watchObservedRunningTime="2025-12-05 18:02:46.865640391 +0000 UTC m=+6427.104313943" Dec 05 18:02:47 crc kubenswrapper[4756]: I1205 18:02:47.437167 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cf757d7d6-ltbqn" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:43796->10.217.1.120:8443: read: connection reset by peer" Dec 05 18:02:47 crc kubenswrapper[4756]: I1205 18:02:47.841302 4756 generic.go:334] "Generic (PLEG): container finished" podID="1acc47c0-70db-432a-b627-44a07ac96488" containerID="2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd" exitCode=0 Dec 05 18:02:47 crc kubenswrapper[4756]: I1205 18:02:47.841450 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf757d7d6-ltbqn" event={"ID":"1acc47c0-70db-432a-b627-44a07ac96488","Type":"ContainerDied","Data":"2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd"} Dec 05 18:02:49 crc kubenswrapper[4756]: I1205 18:02:49.899694 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-c5f6496c-9fc66"] Dec 05 18:02:49 crc kubenswrapper[4756]: I1205 18:02:49.901345 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:49 crc kubenswrapper[4756]: I1205 18:02:49.913290 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-c5f6496c-9fc66"] Dec 05 18:02:49 crc kubenswrapper[4756]: I1205 18:02:49.992755 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7gh6\" (UniqueName: \"kubernetes.io/projected/ee77eb77-55a3-42e6-bec8-3c0203881b39-kube-api-access-j7gh6\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:49 crc kubenswrapper[4756]: I1205 18:02:49.992794 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-config-data-custom\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:49 crc kubenswrapper[4756]: I1205 18:02:49.992832 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-combined-ca-bundle\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:49 crc kubenswrapper[4756]: I1205 18:02:49.992955 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-config-data\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.000846 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7b498658dd-jr2j2"] Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.002156 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.015228 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7b498658dd-jr2j2"] Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.024506 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-997fb55bd-dxv2n"] Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.026077 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.032974 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-997fb55bd-dxv2n"] Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.094845 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-combined-ca-bundle\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.094913 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data-custom\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.095151 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-config-data\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.095331 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data-custom\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.095467 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7gh6\" (UniqueName: \"kubernetes.io/projected/ee77eb77-55a3-42e6-bec8-3c0203881b39-kube-api-access-j7gh6\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.095494 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-config-data-custom\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.095875 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc9vx\" (UniqueName: \"kubernetes.io/projected/83c0758a-d657-4ebb-9a60-98ece921de66-kube-api-access-bc9vx\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.095999 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-combined-ca-bundle\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.096046 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2ll\" (UniqueName: \"kubernetes.io/projected/1d3fdb31-592c-408e-9968-d8bdcc91a143-kube-api-access-tl2ll\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.096076 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-combined-ca-bundle\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.096160 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.096333 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.102092 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-combined-ca-bundle\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.103225 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-config-data-custom\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.111026 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7gh6\" (UniqueName: \"kubernetes.io/projected/ee77eb77-55a3-42e6-bec8-3c0203881b39-kube-api-access-j7gh6\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.113301 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee77eb77-55a3-42e6-bec8-3c0203881b39-config-data\") pod \"heat-engine-c5f6496c-9fc66\" (UID: \"ee77eb77-55a3-42e6-bec8-3c0203881b39\") " pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.198726 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data-custom\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.198797 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bc9vx\" (UniqueName: \"kubernetes.io/projected/83c0758a-d657-4ebb-9a60-98ece921de66-kube-api-access-bc9vx\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.198841 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2ll\" (UniqueName: \"kubernetes.io/projected/1d3fdb31-592c-408e-9968-d8bdcc91a143-kube-api-access-tl2ll\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.198864 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-combined-ca-bundle\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.198888 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.198920 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.198969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-combined-ca-bundle\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.198999 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data-custom\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.204885 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.205131 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-combined-ca-bundle\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.206012 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-combined-ca-bundle\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.206241 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.206802 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data-custom\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.210537 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data-custom\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.218354 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2ll\" (UniqueName: \"kubernetes.io/projected/1d3fdb31-592c-408e-9968-d8bdcc91a143-kube-api-access-tl2ll\") pod \"heat-api-7b498658dd-jr2j2\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.223874 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc9vx\" (UniqueName: \"kubernetes.io/projected/83c0758a-d657-4ebb-9a60-98ece921de66-kube-api-access-bc9vx\") pod \"heat-cfnapi-997fb55bd-dxv2n\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.268977 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.331666 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.343378 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.813278 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-c5f6496c-9fc66"] Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.891531 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-997fb55bd-dxv2n"] Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.891773 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c5f6496c-9fc66" event={"ID":"ee77eb77-55a3-42e6-bec8-3c0203881b39","Type":"ContainerStarted","Data":"c3ac4efe76ecf3c49eb18f3cc85525d5a32773a765c0c2cdc293bc70b012a17c"} Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.919015 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7b498658dd-jr2j2"] Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.972631 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-869596f5cd-sqsnf"] Dec 05 18:02:50 crc kubenswrapper[4756]: I1205 18:02:50.973595 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-869596f5cd-sqsnf" podUID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" containerName="heat-api" containerID="cri-o://732a6127977818e8fc1943d06bc286e130387418b872b585736710cdb234dc23" gracePeriod=60 Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.016332 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-869596f5cd-sqsnf" podUID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.129:8004/healthcheck\": EOF" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.016592 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-84dd6cffc-vc2c8"] Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.016746 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" podUID="afa0b803-3a8a-4595-ace3-b4f442c478e4" containerName="heat-cfnapi" containerID="cri-o://d5eb4e7dc5215e8c9af526ba7cb926ce2858f2a4163229cdc3ef7c18289b07ba" gracePeriod=60 Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.044045 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-747f9c6d84-vtx8n"] Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.050223 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.094817 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.095077 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.095656 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-647989cdb6-nq2w9"] Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.118765 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-internal-tls-certs\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.119007 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-public-tls-certs\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.119077 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-combined-ca-bundle\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.119138 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-config-data\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.119188 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ftq5\" (UniqueName: \"kubernetes.io/projected/7438581b-1216-4129-b06f-953462d6572b-kube-api-access-2ftq5\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.119241 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-config-data-custom\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.125707 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-747f9c6d84-vtx8n"] Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.126651 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.139739 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.140718 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.151583 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-647989cdb6-nq2w9"] Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222075 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmdpv\" (UniqueName: \"kubernetes.io/projected/74c364df-da60-4faf-a193-eb859f541a7d-kube-api-access-kmdpv\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222128 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-config-data\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222158 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ftq5\" (UniqueName: \"kubernetes.io/projected/7438581b-1216-4129-b06f-953462d6572b-kube-api-access-2ftq5\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222362 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-internal-tls-certs\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222444 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-config-data-custom\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222764 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-config-data\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222844 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-config-data-custom\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-internal-tls-certs\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222941 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-public-tls-certs\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.222984 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-public-tls-certs\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.223036 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-combined-ca-bundle\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.223257 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-combined-ca-bundle\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.226483 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-combined-ca-bundle\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.228022 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-config-data\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.229373 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-public-tls-certs\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.229706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-config-data-custom\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.229847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7438581b-1216-4129-b06f-953462d6572b-internal-tls-certs\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.243026 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ftq5\" (UniqueName: \"kubernetes.io/projected/7438581b-1216-4129-b06f-953462d6572b-kube-api-access-2ftq5\") pod \"heat-api-747f9c6d84-vtx8n\" (UID: \"7438581b-1216-4129-b06f-953462d6572b\") " pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.324934 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-public-tls-certs\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.325001 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-combined-ca-bundle\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.325100 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmdpv\" (UniqueName: \"kubernetes.io/projected/74c364df-da60-4faf-a193-eb859f541a7d-kube-api-access-kmdpv\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.325160 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-internal-tls-certs\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.325197 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-config-data\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.325231 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-config-data-custom\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.331208 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-public-tls-certs\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.332519 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-config-data\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.333915 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-config-data-custom\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.334551 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-combined-ca-bundle\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.341278 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c364df-da60-4faf-a193-eb859f541a7d-internal-tls-certs\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.355475 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmdpv\" (UniqueName: \"kubernetes.io/projected/74c364df-da60-4faf-a193-eb859f541a7d-kube-api-access-kmdpv\") pod \"heat-cfnapi-647989cdb6-nq2w9\" (UID: \"74c364df-da60-4faf-a193-eb859f541a7d\") " pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.401023 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.428362 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.916972 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b498658dd-jr2j2" event={"ID":"1d3fdb31-592c-408e-9968-d8bdcc91a143","Type":"ContainerStarted","Data":"c5f5d98291bfadb3c69c7b7498b0b3d45a2543380e9f1972811d204fe5d94944"} Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.918706 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" event={"ID":"83c0758a-d657-4ebb-9a60-98ece921de66","Type":"ContainerStarted","Data":"824793bb6c739b414db413afc1081c2a07b10e974c1d50f5f79bfbf7573db7c5"} Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.919968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c5f6496c-9fc66" event={"ID":"ee77eb77-55a3-42e6-bec8-3c0203881b39","Type":"ContainerStarted","Data":"91e62519e367a5c8be43ccc49f260845ec9914b3dcb5e16afe14202e42d2b1ce"} Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.921149 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:02:51 crc kubenswrapper[4756]: I1205 18:02:51.939521 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-c5f6496c-9fc66" podStartSLOduration=2.9395031769999997 podStartE2EDuration="2.939503177s" podCreationTimestamp="2025-12-05 18:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:02:51.934870964 +0000 UTC m=+6432.173544476" watchObservedRunningTime="2025-12-05 18:02:51.939503177 +0000 UTC m=+6432.178176689" Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.063749 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-747f9c6d84-vtx8n"] Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.126688 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-647989cdb6-nq2w9"] Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.716611 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.940681 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-647989cdb6-nq2w9" event={"ID":"74c364df-da60-4faf-a193-eb859f541a7d","Type":"ContainerStarted","Data":"850ded97321e6e4a2ee725325612f2bb3d4e01819f08a7c521d6f1b8d3dc3f42"} Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.940750 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-647989cdb6-nq2w9" event={"ID":"74c364df-da60-4faf-a193-eb859f541a7d","Type":"ContainerStarted","Data":"462636dd852f2c2c82adb8f965e634b22d4f9211be538fb01db1fa494a76a4cb"} Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.956682 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d3fdb31-592c-408e-9968-d8bdcc91a143" containerID="188fa90329008b78d0e39894e5b5ecb79bc1d6d2ad208bf8c6a35b2c7555a3b1" exitCode=1 Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.956781 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b498658dd-jr2j2" event={"ID":"1d3fdb31-592c-408e-9968-d8bdcc91a143","Type":"ContainerDied","Data":"188fa90329008b78d0e39894e5b5ecb79bc1d6d2ad208bf8c6a35b2c7555a3b1"} Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.957576 4756 scope.go:117] 
"RemoveContainer" containerID="188fa90329008b78d0e39894e5b5ecb79bc1d6d2ad208bf8c6a35b2c7555a3b1" Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.962772 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" event={"ID":"83c0758a-d657-4ebb-9a60-98ece921de66","Type":"ContainerStarted","Data":"fb656ce055ed13fc124d2134b497dec6b60dd7322f59f4c9649fb22658ee763e"} Dec 05 18:02:52 crc kubenswrapper[4756]: I1205 18:02:52.964460 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-747f9c6d84-vtx8n" event={"ID":"7438581b-1216-4129-b06f-953462d6572b","Type":"ContainerStarted","Data":"edf0331b57ea41221f55d5c106944ab1a3881e63c6be1b8a818a97f9b76d3747"} Dec 05 18:02:53 crc kubenswrapper[4756]: I1205 18:02:53.981287 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-747f9c6d84-vtx8n" event={"ID":"7438581b-1216-4129-b06f-953462d6572b","Type":"ContainerStarted","Data":"39b1b61bd86e313e4ec54d6e7f85256329e34464770ca85e01a421965a6d1cd4"} Dec 05 18:02:53 crc kubenswrapper[4756]: I1205 18:02:53.982219 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.004845 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" podStartSLOduration=5.00482523 podStartE2EDuration="5.00482523s" podCreationTimestamp="2025-12-05 18:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:02:53.995126451 +0000 UTC m=+6434.233799973" watchObservedRunningTime="2025-12-05 18:02:54.00482523 +0000 UTC m=+6434.243498742" Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.017880 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-647989cdb6-nq2w9" podStartSLOduration=3.017861087 podStartE2EDuration="3.017861087s" podCreationTimestamp="2025-12-05 18:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:02:54.015200246 +0000 UTC m=+6434.253873748" watchObservedRunningTime="2025-12-05 18:02:54.017861087 +0000 UTC m=+6434.256534599" Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.689124 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cf757d7d6-ltbqn" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.991498 4756 generic.go:334] "Generic (PLEG): container finished" podID="83c0758a-d657-4ebb-9a60-98ece921de66" containerID="fb656ce055ed13fc124d2134b497dec6b60dd7322f59f4c9649fb22658ee763e" exitCode=1 Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.991545 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" event={"ID":"83c0758a-d657-4ebb-9a60-98ece921de66","Type":"ContainerDied","Data":"fb656ce055ed13fc124d2134b497dec6b60dd7322f59f4c9649fb22658ee763e"} Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.992438 4756 scope.go:117] "RemoveContainer" containerID="fb656ce055ed13fc124d2134b497dec6b60dd7322f59f4c9649fb22658ee763e" Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.994219 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b498658dd-jr2j2" event={"ID":"1d3fdb31-592c-408e-9968-d8bdcc91a143","Type":"ContainerStarted","Data":"66c9f830988b6ca4b2a2a551984f2f5b3c1579bdb9df04421daac6754111ce69"} Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.994354 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:02:54 crc kubenswrapper[4756]: I1205 18:02:54.994381 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:55 crc kubenswrapper[4756]: I1205 18:02:55.034184 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7b498658dd-jr2j2" podStartSLOduration=6.034142912 podStartE2EDuration="6.034142912s" podCreationTimestamp="2025-12-05 18:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:02:55.03028452 +0000 UTC m=+6435.268958042" watchObservedRunningTime="2025-12-05 18:02:55.034142912 +0000 UTC m=+6435.272816424" Dec 05 18:02:55 crc kubenswrapper[4756]: I1205 18:02:55.055282 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-747f9c6d84-vtx8n" podStartSLOduration=5.055260215 podStartE2EDuration="5.055260215s" podCreationTimestamp="2025-12-05 18:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:02:55.047455827 +0000 UTC m=+6435.286129339" watchObservedRunningTime="2025-12-05 18:02:55.055260215 +0000 UTC m=+6435.293933727" Dec 05 18:02:55 crc kubenswrapper[4756]: I1205 18:02:55.332562 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:02:55 crc kubenswrapper[4756]: I1205 18:02:55.343849 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:02:56 crc kubenswrapper[4756]: I1205 18:02:56.422143 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" podUID="afa0b803-3a8a-4595-ace3-b4f442c478e4" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.130:8000/healthcheck\": read tcp 10.217.0.2:46972->10.217.1.130:8000: read: connection reset by peer" Dec 05 18:02:56 crc kubenswrapper[4756]: I1205 18:02:56.429308 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:02:56 crc kubenswrapper[4756]: I1205 18:02:56.438357 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-869596f5cd-sqsnf" podUID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.129:8004/healthcheck\": read tcp 10.217.0.2:59088->10.217.1.129:8004: read: connection reset by peer" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.020616 4756 generic.go:334] "Generic (PLEG): container finished" podID="83c0758a-d657-4ebb-9a60-98ece921de66" containerID="9ec0f85a2155a02567e07d9256763d60674021bd10c79e89feb59769d2a59e0f" exitCode=1 Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.020724 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" 
event={"ID":"83c0758a-d657-4ebb-9a60-98ece921de66","Type":"ContainerDied","Data":"9ec0f85a2155a02567e07d9256763d60674021bd10c79e89feb59769d2a59e0f"} Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.021014 4756 scope.go:117] "RemoveContainer" containerID="fb656ce055ed13fc124d2134b497dec6b60dd7322f59f4c9649fb22658ee763e" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.021655 4756 scope.go:117] "RemoveContainer" containerID="9ec0f85a2155a02567e07d9256763d60674021bd10c79e89feb59769d2a59e0f" Dec 05 18:02:57 crc kubenswrapper[4756]: E1205 18:02:57.022254 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-997fb55bd-dxv2n_openstack(83c0758a-d657-4ebb-9a60-98ece921de66)\"" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.024825 4756 generic.go:334] "Generic (PLEG): container finished" podID="afa0b803-3a8a-4595-ace3-b4f442c478e4" containerID="d5eb4e7dc5215e8c9af526ba7cb926ce2858f2a4163229cdc3ef7c18289b07ba" exitCode=0 Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.024857 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" event={"ID":"afa0b803-3a8a-4595-ace3-b4f442c478e4","Type":"ContainerDied","Data":"d5eb4e7dc5215e8c9af526ba7cb926ce2858f2a4163229cdc3ef7c18289b07ba"} Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.028920 4756 generic.go:334] "Generic (PLEG): container finished" podID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" containerID="732a6127977818e8fc1943d06bc286e130387418b872b585736710cdb234dc23" exitCode=0 Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.029139 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869596f5cd-sqsnf" event={"ID":"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb","Type":"ContainerDied","Data":"732a6127977818e8fc1943d06bc286e130387418b872b585736710cdb234dc23"} Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.032966 4756 generic.go:334] "Generic (PLEG): container finished" podID="1d3fdb31-592c-408e-9968-d8bdcc91a143" containerID="66c9f830988b6ca4b2a2a551984f2f5b3c1579bdb9df04421daac6754111ce69" exitCode=1 Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.033053 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b498658dd-jr2j2" event={"ID":"1d3fdb31-592c-408e-9968-d8bdcc91a143","Type":"ContainerDied","Data":"66c9f830988b6ca4b2a2a551984f2f5b3c1579bdb9df04421daac6754111ce69"} Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.033823 4756 scope.go:117] "RemoveContainer" containerID="66c9f830988b6ca4b2a2a551984f2f5b3c1579bdb9df04421daac6754111ce69" Dec 05 18:02:57 crc kubenswrapper[4756]: E1205 18:02:57.034477 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7b498658dd-jr2j2_openstack(1d3fdb31-592c-408e-9968-d8bdcc91a143)\"" pod="openstack/heat-api-7b498658dd-jr2j2" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.488256 4756 scope.go:117] "RemoveContainer" containerID="188fa90329008b78d0e39894e5b5ecb79bc1d6d2ad208bf8c6a35b2c7555a3b1" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.511720 4756 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/heat-api-869596f5cd-sqsnf" podUID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.129:8004/healthcheck\": dial tcp 10.217.1.129:8004: connect: connection refused" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.590627 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" podUID="afa0b803-3a8a-4595-ace3-b4f442c478e4" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.130:8000/healthcheck\": dial tcp 10.217.1.130:8000: connect: connection refused" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.957574 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.971144 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5s9z\" (UniqueName: \"kubernetes.io/projected/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-kube-api-access-b5s9z\") pod \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.971267 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data\") pod \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.971344 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-combined-ca-bundle\") pod \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.971447 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data-custom\") pod \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\" (UID: \"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb\") " Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.974982 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.978785 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-kube-api-access-b5s9z" (OuterVolumeSpecName: "kube-api-access-b5s9z") pod "ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" (UID: "ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb"). InnerVolumeSpecName "kube-api-access-b5s9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:02:57 crc kubenswrapper[4756]: I1205 18:02:57.986093 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" (UID: "ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.049102 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" (UID: "ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.054518 4756 scope.go:117] "RemoveContainer" containerID="66c9f830988b6ca4b2a2a551984f2f5b3c1579bdb9df04421daac6754111ce69" Dec 05 18:02:58 crc kubenswrapper[4756]: E1205 18:02:58.054811 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7b498658dd-jr2j2_openstack(1d3fdb31-592c-408e-9968-d8bdcc91a143)\"" pod="openstack/heat-api-7b498658dd-jr2j2" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.065530 4756 scope.go:117] "RemoveContainer" containerID="9ec0f85a2155a02567e07d9256763d60674021bd10c79e89feb59769d2a59e0f" Dec 05 18:02:58 crc kubenswrapper[4756]: E1205 18:02:58.065838 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-997fb55bd-dxv2n_openstack(83c0758a-d657-4ebb-9a60-98ece921de66)\"" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.067905 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.070682 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-869596f5cd-sqsnf" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.072895 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-combined-ca-bundle\") pod \"afa0b803-3a8a-4595-ace3-b4f442c478e4\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.072971 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data-custom\") pod \"afa0b803-3a8a-4595-ace3-b4f442c478e4\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.073020 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gvt6\" (UniqueName: \"kubernetes.io/projected/afa0b803-3a8a-4595-ace3-b4f442c478e4-kube-api-access-7gvt6\") pod \"afa0b803-3a8a-4595-ace3-b4f442c478e4\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.073875 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5s9z\" (UniqueName: \"kubernetes.io/projected/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-kube-api-access-b5s9z\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.074102 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data" (OuterVolumeSpecName: "config-data") pod "ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" (UID: "ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.077213 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.077236 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.077741 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "afa0b803-3a8a-4595-ace3-b4f442c478e4" (UID: "afa0b803-3a8a-4595-ace3-b4f442c478e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.079445 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa0b803-3a8a-4595-ace3-b4f442c478e4-kube-api-access-7gvt6" (OuterVolumeSpecName: "kube-api-access-7gvt6") pod "afa0b803-3a8a-4595-ace3-b4f442c478e4" (UID: "afa0b803-3a8a-4595-ace3-b4f442c478e4"). InnerVolumeSpecName "kube-api-access-7gvt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.079679 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-84dd6cffc-vc2c8" event={"ID":"afa0b803-3a8a-4595-ace3-b4f442c478e4","Type":"ContainerDied","Data":"478df789454f4dd21660aa4fd90b3a3fc4afd47be528d3874cf4f632a7221c9c"} Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.079725 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869596f5cd-sqsnf" event={"ID":"ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb","Type":"ContainerDied","Data":"0b0015833acf250c26ab0f24320a496234282921dc196c3f2f609cacdfde2c41"} Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.079752 4756 scope.go:117] "RemoveContainer" containerID="d5eb4e7dc5215e8c9af526ba7cb926ce2858f2a4163229cdc3ef7c18289b07ba" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.120786 4756 scope.go:117] "RemoveContainer" containerID="732a6127977818e8fc1943d06bc286e130387418b872b585736710cdb234dc23" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.130151 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afa0b803-3a8a-4595-ace3-b4f442c478e4" (UID: "afa0b803-3a8a-4595-ace3-b4f442c478e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.177718 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data\") pod \"afa0b803-3a8a-4595-ace3-b4f442c478e4\" (UID: \"afa0b803-3a8a-4595-ace3-b4f442c478e4\") " Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.178267 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.178282 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.178294 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gvt6\" (UniqueName: \"kubernetes.io/projected/afa0b803-3a8a-4595-ace3-b4f442c478e4-kube-api-access-7gvt6\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.178308 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.227608 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data" (OuterVolumeSpecName: "config-data") pod "afa0b803-3a8a-4595-ace3-b4f442c478e4" (UID: "afa0b803-3a8a-4595-ace3-b4f442c478e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.279508 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa0b803-3a8a-4595-ace3-b4f442c478e4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.429332 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-84dd6cffc-vc2c8"] Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.441310 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-84dd6cffc-vc2c8"] Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.452553 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-869596f5cd-sqsnf"] Dec 05 18:02:58 crc kubenswrapper[4756]: I1205 18:02:58.460174 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-869596f5cd-sqsnf"] Dec 05 18:03:00 crc kubenswrapper[4756]: I1205 18:03:00.061460 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afa0b803-3a8a-4595-ace3-b4f442c478e4" path="/var/lib/kubelet/pods/afa0b803-3a8a-4595-ace3-b4f442c478e4/volumes" Dec 05 18:03:00 crc kubenswrapper[4756]: I1205 18:03:00.062225 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" path="/var/lib/kubelet/pods/ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb/volumes" Dec 05 18:03:00 crc kubenswrapper[4756]: I1205 18:03:00.331954 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:03:00 crc kubenswrapper[4756]: I1205 18:03:00.332504 4756 scope.go:117] "RemoveContainer" containerID="66c9f830988b6ca4b2a2a551984f2f5b3c1579bdb9df04421daac6754111ce69" Dec 05 18:03:00 crc kubenswrapper[4756]: E1205 18:03:00.332726 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7b498658dd-jr2j2_openstack(1d3fdb31-592c-408e-9968-d8bdcc91a143)\"" pod="openstack/heat-api-7b498658dd-jr2j2" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" Dec 05 18:03:00 crc kubenswrapper[4756]: I1205 18:03:00.343970 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:03:00 crc kubenswrapper[4756]: I1205 18:03:00.344043 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:03:00 crc kubenswrapper[4756]: I1205 18:03:00.345111 4756 scope.go:117] "RemoveContainer" containerID="9ec0f85a2155a02567e07d9256763d60674021bd10c79e89feb59769d2a59e0f" Dec 05 18:03:00 crc kubenswrapper[4756]: E1205 18:03:00.345516 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-997fb55bd-dxv2n_openstack(83c0758a-d657-4ebb-9a60-98ece921de66)\"" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" Dec 05 18:03:02 crc kubenswrapper[4756]: I1205 18:03:02.664475 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-747f9c6d84-vtx8n" Dec 05 18:03:02 crc kubenswrapper[4756]: I1205 18:03:02.694219 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/heat-cfnapi-647989cdb6-nq2w9" Dec 05 18:03:02 crc kubenswrapper[4756]: I1205 18:03:02.754965 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7b498658dd-jr2j2"] Dec 05 18:03:02 crc kubenswrapper[4756]: I1205 18:03:02.806652 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-997fb55bd-dxv2n"] Dec 05 18:03:02 crc kubenswrapper[4756]: I1205 18:03:02.857363 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.347643 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.353129 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.509127 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data\") pod \"83c0758a-d657-4ebb-9a60-98ece921de66\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.509285 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data-custom\") pod \"83c0758a-d657-4ebb-9a60-98ece921de66\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.509343 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data-custom\") pod \"1d3fdb31-592c-408e-9968-d8bdcc91a143\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.509395 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc9vx\" (UniqueName: \"kubernetes.io/projected/83c0758a-d657-4ebb-9a60-98ece921de66-kube-api-access-bc9vx\") pod \"83c0758a-d657-4ebb-9a60-98ece921de66\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.509564 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-combined-ca-bundle\") pod \"1d3fdb31-592c-408e-9968-d8bdcc91a143\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.509669 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-combined-ca-bundle\") pod \"83c0758a-d657-4ebb-9a60-98ece921de66\" (UID: \"83c0758a-d657-4ebb-9a60-98ece921de66\") " Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.509735 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2ll\" (UniqueName: \"kubernetes.io/projected/1d3fdb31-592c-408e-9968-d8bdcc91a143-kube-api-access-tl2ll\") pod \"1d3fdb31-592c-408e-9968-d8bdcc91a143\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.509812 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data\") pod \"1d3fdb31-592c-408e-9968-d8bdcc91a143\" (UID: \"1d3fdb31-592c-408e-9968-d8bdcc91a143\") " Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.516928 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d3fdb31-592c-408e-9968-d8bdcc91a143" (UID: "1d3fdb31-592c-408e-9968-d8bdcc91a143"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.525674 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "83c0758a-d657-4ebb-9a60-98ece921de66" (UID: "83c0758a-d657-4ebb-9a60-98ece921de66"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.525806 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3fdb31-592c-408e-9968-d8bdcc91a143-kube-api-access-tl2ll" (OuterVolumeSpecName: "kube-api-access-tl2ll") pod "1d3fdb31-592c-408e-9968-d8bdcc91a143" (UID: "1d3fdb31-592c-408e-9968-d8bdcc91a143"). InnerVolumeSpecName "kube-api-access-tl2ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.530184 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c0758a-d657-4ebb-9a60-98ece921de66-kube-api-access-bc9vx" (OuterVolumeSpecName: "kube-api-access-bc9vx") pod "83c0758a-d657-4ebb-9a60-98ece921de66" (UID: "83c0758a-d657-4ebb-9a60-98ece921de66"). InnerVolumeSpecName "kube-api-access-bc9vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.553851 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83c0758a-d657-4ebb-9a60-98ece921de66" (UID: "83c0758a-d657-4ebb-9a60-98ece921de66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.565495 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d3fdb31-592c-408e-9968-d8bdcc91a143" (UID: "1d3fdb31-592c-408e-9968-d8bdcc91a143"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.584067 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data" (OuterVolumeSpecName: "config-data") pod "1d3fdb31-592c-408e-9968-d8bdcc91a143" (UID: "1d3fdb31-592c-408e-9968-d8bdcc91a143"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.603693 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data" (OuterVolumeSpecName: "config-data") pod "83c0758a-d657-4ebb-9a60-98ece921de66" (UID: "83c0758a-d657-4ebb-9a60-98ece921de66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.612857 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.612889 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.612898 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2ll\" (UniqueName: \"kubernetes.io/projected/1d3fdb31-592c-408e-9968-d8bdcc91a143-kube-api-access-tl2ll\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.612909 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.612919 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.612927 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83c0758a-d657-4ebb-9a60-98ece921de66-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.612936 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d3fdb31-592c-408e-9968-d8bdcc91a143-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:03 crc kubenswrapper[4756]: I1205 18:03:03.612943 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc9vx\" (UniqueName: \"kubernetes.io/projected/83c0758a-d657-4ebb-9a60-98ece921de66-kube-api-access-bc9vx\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.148939 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" event={"ID":"83c0758a-d657-4ebb-9a60-98ece921de66","Type":"ContainerDied","Data":"824793bb6c739b414db413afc1081c2a07b10e974c1d50f5f79bfbf7573db7c5"} Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.148999 4756 scope.go:117] "RemoveContainer" containerID="9ec0f85a2155a02567e07d9256763d60674021bd10c79e89feb59769d2a59e0f" Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.149099 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-997fb55bd-dxv2n" Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.154225 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7b498658dd-jr2j2" event={"ID":"1d3fdb31-592c-408e-9968-d8bdcc91a143","Type":"ContainerDied","Data":"c5f5d98291bfadb3c69c7b7498b0b3d45a2543380e9f1972811d204fe5d94944"} Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.154309 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7b498658dd-jr2j2" Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.181358 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-997fb55bd-dxv2n"] Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.192270 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-997fb55bd-dxv2n"] Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.229391 4756 scope.go:117] "RemoveContainer" containerID="66c9f830988b6ca4b2a2a551984f2f5b3c1579bdb9df04421daac6754111ce69" Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.233629 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7b498658dd-jr2j2"] Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.254648 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7b498658dd-jr2j2"] Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.688397 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cf757d7d6-ltbqn" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Dec 05 18:03:04 crc kubenswrapper[4756]: I1205 18:03:04.688540 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:03:06 crc kubenswrapper[4756]: I1205 18:03:06.075494 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" path="/var/lib/kubelet/pods/1d3fdb31-592c-408e-9968-d8bdcc91a143/volumes" Dec 05 18:03:06 crc kubenswrapper[4756]: I1205 18:03:06.077073 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" path="/var/lib/kubelet/pods/83c0758a-d657-4ebb-9a60-98ece921de66/volumes" Dec 05 18:03:10 crc kubenswrapper[4756]: I1205 18:03:10.340497 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-c5f6496c-9fc66" Dec 05 18:03:10 crc kubenswrapper[4756]: I1205 18:03:10.405139 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-974fd87b7-7rfqs"] Dec 05 18:03:10 crc kubenswrapper[4756]: I1205 18:03:10.405590 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-974fd87b7-7rfqs" podUID="b6f2c129-6474-48d8-89ba-285d3c589475" containerName="heat-engine" containerID="cri-o://e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" gracePeriod=60 Dec 05 18:03:12 crc kubenswrapper[4756]: E1205 18:03:12.425732 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 
18:03:12 crc kubenswrapper[4756]: E1205 18:03:12.427165 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 18:03:12 crc kubenswrapper[4756]: E1205 18:03:12.428237 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 18:03:12 crc kubenswrapper[4756]: E1205 18:03:12.428269 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-974fd87b7-7rfqs" podUID="b6f2c129-6474-48d8-89ba-285d3c589475" containerName="heat-engine" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.720387 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.806749 4756 scope.go:117] "RemoveContainer" containerID="2256769bb8998f12b83ae149b8da26e39fe2cb55868940676535b29b2f768878" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.816696 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1acc47c0-70db-432a-b627-44a07ac96488-logs\") pod \"1acc47c0-70db-432a-b627-44a07ac96488\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.816991 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-config-data\") pod \"1acc47c0-70db-432a-b627-44a07ac96488\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.817065 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-tls-certs\") pod \"1acc47c0-70db-432a-b627-44a07ac96488\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.817314 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-combined-ca-bundle\") pod \"1acc47c0-70db-432a-b627-44a07ac96488\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.817345 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-scripts\") pod \"1acc47c0-70db-432a-b627-44a07ac96488\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.817385 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2prv2\" (UniqueName: \"kubernetes.io/projected/1acc47c0-70db-432a-b627-44a07ac96488-kube-api-access-2prv2\") pod 
\"1acc47c0-70db-432a-b627-44a07ac96488\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.817464 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-secret-key\") pod \"1acc47c0-70db-432a-b627-44a07ac96488\" (UID: \"1acc47c0-70db-432a-b627-44a07ac96488\") " Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.822799 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1acc47c0-70db-432a-b627-44a07ac96488-logs" (OuterVolumeSpecName: "logs") pod "1acc47c0-70db-432a-b627-44a07ac96488" (UID: "1acc47c0-70db-432a-b627-44a07ac96488"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.837774 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1acc47c0-70db-432a-b627-44a07ac96488-kube-api-access-2prv2" (OuterVolumeSpecName: "kube-api-access-2prv2") pod "1acc47c0-70db-432a-b627-44a07ac96488" (UID: "1acc47c0-70db-432a-b627-44a07ac96488"). InnerVolumeSpecName "kube-api-access-2prv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.841964 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1acc47c0-70db-432a-b627-44a07ac96488" (UID: "1acc47c0-70db-432a-b627-44a07ac96488"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.878272 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-scripts" (OuterVolumeSpecName: "scripts") pod "1acc47c0-70db-432a-b627-44a07ac96488" (UID: "1acc47c0-70db-432a-b627-44a07ac96488"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.879996 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1acc47c0-70db-432a-b627-44a07ac96488" (UID: "1acc47c0-70db-432a-b627-44a07ac96488"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.898719 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-config-data" (OuterVolumeSpecName: "config-data") pod "1acc47c0-70db-432a-b627-44a07ac96488" (UID: "1acc47c0-70db-432a-b627-44a07ac96488"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.902929 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1acc47c0-70db-432a-b627-44a07ac96488" (UID: "1acc47c0-70db-432a-b627-44a07ac96488"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.922153 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2prv2\" (UniqueName: \"kubernetes.io/projected/1acc47c0-70db-432a-b627-44a07ac96488-kube-api-access-2prv2\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.922190 4756 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.922203 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1acc47c0-70db-432a-b627-44a07ac96488-logs\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.922217 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.922231 4756 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.922244 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acc47c0-70db-432a-b627-44a07ac96488-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.922291 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1acc47c0-70db-432a-b627-44a07ac96488-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.950229 4756 scope.go:117] "RemoveContainer" containerID="8776235cf14a56f635b39849b4b508b14b9236b26611f067af0b84b7096fa20c" Dec 05 18:03:14 crc kubenswrapper[4756]: I1205 18:03:14.992752 4756 scope.go:117] "RemoveContainer" containerID="c8d264d298711680548d2f39dc0f632d08f914d6d7147865ef75e46c4809ac0a" Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.288603 4756 generic.go:334] "Generic (PLEG): container finished" podID="1acc47c0-70db-432a-b627-44a07ac96488" containerID="043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1" exitCode=137 Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.288720 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf757d7d6-ltbqn" event={"ID":"1acc47c0-70db-432a-b627-44a07ac96488","Type":"ContainerDied","Data":"043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1"} Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.288769 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cf757d7d6-ltbqn" Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.289011 4756 scope.go:117] "RemoveContainer" containerID="2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd" Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.288996 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf757d7d6-ltbqn" event={"ID":"1acc47c0-70db-432a-b627-44a07ac96488","Type":"ContainerDied","Data":"8c33c45d4920406d42e95e539d380eae7d6e71a9d657752dd09a2ed0a5e3bdae"} Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.360595 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cf757d7d6-ltbqn"] Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.371751 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cf757d7d6-ltbqn"] Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.526812 4756 scope.go:117] "RemoveContainer" containerID="043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1" Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.563928 4756 scope.go:117] "RemoveContainer" containerID="2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd" Dec 05 18:03:15 crc kubenswrapper[4756]: E1205 18:03:15.564826 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd\": container with ID starting with 2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd not found: ID does not exist" containerID="2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd" Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.564861 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd"} err="failed to get container status \"2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd\": rpc error: code = NotFound desc = could not find container \"2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd\": container with ID starting with 2ae2f3dbeef1997982e852eacc2fbee85a043c9468679be980f1e95bffcc84dd not found: ID does not exist" Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.564888 4756 scope.go:117] "RemoveContainer" containerID="043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1" Dec 05 18:03:15 crc kubenswrapper[4756]: E1205 18:03:15.566290 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1\": container with ID starting with 043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1 not found: ID does not exist" containerID="043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1" Dec 05 18:03:15 crc kubenswrapper[4756]: I1205 18:03:15.566342 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1"} err="failed to get container status \"043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1\": rpc error: code = NotFound desc = could not find container \"043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1\": container with ID starting with 043e698988c172165106cc334b83ce51f1a714fce70d32284d54bc1e71d250b1 not found: ID does not exist" Dec 05 18:03:16 crc 
kubenswrapper[4756]: I1205 18:03:16.077576 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1acc47c0-70db-432a-b627-44a07ac96488" path="/var/lib/kubelet/pods/1acc47c0-70db-432a-b627-44a07ac96488/volumes" Dec 05 18:03:18 crc kubenswrapper[4756]: I1205 18:03:18.121922 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 18:03:18 crc kubenswrapper[4756]: I1205 18:03:18.122910 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 18:03:18 crc kubenswrapper[4756]: I1205 18:03:18.122290 4756 patch_prober.go:28] interesting pod/router-default-5444994796-pt8fk container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 18:03:18 crc kubenswrapper[4756]: I1205 18:03:18.123000 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-pt8fk" podUID="6ad5d1b1-4e12-430f-91f1-70e8fd975a09" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 18:03:19 crc kubenswrapper[4756]: I1205 18:03:19.695343 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cf757d7d6-ltbqn" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 18:03:22 crc kubenswrapper[4756]: E1205 18:03:22.432822 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 18:03:22 crc kubenswrapper[4756]: E1205 18:03:22.434577 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 18:03:22 crc kubenswrapper[4756]: E1205 18:03:22.439919 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 18:03:22 crc kubenswrapper[4756]: E1205 18:03:22.439959 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" probeType="Readiness" pod="openstack/heat-engine-974fd87b7-7rfqs" podUID="b6f2c129-6474-48d8-89ba-285d3c589475" containerName="heat-engine" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.394244 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.465596 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data\") pod \"b6f2c129-6474-48d8-89ba-285d3c589475\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.465644 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-combined-ca-bundle\") pod \"b6f2c129-6474-48d8-89ba-285d3c589475\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.465781 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data-custom\") pod \"b6f2c129-6474-48d8-89ba-285d3c589475\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.465850 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddrvr\" (UniqueName: \"kubernetes.io/projected/b6f2c129-6474-48d8-89ba-285d3c589475-kube-api-access-ddrvr\") pod \"b6f2c129-6474-48d8-89ba-285d3c589475\" (UID: \"b6f2c129-6474-48d8-89ba-285d3c589475\") " Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.472012 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f2c129-6474-48d8-89ba-285d3c589475-kube-api-access-ddrvr" (OuterVolumeSpecName: "kube-api-access-ddrvr") pod "b6f2c129-6474-48d8-89ba-285d3c589475" (UID: "b6f2c129-6474-48d8-89ba-285d3c589475"). InnerVolumeSpecName "kube-api-access-ddrvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.483147 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6f2c129-6474-48d8-89ba-285d3c589475" (UID: "b6f2c129-6474-48d8-89ba-285d3c589475"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.503700 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6f2c129-6474-48d8-89ba-285d3c589475" (UID: "b6f2c129-6474-48d8-89ba-285d3c589475"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.511812 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-974fd87b7-7rfqs" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.511850 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-974fd87b7-7rfqs" event={"ID":"b6f2c129-6474-48d8-89ba-285d3c589475","Type":"ContainerDied","Data":"e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085"} Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.511914 4756 scope.go:117] "RemoveContainer" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.511706 4756 generic.go:334] "Generic (PLEG): container finished" podID="b6f2c129-6474-48d8-89ba-285d3c589475" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" exitCode=0 Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.512853 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-974fd87b7-7rfqs" event={"ID":"b6f2c129-6474-48d8-89ba-285d3c589475","Type":"ContainerDied","Data":"61add351ff6d086d79ea525bb25e9cef51037c67f76343c094ca1195bb22389b"} Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.528834 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data" (OuterVolumeSpecName: "config-data") pod "b6f2c129-6474-48d8-89ba-285d3c589475" (UID: "b6f2c129-6474-48d8-89ba-285d3c589475"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.568553 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.568591 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddrvr\" (UniqueName: \"kubernetes.io/projected/b6f2c129-6474-48d8-89ba-285d3c589475-kube-api-access-ddrvr\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.568605 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.568616 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f2c129-6474-48d8-89ba-285d3c589475-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.606823 4756 scope.go:117] "RemoveContainer" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" Dec 05 18:03:27 crc kubenswrapper[4756]: E1205 18:03:27.607297 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085\": container with ID starting with e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085 not found: ID does not exist" containerID="e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.607332 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085"} err="failed to get container status 
\"e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085\": rpc error: code = NotFound desc = could not find container \"e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085\": container with ID starting with e3e6919ebd940ed0f6e04eeced3a82eb40b9aa1f27fd72a129437f27a84ef085 not found: ID does not exist" Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.868927 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-974fd87b7-7rfqs"] Dec 05 18:03:27 crc kubenswrapper[4756]: I1205 18:03:27.884334 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-974fd87b7-7rfqs"] Dec 05 18:03:28 crc kubenswrapper[4756]: I1205 18:03:28.060207 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f2c129-6474-48d8-89ba-285d3c589475" path="/var/lib/kubelet/pods/b6f2c129-6474-48d8-89ba-285d3c589475/volumes" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.045137 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts"] Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046385 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046406 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046449 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046463 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046488 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon-log" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046503 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon-log" Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046519 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046531 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046552 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f2c129-6474-48d8-89ba-285d3c589475" containerName="heat-engine" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046564 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f2c129-6474-48d8-89ba-285d3c589475" containerName="heat-engine" Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046601 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046615 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046654 4756 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046666 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046694 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa0b803-3a8a-4595-ace3-b4f442c478e4" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046705 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa0b803-3a8a-4595-ace3-b4f442c478e4" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: E1205 18:03:36.046724 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.046736 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047112 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon-log" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047138 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047156 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3fdb31-592c-408e-9968-d8bdcc91a143" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047172 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa0b803-3a8a-4595-ace3-b4f442c478e4" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047196 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f2c129-6474-48d8-89ba-285d3c589475" containerName="heat-engine" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047209 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047236 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c0758a-d657-4ebb-9a60-98ece921de66" containerName="heat-cfnapi" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047260 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1acc47c0-70db-432a-b627-44a07ac96488" containerName="horizon" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.047290 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed46aaff-e7a4-42a7-b9d6-8430d9ba12bb" containerName="heat-api" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.050003 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.054396 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.070589 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts"] Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.175561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jnm\" (UniqueName: \"kubernetes.io/projected/894452cc-5e36-4dd4-a985-fb42bab5ca7b-kube-api-access-j9jnm\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.175711 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.175843 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.279148 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.279356 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jnm\" (UniqueName: \"kubernetes.io/projected/894452cc-5e36-4dd4-a985-fb42bab5ca7b-kube-api-access-j9jnm\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.279443 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.279940 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.279961 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.315334 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jnm\" (UniqueName: \"kubernetes.io/projected/894452cc-5e36-4dd4-a985-fb42bab5ca7b-kube-api-access-j9jnm\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.396376 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:36 crc kubenswrapper[4756]: I1205 18:03:36.905064 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts"] Dec 05 18:03:36 crc kubenswrapper[4756]: W1205 18:03:36.909997 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod894452cc_5e36_4dd4_a985_fb42bab5ca7b.slice/crio-b6074a08825cd718bf8c7f38c7eaef2ec4637c07999dc949a7ff1e0d81ea641d WatchSource:0}: Error finding container b6074a08825cd718bf8c7f38c7eaef2ec4637c07999dc949a7ff1e0d81ea641d: Status 404 returned error can't find the container with id b6074a08825cd718bf8c7f38c7eaef2ec4637c07999dc949a7ff1e0d81ea641d Dec 05 18:03:37 crc kubenswrapper[4756]: I1205 18:03:37.633001 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" event={"ID":"894452cc-5e36-4dd4-a985-fb42bab5ca7b","Type":"ContainerStarted","Data":"b6074a08825cd718bf8c7f38c7eaef2ec4637c07999dc949a7ff1e0d81ea641d"} Dec 05 18:03:38 crc kubenswrapper[4756]: I1205 18:03:38.651482 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" event={"ID":"894452cc-5e36-4dd4-a985-fb42bab5ca7b","Type":"ContainerStarted","Data":"3c4850eafe6ce672529435efdc6e1a3effbc2e5986c290a9c27164b665b56e1b"} Dec 05 18:03:39 crc kubenswrapper[4756]: I1205 18:03:39.664639 4756 generic.go:334] "Generic (PLEG): container finished" podID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerID="3c4850eafe6ce672529435efdc6e1a3effbc2e5986c290a9c27164b665b56e1b" exitCode=0 Dec 05 18:03:39 crc kubenswrapper[4756]: I1205 18:03:39.664703 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" event={"ID":"894452cc-5e36-4dd4-a985-fb42bab5ca7b","Type":"ContainerDied","Data":"3c4850eafe6ce672529435efdc6e1a3effbc2e5986c290a9c27164b665b56e1b"} Dec 05 18:03:39 crc 
kubenswrapper[4756]: I1205 18:03:39.667487 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 18:03:42 crc kubenswrapper[4756]: I1205 18:03:42.720166 4756 generic.go:334] "Generic (PLEG): container finished" podID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerID="1f42a54eb9cabfde1b204cc4c39823a85bc813be321b05b8655e292a0edf56e7" exitCode=0
Dec 05 18:03:42 crc kubenswrapper[4756]: I1205 18:03:42.720308 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" event={"ID":"894452cc-5e36-4dd4-a985-fb42bab5ca7b","Type":"ContainerDied","Data":"1f42a54eb9cabfde1b204cc4c39823a85bc813be321b05b8655e292a0edf56e7"}
Dec 05 18:03:43 crc kubenswrapper[4756]: I1205 18:03:43.743923 4756 generic.go:334] "Generic (PLEG): container finished" podID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerID="9ca00fa66da9593ead0ed189491f9a7ec53d5c6fcba862935300a638c00569b6" exitCode=0
Dec 05 18:03:43 crc kubenswrapper[4756]: I1205 18:03:43.744020 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" event={"ID":"894452cc-5e36-4dd4-a985-fb42bab5ca7b","Type":"ContainerDied","Data":"9ca00fa66da9593ead0ed189491f9a7ec53d5c6fcba862935300a638c00569b6"}
Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.212370 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts"
Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.302989 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9jnm\" (UniqueName: \"kubernetes.io/projected/894452cc-5e36-4dd4-a985-fb42bab5ca7b-kube-api-access-j9jnm\") pod \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") "
Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.303173 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-util\") pod \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") "
Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.303345 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-bundle\") pod \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\" (UID: \"894452cc-5e36-4dd4-a985-fb42bab5ca7b\") "
Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.305486 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-bundle" (OuterVolumeSpecName: "bundle") pod "894452cc-5e36-4dd4-a985-fb42bab5ca7b" (UID: "894452cc-5e36-4dd4-a985-fb42bab5ca7b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.312965 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894452cc-5e36-4dd4-a985-fb42bab5ca7b-kube-api-access-j9jnm" (OuterVolumeSpecName: "kube-api-access-j9jnm") pod "894452cc-5e36-4dd4-a985-fb42bab5ca7b" (UID: "894452cc-5e36-4dd4-a985-fb42bab5ca7b"). InnerVolumeSpecName "kube-api-access-j9jnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
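The UnmountVolume / TearDown sequence above is the kubelet volume manager's reconciler at work: it repeatedly diffs a desired state of world (what should be mounted) against an actual state of world (what is mounted) and tears down anything still mounted for a pod that no longer exists; failures are simply retried on the next pass. A minimal sketch of that diff-and-retry pattern, with illustrative names rather than kubelet's real types:

```go
package main

import "fmt"

// reconcile tears down volumes that are in the actual state of the world but
// no longer in the desired state -- the diff behind the UnmountVolume entries.
func reconcile(desired, actual map[string]bool, tearDown func(string) error) {
	for vol := range actual {
		if desired[vol] {
			continue // still wanted; leave it mounted
		}
		fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", vol)
		if err := tearDown(vol); err != nil {
			fmt.Printf("UnmountVolume failed for %q: %v (will retry)\n", vol, err)
			continue
		}
		delete(actual, vol)
		fmt.Printf("Volume detached for volume %q\n", vol)
	}
}

func main() {
	// After the pod is deleted, nothing is desired, so every mount is torn down.
	actual := map[string]bool{"bundle": true, "util": true, "kube-api-access-j9jnm": true}
	reconcile(map[string]bool{}, actual, func(vol string) error {
		fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", vol)
		return nil
	})
}
```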
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.325021 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-util" (OuterVolumeSpecName: "util") pod "894452cc-5e36-4dd4-a985-fb42bab5ca7b" (UID: "894452cc-5e36-4dd4-a985-fb42bab5ca7b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.406824 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.406878 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9jnm\" (UniqueName: \"kubernetes.io/projected/894452cc-5e36-4dd4-a985-fb42bab5ca7b-kube-api-access-j9jnm\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.406900 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/894452cc-5e36-4dd4-a985-fb42bab5ca7b-util\") on node \"crc\" DevicePath \"\"" Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.776187 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" event={"ID":"894452cc-5e36-4dd4-a985-fb42bab5ca7b","Type":"ContainerDied","Data":"b6074a08825cd718bf8c7f38c7eaef2ec4637c07999dc949a7ff1e0d81ea641d"} Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.776268 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6074a08825cd718bf8c7f38c7eaef2ec4637c07999dc949a7ff1e0d81ea641d" Dec 05 18:03:45 crc kubenswrapper[4756]: I1205 18:03:45.776270 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts" Dec 05 18:03:47 crc kubenswrapper[4756]: I1205 18:03:47.067572 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-hngjr"] Dec 05 18:03:47 crc kubenswrapper[4756]: I1205 18:03:47.087571 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b531-account-create-update-kk8lh"] Dec 05 18:03:47 crc kubenswrapper[4756]: I1205 18:03:47.098438 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b531-account-create-update-kk8lh"] Dec 05 18:03:47 crc kubenswrapper[4756]: I1205 18:03:47.109915 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-hngjr"] Dec 05 18:03:49 crc kubenswrapper[4756]: I1205 18:03:49.009577 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7caec4ac-8162-4d47-ac4a-e1372071db96" path="/var/lib/kubelet/pods/7caec4ac-8162-4d47-ac4a-e1372071db96/volumes" Dec 05 18:03:49 crc kubenswrapper[4756]: I1205 18:03:49.014281 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f897b315-f80b-4353-b12f-83409ab6bb17" path="/var/lib/kubelet/pods/f897b315-f80b-4353-b12f-83409ab6bb17/volumes" Dec 05 18:03:52 crc kubenswrapper[4756]: I1205 18:03:52.295466 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:03:52 crc kubenswrapper[4756]: I1205 18:03:52.295945 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.038152 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-n52f6"] Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.047352 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-n52f6"] Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.613802 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8"] Dec 05 18:03:55 crc kubenswrapper[4756]: E1205 18:03:55.614395 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerName="util" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.614427 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerName="util" Dec 05 18:03:55 crc kubenswrapper[4756]: E1205 18:03:55.614460 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerName="extract" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.614467 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerName="extract" Dec 05 18:03:55 crc kubenswrapper[4756]: E1205 18:03:55.614487 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerName="pull" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.614493 4756 
state_mem.go:107] "Deleted CPUSet assignment" podUID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerName="pull" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.614703 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="894452cc-5e36-4dd4-a985-fb42bab5ca7b" containerName="extract" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.615363 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.617985 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.618101 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.620193 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-l2r8l" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.639987 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8"] Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.720831 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qmhb\" (UniqueName: \"kubernetes.io/projected/f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7-kube-api-access-9qmhb\") pod \"obo-prometheus-operator-668cf9dfbb-4d4k8\" (UID: \"f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.732785 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7"] Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.734004 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.737165 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.748726 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-dtfv4" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.753819 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj"] Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.755304 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.773635 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7"] Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.791233 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj"] Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.822542 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-522g7\" (UID: \"5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.822583 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-522g7\" (UID: \"5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.822634 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qmhb\" (UniqueName: \"kubernetes.io/projected/f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7-kube-api-access-9qmhb\") pod \"obo-prometheus-operator-668cf9dfbb-4d4k8\" (UID: \"f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.839301 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qmhb\" (UniqueName: \"kubernetes.io/projected/f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7-kube-api-access-9qmhb\") pod \"obo-prometheus-operator-668cf9dfbb-4d4k8\" (UID: \"f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.924131 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-44cdj\" (UID: \"d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.924218 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-522g7\" (UID: \"5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.924284 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-bdc588f9-522g7\" (UID: \"5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.925399 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-44cdj\" (UID: \"d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.930824 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-522g7\" (UID: \"5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.930935 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-522g7\" (UID: \"5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.943249 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.957760 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mxr46"] Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.962161 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.966702 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-wxgzw" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.967071 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 05 18:03:55 crc kubenswrapper[4756]: I1205 18:03:55.983134 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mxr46"] Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.026819 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-44cdj\" (UID: \"d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.027163 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-44cdj\" (UID: \"d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.031335 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-44cdj\" (UID: \"d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.044277 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bdc588f9-44cdj\" (UID: \"d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.048936 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.083840 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.100908 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab62b10-42d1-4699-adcb-9412bb44b37c" path="/var/lib/kubelet/pods/8ab62b10-42d1-4699-adcb-9412bb44b37c/volumes" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.120923 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-mxdgp"] Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.137712 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.140839 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2304ca15-1116-4f4e-ab8e-ab04484010a3-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mxr46\" (UID: \"2304ca15-1116-4f4e-ab8e-ab04484010a3\") " pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.141324 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltfzl\" (UniqueName: \"kubernetes.io/projected/2304ca15-1116-4f4e-ab8e-ab04484010a3-kube-api-access-ltfzl\") pod \"observability-operator-d8bb48f5d-mxr46\" (UID: \"2304ca15-1116-4f4e-ab8e-ab04484010a3\") " pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.145512 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-t526t" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.145716 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-mxdgp"] Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.244634 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2304ca15-1116-4f4e-ab8e-ab04484010a3-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mxr46\" (UID: \"2304ca15-1116-4f4e-ab8e-ab04484010a3\") " pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.244681 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6174fba-31f8-4814-9c46-1ef40e855c39-openshift-service-ca\") pod \"perses-operator-5446b9c989-mxdgp\" (UID: \"a6174fba-31f8-4814-9c46-1ef40e855c39\") " pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.244838 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltfzl\" (UniqueName: \"kubernetes.io/projected/2304ca15-1116-4f4e-ab8e-ab04484010a3-kube-api-access-ltfzl\") pod \"observability-operator-d8bb48f5d-mxr46\" (UID: \"2304ca15-1116-4f4e-ab8e-ab04484010a3\") " pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.244869 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndmg5\" (UniqueName: \"kubernetes.io/projected/a6174fba-31f8-4814-9c46-1ef40e855c39-kube-api-access-ndmg5\") pod \"perses-operator-5446b9c989-mxdgp\" (UID: \"a6174fba-31f8-4814-9c46-1ef40e855c39\") " pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.257602 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2304ca15-1116-4f4e-ab8e-ab04484010a3-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mxr46\" (UID: \"2304ca15-1116-4f4e-ab8e-ab04484010a3\") " pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 
18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.259732 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltfzl\" (UniqueName: \"kubernetes.io/projected/2304ca15-1116-4f4e-ab8e-ab04484010a3-kube-api-access-ltfzl\") pod \"observability-operator-d8bb48f5d-mxr46\" (UID: \"2304ca15-1116-4f4e-ab8e-ab04484010a3\") " pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.347475 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndmg5\" (UniqueName: \"kubernetes.io/projected/a6174fba-31f8-4814-9c46-1ef40e855c39-kube-api-access-ndmg5\") pod \"perses-operator-5446b9c989-mxdgp\" (UID: \"a6174fba-31f8-4814-9c46-1ef40e855c39\") " pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.347777 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6174fba-31f8-4814-9c46-1ef40e855c39-openshift-service-ca\") pod \"perses-operator-5446b9c989-mxdgp\" (UID: \"a6174fba-31f8-4814-9c46-1ef40e855c39\") " pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.348721 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6174fba-31f8-4814-9c46-1ef40e855c39-openshift-service-ca\") pod \"perses-operator-5446b9c989-mxdgp\" (UID: \"a6174fba-31f8-4814-9c46-1ef40e855c39\") " pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.375188 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndmg5\" (UniqueName: \"kubernetes.io/projected/a6174fba-31f8-4814-9c46-1ef40e855c39-kube-api-access-ndmg5\") pod \"perses-operator-5446b9c989-mxdgp\" (UID: \"a6174fba-31f8-4814-9c46-1ef40e855c39\") " pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.411671 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.511996 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.628475 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8"] Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.820146 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7"] Dec 05 18:03:56 crc kubenswrapper[4756]: W1205 18:03:56.843251 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b206ae_0b75_402a_aea8_f3d6eaa6a7e7.slice/crio-3bb1a3103b89af7e7832df32f3c634520023a6ef5a720088a391ee8d1218da78 WatchSource:0}: Error finding container 3bb1a3103b89af7e7832df32f3c634520023a6ef5a720088a391ee8d1218da78: Status 404 returned error can't find the container with id 3bb1a3103b89af7e7832df32f3c634520023a6ef5a720088a391ee8d1218da78 Dec 05 18:03:56 crc kubenswrapper[4756]: I1205 18:03:56.844727 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj"] Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.127436 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8" event={"ID":"f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7","Type":"ContainerStarted","Data":"4a94445c51447e527c937c1dff18ce2ef36843fa61dcd7e88db59870cf272038"} Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.129042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" event={"ID":"5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2","Type":"ContainerStarted","Data":"92f39e351820f10436a85f1dd583bf7ce257a49b671400972b98b681372ca28e"} Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.130522 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" event={"ID":"d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7","Type":"ContainerStarted","Data":"3bb1a3103b89af7e7832df32f3c634520023a6ef5a720088a391ee8d1218da78"} Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.231661 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-mxdgp"] Dec 05 18:03:57 crc kubenswrapper[4756]: W1205 18:03:57.241624 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2304ca15_1116_4f4e_ab8e_ab04484010a3.slice/crio-e6d31b887f3ed89a5ed6715fc2a70d9401cdc4ca5165eb80d4da611612725485 WatchSource:0}: Error finding container e6d31b887f3ed89a5ed6715fc2a70d9401cdc4ca5165eb80d4da611612725485: Status 404 returned error can't find the container with id e6d31b887f3ed89a5ed6715fc2a70d9401cdc4ca5165eb80d4da611612725485 Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.247214 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mxr46"] Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.319681 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b5pxz"] Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.322788 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.340315 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5pxz"] Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.380333 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-utilities\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.380478 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pcxd\" (UniqueName: \"kubernetes.io/projected/e140f75d-f032-431b-9cd8-3690958a676a-kube-api-access-8pcxd\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.380587 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-catalog-content\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.482921 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-utilities\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.483014 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pcxd\" (UniqueName: \"kubernetes.io/projected/e140f75d-f032-431b-9cd8-3690958a676a-kube-api-access-8pcxd\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.483115 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-catalog-content\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.483687 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-catalog-content\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.483685 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-utilities\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.514379 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8pcxd\" (UniqueName: \"kubernetes.io/projected/e140f75d-f032-431b-9cd8-3690958a676a-kube-api-access-8pcxd\") pod \"redhat-marketplace-b5pxz\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") " pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:57 crc kubenswrapper[4756]: I1205 18:03:57.655996 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5pxz" Dec 05 18:03:58 crc kubenswrapper[4756]: I1205 18:03:58.162355 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-mxdgp" event={"ID":"a6174fba-31f8-4814-9c46-1ef40e855c39","Type":"ContainerStarted","Data":"cf3ffa4d66f046a95a9b1086922bd492b62e6d5f5fd58cc4b6bf25ef9f7fb274"} Dec 05 18:03:58 crc kubenswrapper[4756]: I1205 18:03:58.182225 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" event={"ID":"2304ca15-1116-4f4e-ab8e-ab04484010a3","Type":"ContainerStarted","Data":"e6d31b887f3ed89a5ed6715fc2a70d9401cdc4ca5165eb80d4da611612725485"} Dec 05 18:03:58 crc kubenswrapper[4756]: I1205 18:03:58.185972 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5pxz"] Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.220150 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5pxz" event={"ID":"e140f75d-f032-431b-9cd8-3690958a676a","Type":"ContainerStarted","Data":"ea3b6cf08469dcad618391678f3eec25c17561d169b42b88cad6f3f2e2c0673c"} Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.571914 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xz5bc"] Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.574128 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.585506 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xz5bc"] Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.650267 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-utilities\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.650355 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-catalog-content\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.650543 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktc8w\" (UniqueName: \"kubernetes.io/projected/60095cb4-2d1a-4350-892a-701be6b61b87-kube-api-access-ktc8w\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.752331 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-utilities\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.752405 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-catalog-content\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.752540 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktc8w\" (UniqueName: \"kubernetes.io/projected/60095cb4-2d1a-4350-892a-701be6b61b87-kube-api-access-ktc8w\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.753287 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-utilities\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.753500 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-catalog-content\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.774595 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ktc8w\" (UniqueName: \"kubernetes.io/projected/60095cb4-2d1a-4350-892a-701be6b61b87-kube-api-access-ktc8w\") pod \"community-operators-xz5bc\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") " pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:03:59 crc kubenswrapper[4756]: I1205 18:03:59.910040 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xz5bc" Dec 05 18:04:00 crc kubenswrapper[4756]: I1205 18:04:00.236011 4756 generic.go:334] "Generic (PLEG): container finished" podID="e140f75d-f032-431b-9cd8-3690958a676a" containerID="b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46" exitCode=0 Dec 05 18:04:00 crc kubenswrapper[4756]: I1205 18:04:00.236268 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5pxz" event={"ID":"e140f75d-f032-431b-9cd8-3690958a676a","Type":"ContainerDied","Data":"b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46"} Dec 05 18:04:01 crc kubenswrapper[4756]: I1205 18:04:01.024381 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xz5bc"] Dec 05 18:04:01 crc kubenswrapper[4756]: W1205 18:04:01.047081 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60095cb4_2d1a_4350_892a_701be6b61b87.slice/crio-039e6aa1d177dfc6caae630faadf684e3ca5bf7542444d4855cf947a890d41e5 WatchSource:0}: Error finding container 039e6aa1d177dfc6caae630faadf684e3ca5bf7542444d4855cf947a890d41e5: Status 404 returned error can't find the container with id 039e6aa1d177dfc6caae630faadf684e3ca5bf7542444d4855cf947a890d41e5 Dec 05 18:04:01 crc kubenswrapper[4756]: I1205 18:04:01.254542 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xz5bc" event={"ID":"60095cb4-2d1a-4350-892a-701be6b61b87","Type":"ContainerStarted","Data":"039e6aa1d177dfc6caae630faadf684e3ca5bf7542444d4855cf947a890d41e5"} Dec 05 18:04:03 crc kubenswrapper[4756]: I1205 18:04:03.279573 4756 generic.go:334] "Generic (PLEG): container finished" podID="60095cb4-2d1a-4350-892a-701be6b61b87" containerID="3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf" exitCode=0 Dec 05 18:04:03 crc kubenswrapper[4756]: I1205 18:04:03.279771 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xz5bc" event={"ID":"60095cb4-2d1a-4350-892a-701be6b61b87","Type":"ContainerDied","Data":"3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf"} Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.350540 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-mxdgp" event={"ID":"a6174fba-31f8-4814-9c46-1ef40e855c39","Type":"ContainerStarted","Data":"328b20003914db501b8cccdcc9045ccd10cae81881b70d40d37700cf7b6c90eb"} Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.351118 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-mxdgp" Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.355108 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-44cdj" 
event={"ID":"d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7","Type":"ContainerStarted","Data":"dc430180dafaac48d60b4818c33d5173fafd1734861be0c195ac1813b3f24d18"} Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.358166 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8" event={"ID":"f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7","Type":"ContainerStarted","Data":"57dbf8b29cf69511a70a137d54f2a68881a658fec5ad925557936faa7de77734"} Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.361146 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5pxz" event={"ID":"e140f75d-f032-431b-9cd8-3690958a676a","Type":"ContainerStarted","Data":"976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae"} Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.363256 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" event={"ID":"5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2","Type":"ContainerStarted","Data":"9e45ef905f38d2eaf85b45961f3ba858b89e893da783e11563f4a21c7e73ff1b"} Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.365570 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xz5bc" event={"ID":"60095cb4-2d1a-4350-892a-701be6b61b87","Type":"ContainerStarted","Data":"e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1"} Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.370058 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" event={"ID":"2304ca15-1116-4f4e-ab8e-ab04484010a3","Type":"ContainerStarted","Data":"15365d63c7db7d146d3fdfe156ec45457dea410cdc4f8a502c76a6ee4e7300fd"} Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.370677 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.373639 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.376928 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-mxdgp" podStartSLOduration=2.803581033 podStartE2EDuration="13.376912277s" podCreationTimestamp="2025-12-05 18:03:56 +0000 UTC" firstStartedPulling="2025-12-05 18:03:57.236310395 +0000 UTC m=+6497.474983897" lastFinishedPulling="2025-12-05 18:04:07.809641609 +0000 UTC m=+6508.048315141" observedRunningTime="2025-12-05 18:04:09.365104123 +0000 UTC m=+6509.603777635" watchObservedRunningTime="2025-12-05 18:04:09.376912277 +0000 UTC m=+6509.615585799" Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.427198 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-4d4k8" podStartSLOduration=3.256873864 podStartE2EDuration="14.427176436s" podCreationTimestamp="2025-12-05 18:03:55 +0000 UTC" firstStartedPulling="2025-12-05 18:03:56.633601464 +0000 UTC m=+6496.872274986" lastFinishedPulling="2025-12-05 18:04:07.803904046 +0000 UTC m=+6508.042577558" observedRunningTime="2025-12-05 18:04:09.402094138 +0000 UTC m=+6509.640767660" watchObservedRunningTime="2025-12-05 18:04:09.427176436 +0000 UTC m=+6509.665849948" Dec 05 18:04:09 crc kubenswrapper[4756]: 
Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.454899 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bdc588f9-522g7" podStartSLOduration=3.536280875 podStartE2EDuration="14.454878674s" podCreationTimestamp="2025-12-05 18:03:55 +0000 UTC" firstStartedPulling="2025-12-05 18:03:56.833140678 +0000 UTC m=+6497.071814190" lastFinishedPulling="2025-12-05 18:04:07.751738477 +0000 UTC m=+6507.990411989" observedRunningTime="2025-12-05 18:04:09.440199153 +0000 UTC m=+6509.678872665" watchObservedRunningTime="2025-12-05 18:04:09.454878674 +0000 UTC m=+6509.693552186"
Dec 05 18:04:09 crc kubenswrapper[4756]: I1205 18:04:09.514978 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-mxr46" podStartSLOduration=3.949015386 podStartE2EDuration="14.514949933s" podCreationTimestamp="2025-12-05 18:03:55 +0000 UTC" firstStartedPulling="2025-12-05 18:03:57.244143864 +0000 UTC m=+6497.482817376" lastFinishedPulling="2025-12-05 18:04:07.810078411 +0000 UTC m=+6508.048751923" observedRunningTime="2025-12-05 18:04:09.505979654 +0000 UTC m=+6509.744653186" watchObservedRunningTime="2025-12-05 18:04:09.514949933 +0000 UTC m=+6509.753623455"
Dec 05 18:04:11 crc kubenswrapper[4756]: I1205 18:04:11.401038 4756 generic.go:334] "Generic (PLEG): container finished" podID="e140f75d-f032-431b-9cd8-3690958a676a" containerID="976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae" exitCode=0
Dec 05 18:04:11 crc kubenswrapper[4756]: I1205 18:04:11.401160 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5pxz" event={"ID":"e140f75d-f032-431b-9cd8-3690958a676a","Type":"ContainerDied","Data":"976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae"}
Dec 05 18:04:12 crc kubenswrapper[4756]: E1205 18:04:12.272770 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60095cb4_2d1a_4350_892a_701be6b61b87.slice/crio-conmon-e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 18:04:12 crc kubenswrapper[4756]: I1205 18:04:12.413824 4756 generic.go:334] "Generic (PLEG): container finished" podID="60095cb4-2d1a-4350-892a-701be6b61b87" containerID="e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1" exitCode=0
Dec 05 18:04:12 crc kubenswrapper[4756]: I1205 18:04:12.413870 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xz5bc" event={"ID":"60095cb4-2d1a-4350-892a-701be6b61b87","Type":"ContainerDied","Data":"e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1"}
Dec 05 18:04:14 crc kubenswrapper[4756]: I1205 18:04:14.433595 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5pxz" event={"ID":"e140f75d-f032-431b-9cd8-3690958a676a","Type":"ContainerStarted","Data":"a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261"}
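The Generic (PLEG) lines above come from the pod lifecycle event generator, which periodically relists containers from the runtime and diffs the snapshot against the previous one; the resulting ContainerStarted/ContainerDied events are what SyncLoop reacts to. A toy version of that diff (the event names follow the log; the state model is illustrative):

```go
package main

import "fmt"

type state string

const (
	running state = "running"
	exited  state = "exited"
)

// relist diffs the previous and current container snapshots and emits the
// lifecycle events that show up as "SyncLoop (PLEG): event for pod" lines.
func relist(prev, curr map[string]state) []string {
	var events []string
	for id, s := range curr {
		switch was := prev[id]; {
		case was != running && s == running:
			events = append(events, "ContainerStarted "+id)
		case was == running && s != running:
			events = append(events, "ContainerDied "+id)
		}
	}
	return events
}

func main() {
	prev := map[string]state{"e9b65f74": running}
	curr := map[string]state{"e9b65f74": exited, "6b3f205f": running}
	for _, e := range relist(prev, curr) {
		fmt.Println(e) // ContainerDied e9b65f74 and ContainerStarted 6b3f205f
	}
}
```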
Dec 05 18:04:14 crc kubenswrapper[4756]: I1205 18:04:14.436706 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xz5bc" event={"ID":"60095cb4-2d1a-4350-892a-701be6b61b87","Type":"ContainerStarted","Data":"6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388"}
Dec 05 18:04:14 crc kubenswrapper[4756]: I1205 18:04:14.468532 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b5pxz" podStartSLOduration=6.488900512 podStartE2EDuration="17.468512815s" podCreationTimestamp="2025-12-05 18:03:57 +0000 UTC" firstStartedPulling="2025-12-05 18:04:02.518963589 +0000 UTC m=+6502.757637102" lastFinishedPulling="2025-12-05 18:04:13.498575893 +0000 UTC m=+6513.737249405" observedRunningTime="2025-12-05 18:04:14.451406039 +0000 UTC m=+6514.690079551" watchObservedRunningTime="2025-12-05 18:04:14.468512815 +0000 UTC m=+6514.707186327"
Dec 05 18:04:14 crc kubenswrapper[4756]: I1205 18:04:14.476196 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xz5bc" podStartSLOduration=5.815656245 podStartE2EDuration="15.476179839s" podCreationTimestamp="2025-12-05 18:03:59 +0000 UTC" firstStartedPulling="2025-12-05 18:04:03.873061812 +0000 UTC m=+6504.111735324" lastFinishedPulling="2025-12-05 18:04:13.533585366 +0000 UTC m=+6513.772258918" observedRunningTime="2025-12-05 18:04:14.467127298 +0000 UTC m=+6514.705800810" watchObservedRunningTime="2025-12-05 18:04:14.476179839 +0000 UTC m=+6514.714853351"
Dec 05 18:04:15 crc kubenswrapper[4756]: I1205 18:04:15.179073 4756 scope.go:117] "RemoveContainer" containerID="14507b4fabfd51b9d18a6b9011a932bcb597dcbb353982c9b5a04d1d067abac8"
Dec 05 18:04:15 crc kubenswrapper[4756]: I1205 18:04:15.248532 4756 scope.go:117] "RemoveContainer" containerID="d2c8b2fcdcb79d1b70377ef1af6c3e4e7ed49c6e3ebcf669f7f9b255d4e13df3"
Dec 05 18:04:15 crc kubenswrapper[4756]: I1205 18:04:15.303702 4756 scope.go:117] "RemoveContainer" containerID="0d986c55807f81424b35fb0eb146aec0ec4574f65ced380b93d72823a3eeb3c0"
Dec 05 18:04:16 crc kubenswrapper[4756]: I1205 18:04:16.519103 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-mxdgp"
Dec 05 18:04:17 crc kubenswrapper[4756]: I1205 18:04:17.656861 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b5pxz"
Dec 05 18:04:17 crc kubenswrapper[4756]: I1205 18:04:17.656969 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b5pxz"
Dec 05 18:04:17 crc kubenswrapper[4756]: I1205 18:04:17.782476 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b5pxz"
Dec 05 18:04:18 crc kubenswrapper[4756]: I1205 18:04:18.563678 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b5pxz"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.592845 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.593354 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" containerName="openstackclient" containerID="cri-o://4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9" gracePeriod=2
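"Killing container with a grace period ... gracePeriod=2" means the runtime (CRI-O here) delivers SIGTERM to the openstackclient container and follows up with SIGKILL if it is still running two seconds later. A process that wants a clean exit must finish its shutdown inside that budget; a common Go pattern for doing so (entirely illustrative, not taken from openstackclient):

```go
package main

import (
	"context"
	"fmt"
	"os"
	"os/signal"
	"syscall"
	"time"
)

func main() {
	// ctx is cancelled when SIGTERM arrives -- the signal the runtime sends first.
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGTERM, os.Interrupt)
	defer stop()

	fmt.Println("running; waiting for SIGTERM")
	<-ctx.Done()

	// Finish well inside the grace period (2s in the entry above), because
	// the runtime follows up with SIGKILL once it expires.
	shutdownCtx, cancel := context.WithTimeout(context.Background(), 1500*time.Millisecond)
	defer cancel()
	if err := cleanup(shutdownCtx); err != nil {
		fmt.Fprintln(os.Stderr, "cleanup cut short:", err)
		os.Exit(1)
	}
}

// cleanup stands in for flushing buffers, closing connections, and so on.
func cleanup(ctx context.Context) error {
	select {
	case <-time.After(200 * time.Millisecond):
		return nil
	case <-ctx.Done():
		return ctx.Err()
	}
}
```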
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.593354 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" containerName="openstackclient" containerID="cri-o://4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9" gracePeriod=2
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.608699 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.687481 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 05 18:04:19 crc kubenswrapper[4756]: E1205 18:04:19.687974 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" containerName="openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.687991 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" containerName="openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.688199 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" containerName="openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.689014 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.697947 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.700337 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.860272 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.860675 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.860706 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config-secret\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.860801 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-662pv\" (UniqueName: \"kubernetes.io/projected/9727cd88-b0ba-4d3c-a639-0ff7d7686194-kube-api-access-662pv\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.875612 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
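
The DELETE/REMOVE/ADD sequence above shows openstack/openstackclient being deleted through the API and immediately recreated under a new UID, with the old container killed under a two-second grace period. A client-go sketch of issuing such a deletion; the kubeconfig handling is an assumption, and the log does not show whether gracePeriod=2 came from the delete request or the pod spec:

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from the default kubeconfig location (assumption).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	grace := int64(2) // mirrors gracePeriod=2 in the kuberuntime_container.go line
	if err := cs.CoreV1().Pods("openstack").Delete(context.TODO(), "openstackclient",
		metav1.DeleteOptions{GracePeriodSeconds: &grace}); err != nil {
		panic(err)
	}
}
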
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.877032 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.890876 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9wxxr"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.896596 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.910213 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xz5bc"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.910796 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xz5bc"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.965701 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.965794 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.965827 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config-secret\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.965914 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-662pv\" (UniqueName: \"kubernetes.io/projected/9727cd88-b0ba-4d3c-a639-0ff7d7686194-kube-api-access-662pv\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.967444 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.978193 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:19 crc kubenswrapper[4756]: I1205 18:04:19.985867 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config-secret\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.001434 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-662pv\" (UniqueName: \"kubernetes.io/projected/9727cd88-b0ba-4d3c-a639-0ff7d7686194-kube-api-access-662pv\") pod \"openstackclient\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") " pod="openstack/openstackclient"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.014726 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.038394 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xz5bc"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.067767 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9bjh\" (UniqueName: \"kubernetes.io/projected/73bd33f1-e4bf-454f-9a7c-bff0564e02dc-kube-api-access-f9bjh\") pod \"kube-state-metrics-0\" (UID: \"73bd33f1-e4bf-454f-9a7c-bff0564e02dc\") " pod="openstack/kube-state-metrics-0"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.170158 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9bjh\" (UniqueName: \"kubernetes.io/projected/73bd33f1-e4bf-454f-9a7c-bff0564e02dc-kube-api-access-f9bjh\") pod \"kube-state-metrics-0\" (UID: \"73bd33f1-e4bf-454f-9a7c-bff0564e02dc\") " pod="openstack/kube-state-metrics-0"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.223552 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9bjh\" (UniqueName: \"kubernetes.io/projected/73bd33f1-e4bf-454f-9a7c-bff0564e02dc-kube-api-access-f9bjh\") pod \"kube-state-metrics-0\" (UID: \"73bd33f1-e4bf-454f-9a7c-bff0564e02dc\") " pod="openstack/kube-state-metrics-0"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.342586 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5pxz"]
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.513809 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.608805 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xz5bc"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.918980 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
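
The reconciler_common.go lines above enumerate four volumes for the replacement openstackclient pod: two secrets, a configmap, and a projected token. A sketch of a volume section that would produce these mounts, using k8s.io/api types; the Secret and ConfigMap source-object names are assumptions, since the log records only volume names and plugin kinds:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Volumes inferred from the reconciler entries; source object names are
	// hypothetical (the log shows only kubernetes.io/secret, configmap, projected).
	vols := []corev1.Volume{
		{Name: "combined-ca-bundle", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "combined-ca-bundle"}}},
		{Name: "openstack-config", VolumeSource: corev1.VolumeSource{
			ConfigMap: &corev1.ConfigMapVolumeSource{
				LocalObjectReference: corev1.LocalObjectReference{Name: "openstack-config"}}}},
		{Name: "openstack-config-secret", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "openstack-config-secret"}}},
		// kube-api-access-662pv is the kubelet-injected projected token volume
		// and would not appear in a user-written manifest.
	}
	for _, v := range vols {
		fmt.Println(v.Name)
	}
}
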
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.930946 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.942451 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.942657 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.942761 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.942861 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-n2pgg"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.942969 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Dec 05 18:04:20 crc kubenswrapper[4756]: I1205 18:04:20.989933 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Dec 05 18:04:21 crc kubenswrapper[4756]: W1205 18:04:21.095944 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9727cd88_b0ba_4d3c_a639_0ff7d7686194.slice/crio-83e2b1ab205919ea55d1c6f04abe91c107f8c162fd26203a4eb1a9f11b5bcefb WatchSource:0}: Error finding container 83e2b1ab205919ea55d1c6f04abe91c107f8c162fd26203a4eb1a9f11b5bcefb: Status 404 returned error can't find the container with id 83e2b1ab205919ea55d1c6f04abe91c107f8c162fd26203a4eb1a9f11b5bcefb
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.100377 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.122988 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.123029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqwts\" (UniqueName: \"kubernetes.io/projected/2421f4e9-7c0c-4600-bc22-54c6802397b2-kube-api-access-fqwts\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.123159 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2421f4e9-7c0c-4600-bc22-54c6802397b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.123890 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.123963 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2421f4e9-7c0c-4600-bc22-54c6802397b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.123988 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.124009 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2421f4e9-7c0c-4600-bc22-54c6802397b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.227254 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.227298 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqwts\" (UniqueName: \"kubernetes.io/projected/2421f4e9-7c0c-4600-bc22-54c6802397b2-kube-api-access-fqwts\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.227345 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2421f4e9-7c0c-4600-bc22-54c6802397b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.227411 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.227451 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2421f4e9-7c0c-4600-bc22-54c6802397b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.227471 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.227485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2421f4e9-7c0c-4600-bc22-54c6802397b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.227907 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2421f4e9-7c0c-4600-bc22-54c6802397b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.234497 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2421f4e9-7c0c-4600-bc22-54c6802397b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.236719 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.237860 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.266301 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2421f4e9-7c0c-4600-bc22-54c6802397b2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.273353 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2421f4e9-7c0c-4600-bc22-54c6802397b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.273947 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqwts\" (UniqueName: \"kubernetes.io/projected/2421f4e9-7c0c-4600-bc22-54c6802397b2-kube-api-access-fqwts\") pod \"alertmanager-metric-storage-0\" (UID: \"2421f4e9-7c0c-4600-bc22-54c6802397b2\") " pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.345848 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: W1205 18:04:21.433043 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73bd33f1_e4bf_454f_9a7c_bff0564e02dc.slice/crio-e03fce4436860dd992dbc42961763a17494084adb95aa2d25ab6f4cf0959acac WatchSource:0}: Error finding container e03fce4436860dd992dbc42961763a17494084adb95aa2d25ab6f4cf0959acac: Status 404 returned error can't find the container with id e03fce4436860dd992dbc42961763a17494084adb95aa2d25ab6f4cf0959acac
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.436144 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.466573 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.470273 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.482979 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-lm5dm"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.483174 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.483184 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.483290 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.483401 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.484871 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.486113 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.544988 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9727cd88-b0ba-4d3c-a639-0ff7d7686194","Type":"ContainerStarted","Data":"290708e9f9981fcc806436ab7c45c5672be3f19fb9ef6d73fd0ce0e49c2a683a"}
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.545053 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9727cd88-b0ba-4d3c-a639-0ff7d7686194","Type":"ContainerStarted","Data":"83e2b1ab205919ea55d1c6f04abe91c107f8c162fd26203a4eb1a9f11b5bcefb"}
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.547133 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b5pxz" podUID="e140f75d-f032-431b-9cd8-3690958a676a" containerName="registry-server" containerID="cri-o://a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261" gracePeriod=2
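
The reflector.go "Caches populated" lines above show the kubelet filling namespace-scoped watch caches for each Secret and ConfigMap the new pods reference (internally the kubelet uses dedicated per-object reflectors). A rough client-go equivalent that watches Secrets in the openstack namespace through the shared informer factory; the resync interval and event handling are assumptions:

package main

import (
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// Namespace-scoped factory, mirroring the object-"openstack"/... cache lines.
	factory := informers.NewSharedInformerFactoryWithOptions(
		cs, 10*time.Minute, informers.WithNamespace("openstack"))
	inf := factory.Core().V1().Secrets().Informer()
	inf.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			fmt.Println("cache populated for", obj.(*corev1.Secret).Name)
		},
	})

	stop := make(chan struct{})
	factory.Start(stop)
	cache.WaitForCacheSync(stop, inf.HasSynced) // analogous to "Caches populated"
	<-stop
}
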
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.547355 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"73bd33f1-e4bf-454f-9a7c-bff0564e02dc","Type":"ContainerStarted","Data":"e03fce4436860dd992dbc42961763a17494084adb95aa2d25ab6f4cf0959acac"}
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.562171 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.5621558589999998 podStartE2EDuration="2.562155859s" podCreationTimestamp="2025-12-05 18:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:04:21.560219018 +0000 UTC m=+6521.798892530" watchObservedRunningTime="2025-12-05 18:04:21.562155859 +0000 UTC m=+6521.800829371"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.637502 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.637778 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxslj\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-kube-api-access-rxslj\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.637830 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.637859 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.637948 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.638018 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69c44af2-cb75-43ba-9551-98cf81bcb6dd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.638038 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.638068 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.741698 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxslj\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-kube-api-access-rxslj\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.741812 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.741878 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.741921 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.741997 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69c44af2-cb75-43ba-9551-98cf81bcb6dd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.742033 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.742067 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.742179 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.743876 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69c44af2-cb75-43ba-9551-98cf81bcb6dd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.749936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.754072 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.754116 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/90292c5c4a512e0007daec4c81258d0de0aa3d24f4fd9666135a0842aac2ff66/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.754826 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.755058 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.756697 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.757184 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.762539 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxslj\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-kube-api-access-rxslj\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.836140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"prometheus-metric-storage-0\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.918880 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.942528 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.960697 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config-secret\") pod \"cc3702a4-9b2f-46b4-95d3-c0c291132199\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") "
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.960797 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config\") pod \"cc3702a4-9b2f-46b4-95d3-c0c291132199\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") "
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.960849 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-combined-ca-bundle\") pod \"cc3702a4-9b2f-46b4-95d3-c0c291132199\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") "
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.961061 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrgmp\" (UniqueName: \"kubernetes.io/projected/cc3702a4-9b2f-46b4-95d3-c0c291132199-kube-api-access-nrgmp\") pod \"cc3702a4-9b2f-46b4-95d3-c0c291132199\" (UID: \"cc3702a4-9b2f-46b4-95d3-c0c291132199\") "
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.964720 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194"
Dec 05 18:04:21 crc kubenswrapper[4756]: I1205 18:04:21.970260 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc3702a4-9b2f-46b4-95d3-c0c291132199-kube-api-access-nrgmp" (OuterVolumeSpecName: "kube-api-access-nrgmp") pod "cc3702a4-9b2f-46b4-95d3-c0c291132199" (UID: "cc3702a4-9b2f-46b4-95d3-c0c291132199"). InnerVolumeSpecName "kube-api-access-nrgmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.000362 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "cc3702a4-9b2f-46b4-95d3-c0c291132199" (UID: "cc3702a4-9b2f-46b4-95d3-c0c291132199"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.003548 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc3702a4-9b2f-46b4-95d3-c0c291132199" (UID: "cc3702a4-9b2f-46b4-95d3-c0c291132199"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.035150 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "cc3702a4-9b2f-46b4-95d3-c0c291132199" (UID: "cc3702a4-9b2f-46b4-95d3-c0c291132199"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.063826 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrgmp\" (UniqueName: \"kubernetes.io/projected/cc3702a4-9b2f-46b4-95d3-c0c291132199-kube-api-access-nrgmp\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.063869 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.063882 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cc3702a4-9b2f-46b4-95d3-c0c291132199-openstack-config\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.063894 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc3702a4-9b2f-46b4-95d3-c0c291132199-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.086733 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" path="/var/lib/kubelet/pods/cc3702a4-9b2f-46b4-95d3-c0c291132199/volumes"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.132806 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
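
The csi_attacher.go:380 line above shows MountDevice being skipped because the kubevirt.io.hostpath-provisioner node plugin does not advertise the STAGE_UNSTAGE_VOLUME capability. A sketch of how that capability is queried over the plugin's CSI socket; the socket path here is an assumption:

package main

import (
	"context"
	"fmt"

	csi "github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// Socket path is hypothetical; the kubelet learns it via plugin registration.
	conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	node := csi.NewNodeClient(conn)
	resp, err := node.NodeGetCapabilities(context.TODO(), &csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		panic(err)
	}
	staged := false
	for _, c := range resp.GetCapabilities() {
		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
			staged = true
		}
	}
	// Without this capability, NodeStageVolume is not called and the attacher
	// logs "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...".
	fmt.Println("supports staging:", staged)
}
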
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.273017 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5pxz"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.298108 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.298180 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.376756 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-catalog-content\") pod \"e140f75d-f032-431b-9cd8-3690958a676a\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") "
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.376860 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pcxd\" (UniqueName: \"kubernetes.io/projected/e140f75d-f032-431b-9cd8-3690958a676a-kube-api-access-8pcxd\") pod \"e140f75d-f032-431b-9cd8-3690958a676a\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") "
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.376885 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-utilities\") pod \"e140f75d-f032-431b-9cd8-3690958a676a\" (UID: \"e140f75d-f032-431b-9cd8-3690958a676a\") "
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.378918 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-utilities" (OuterVolumeSpecName: "utilities") pod "e140f75d-f032-431b-9cd8-3690958a676a" (UID: "e140f75d-f032-431b-9cd8-3690958a676a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.392294 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e140f75d-f032-431b-9cd8-3690958a676a-kube-api-access-8pcxd" (OuterVolumeSpecName: "kube-api-access-8pcxd") pod "e140f75d-f032-431b-9cd8-3690958a676a" (UID: "e140f75d-f032-431b-9cd8-3690958a676a"). InnerVolumeSpecName "kube-api-access-8pcxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
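
The patch_prober.go/prober.go pair above records a liveness probe failing with connection refused against http://127.0.0.1:8798/health. A minimal sketch of the equivalent HTTP check; the one-second timeout is an assumption (the real prober honors the probe's configured timeoutSeconds):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs one HTTP liveness check as the log line describes: any
// transport error (e.g. connection refused) or non-2xx/3xx status is a failure.
func probeOnce(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println("Liveness probe status=failure:", err)
	}
}
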
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.414597 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e140f75d-f032-431b-9cd8-3690958a676a" (UID: "e140f75d-f032-431b-9cd8-3690958a676a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.479390 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.479443 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pcxd\" (UniqueName: \"kubernetes.io/projected/e140f75d-f032-431b-9cd8-3690958a676a-kube-api-access-8pcxd\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.479454 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e140f75d-f032-431b-9cd8-3690958a676a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.563124 4756 generic.go:334] "Generic (PLEG): container finished" podID="e140f75d-f032-431b-9cd8-3690958a676a" containerID="a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261" exitCode=0
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.563452 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5pxz" event={"ID":"e140f75d-f032-431b-9cd8-3690958a676a","Type":"ContainerDied","Data":"a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261"}
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.563523 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5pxz" event={"ID":"e140f75d-f032-431b-9cd8-3690958a676a","Type":"ContainerDied","Data":"ea3b6cf08469dcad618391678f3eec25c17561d169b42b88cad6f3f2e2c0673c"}
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.563548 4756 scope.go:117] "RemoveContainer" containerID="a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.563908 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5pxz"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.573448 4756 generic.go:334] "Generic (PLEG): container finished" podID="cc3702a4-9b2f-46b4-95d3-c0c291132199" containerID="4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9" exitCode=137
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.573492 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.576945 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2421f4e9-7c0c-4600-bc22-54c6802397b2","Type":"ContainerStarted","Data":"5370f59c78d7047ee661389ac17cd409dad313852ca962d9ff7105ff89c22595"}
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.581909 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.585646 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="cc3702a4-9b2f-46b4-95d3-c0c291132199" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.593583 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"73bd33f1-e4bf-454f-9a7c-bff0564e02dc","Type":"ContainerStarted","Data":"389e071702a7ccd2f7993ba8dc8f6cc4010a164ed836a46b4a94df90e182ecca"}
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.595955 4756 scope.go:117] "RemoveContainer" containerID="976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.634071 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.203450848 podStartE2EDuration="3.634046846s" podCreationTimestamp="2025-12-05 18:04:19 +0000 UTC" firstStartedPulling="2025-12-05 18:04:21.438158327 +0000 UTC m=+6521.676831839" lastFinishedPulling="2025-12-05 18:04:21.868754325 +0000 UTC m=+6522.107427837" observedRunningTime="2025-12-05 18:04:22.6176926 +0000 UTC m=+6522.856366112" watchObservedRunningTime="2025-12-05 18:04:22.634046846 +0000 UTC m=+6522.872720358"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.640819 4756 scope.go:117] "RemoveContainer" containerID="b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.657519 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.668994 4756 scope.go:117] "RemoveContainer" containerID="a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261"
Dec 05 18:04:22 crc kubenswrapper[4756]: E1205 18:04:22.670401 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261\": container with ID starting with a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261 not found: ID does not exist" containerID="a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.670440 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261"} err="failed to get container status \"a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261\": rpc error: code = NotFound desc = could not find container \"a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261\": container with ID starting with a445691c49a8986c6c438ed48c5a7b3d8c004577083d7c764afc6599f1eea261 not found: ID does not exist"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.670457 4756 scope.go:117] "RemoveContainer" containerID="976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae"
Dec 05 18:04:22 crc kubenswrapper[4756]: E1205 18:04:22.670638 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae\": container with ID starting with 976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae not found: ID does not exist" containerID="976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.670659 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae"} err="failed to get container status \"976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae\": rpc error: code = NotFound desc = could not find container \"976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae\": container with ID starting with 976211b14ea57c749f17cc009f52cfd4ed2611484f19a89976f19572d87e5aae not found: ID does not exist"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.670673 4756 scope.go:117] "RemoveContainer" containerID="b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46"
Dec 05 18:04:22 crc kubenswrapper[4756]: E1205 18:04:22.671248 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46\": container with ID starting with b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46 not found: ID does not exist" containerID="b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.671323 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46"} err="failed to get container status \"b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46\": rpc error: code = NotFound desc = could not find container \"b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46\": container with ID starting with b1160d5fa9af310439f18bcd1119767d28fbac6e6c04333a4af7c8ec62450c46 not found: ID does not exist"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.671352 4756 scope.go:117] "RemoveContainer" containerID="4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.674765 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5pxz"]
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.694152 4756 scope.go:117] "RemoveContainer" containerID="4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9"
Dec 05 18:04:22 crc kubenswrapper[4756]: E1205 18:04:22.697061 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9\": container with ID starting with 4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9 not found: ID does not exist" containerID="4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.697106 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9"} err="failed to get container status \"4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9\": rpc error: code = NotFound desc = could not find container \"4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9\": container with ID starting with 4f64ef23f1ae1c759d4f18d2a0f2a70fc69012a045387303ee110d5bc5a2d5f9 not found: ID does not exist"
Dec 05 18:04:22 crc kubenswrapper[4756]: I1205 18:04:22.715363 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5pxz"]
Dec 05 18:04:23 crc kubenswrapper[4756]: I1205 18:04:23.057730 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jgc5r"]
Dec 05 18:04:23 crc kubenswrapper[4756]: I1205 18:04:23.067189 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-33d3-account-create-update-dd5jn"]
Dec 05 18:04:23 crc kubenswrapper[4756]: I1205 18:04:23.078891 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-33d3-account-create-update-dd5jn"]
Dec 05 18:04:23 crc kubenswrapper[4756]: I1205 18:04:23.093815 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jgc5r"]
Dec 05 18:04:23 crc kubenswrapper[4756]: I1205 18:04:23.314538 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xz5bc"]
Dec 05 18:04:23 crc kubenswrapper[4756]: I1205 18:04:23.604145 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerStarted","Data":"525e3cd09ef49473ac2bd940ef9ea0d449128aa910310cd322c3b791b289cd23"}
Dec 05 18:04:23 crc kubenswrapper[4756]: I1205 18:04:23.615518 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xz5bc" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" containerName="registry-server" containerID="cri-o://6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388" gracePeriod=2
Dec 05 18:04:23 crc kubenswrapper[4756]: I1205 18:04:23.616122 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.067233 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d555bdfb-a12a-48ad-83ca-4c0ab4275377" path="/var/lib/kubelet/pods/d555bdfb-a12a-48ad-83ca-4c0ab4275377/volumes"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.067793 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e140f75d-f032-431b-9cd8-3690958a676a" path="/var/lib/kubelet/pods/e140f75d-f032-431b-9cd8-3690958a676a/volumes"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.068822 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6e6d5d-35ad-4319-b7eb-438aee2bb13c" path="/var/lib/kubelet/pods/fd6e6d5d-35ad-4319-b7eb-438aee2bb13c/volumes"
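
The "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" pairs above are a benign race: by the time the kubelet re-queries a container it has already been removed, so the error just confirms the cleanup is done. A sketch of the tolerant-removal pattern for a CRI-style gRPC error; the remove callback here is a stand-in, not a real CRI client:

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIgnoringNotFound treats gRPC NotFound as success, the same way the
// kubelet logs the error and moves on when a container is already gone.
func removeIgnoringNotFound(remove func(id string) error, id string) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already removed; nothing to do
		}
		return err
	}
	return nil
}

func main() {
	// Stand-in for a CRI RemoveContainer call; always reports NotFound here.
	fake := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	err := removeIgnoringNotFound(fake, "a445691c49a8")
	fmt.Println("err:", err) // err: <nil>
}
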
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.457940 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xz5bc"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.555234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-catalog-content\") pod \"60095cb4-2d1a-4350-892a-701be6b61b87\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") "
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.555306 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktc8w\" (UniqueName: \"kubernetes.io/projected/60095cb4-2d1a-4350-892a-701be6b61b87-kube-api-access-ktc8w\") pod \"60095cb4-2d1a-4350-892a-701be6b61b87\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") "
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.555439 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-utilities\") pod \"60095cb4-2d1a-4350-892a-701be6b61b87\" (UID: \"60095cb4-2d1a-4350-892a-701be6b61b87\") "
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.556649 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-utilities" (OuterVolumeSpecName: "utilities") pod "60095cb4-2d1a-4350-892a-701be6b61b87" (UID: "60095cb4-2d1a-4350-892a-701be6b61b87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.567825 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60095cb4-2d1a-4350-892a-701be6b61b87-kube-api-access-ktc8w" (OuterVolumeSpecName: "kube-api-access-ktc8w") pod "60095cb4-2d1a-4350-892a-701be6b61b87" (UID: "60095cb4-2d1a-4350-892a-701be6b61b87"). InnerVolumeSpecName "kube-api-access-ktc8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.602230 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60095cb4-2d1a-4350-892a-701be6b61b87" (UID: "60095cb4-2d1a-4350-892a-701be6b61b87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.628890 4756 generic.go:334] "Generic (PLEG): container finished" podID="60095cb4-2d1a-4350-892a-701be6b61b87" containerID="6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388" exitCode=0
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.628982 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xz5bc"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.628978 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xz5bc" event={"ID":"60095cb4-2d1a-4350-892a-701be6b61b87","Type":"ContainerDied","Data":"6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388"}
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.629057 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xz5bc" event={"ID":"60095cb4-2d1a-4350-892a-701be6b61b87","Type":"ContainerDied","Data":"039e6aa1d177dfc6caae630faadf684e3ca5bf7542444d4855cf947a890d41e5"}
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.629087 4756 scope.go:117] "RemoveContainer" containerID="6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.653564 4756 scope.go:117] "RemoveContainer" containerID="e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.663132 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.663173 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktc8w\" (UniqueName: \"kubernetes.io/projected/60095cb4-2d1a-4350-892a-701be6b61b87-kube-api-access-ktc8w\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.663183 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60095cb4-2d1a-4350-892a-701be6b61b87-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.670569 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xz5bc"]
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.690844 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xz5bc"]
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.701451 4756 scope.go:117] "RemoveContainer" containerID="3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.720881 4756 scope.go:117] "RemoveContainer" containerID="6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388"
Dec 05 18:04:24 crc kubenswrapper[4756]: E1205 18:04:24.721531 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388\": container with ID starting with 6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388 not found: ID does not exist" containerID="6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.721608 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388"} err="failed to get container status \"6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388\": rpc error: code = NotFound desc = could not find container \"6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388\": container with ID starting with 6b3f205fd780a3373d9974a3c578e30e7840dc90624df5765536ed3c560c5388 not found: ID does not exist"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.721641 4756 scope.go:117] "RemoveContainer" containerID="e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1"
Dec 05 18:04:24 crc kubenswrapper[4756]: E1205 18:04:24.722172 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1\": container with ID starting with e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1 not found: ID does not exist" containerID="e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.722205 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1"} err="failed to get container status \"e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1\": rpc error: code = NotFound desc = could not find container \"e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1\": container with ID starting with e9b65f749b4aa7eb5c9686f23509023b93c9460e86a916a5b2600b50226966a1 not found: ID does not exist"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.722226 4756 scope.go:117] "RemoveContainer" containerID="3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf"
Dec 05 18:04:24 crc kubenswrapper[4756]: E1205 18:04:24.722514 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf\": container with ID starting with 3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf not found: ID does not exist" containerID="3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf"
Dec 05 18:04:24 crc kubenswrapper[4756]: I1205 18:04:24.722559 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf"} err="failed to get container status \"3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf\": rpc error: code = NotFound desc = could not find container \"3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf\": container with ID starting with 3544e0a1a87200632b978742307be9e24b89afb081688903c867aeab550adabf not found: ID does not exist"
Dec 05 18:04:26 crc kubenswrapper[4756]: I1205 18:04:26.065742 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" path="/var/lib/kubelet/pods/60095cb4-2d1a-4350-892a-701be6b61b87/volumes"
Dec 05 18:04:28 crc kubenswrapper[4756]: I1205 18:04:28.683495 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerStarted","Data":"b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a"}
Dec 05 18:04:28 crc kubenswrapper[4756]: I1205 18:04:28.687584 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2421f4e9-7c0c-4600-bc22-54c6802397b2","Type":"ContainerStarted","Data":"f54f4d3f11ed7b9bcd3f279714c3cd5d736fabc7c908645d576ecc6a383aaad8"}
Dec 05 18:04:29 crc kubenswrapper[4756]: I1205 18:04:29.056571 4756 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/placement-db-sync-f5kct"] Dec 05 18:04:29 crc kubenswrapper[4756]: I1205 18:04:29.067566 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-f5kct"] Dec 05 18:04:30 crc kubenswrapper[4756]: I1205 18:04:30.064482 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5204a282-bb58-4118-8f80-4f0f7d2a9028" path="/var/lib/kubelet/pods/5204a282-bb58-4118-8f80-4f0f7d2a9028/volumes" Dec 05 18:04:30 crc kubenswrapper[4756]: I1205 18:04:30.522685 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 18:04:37 crc kubenswrapper[4756]: I1205 18:04:37.809656 4756 generic.go:334] "Generic (PLEG): container finished" podID="2421f4e9-7c0c-4600-bc22-54c6802397b2" containerID="f54f4d3f11ed7b9bcd3f279714c3cd5d736fabc7c908645d576ecc6a383aaad8" exitCode=0 Dec 05 18:04:37 crc kubenswrapper[4756]: I1205 18:04:37.809728 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2421f4e9-7c0c-4600-bc22-54c6802397b2","Type":"ContainerDied","Data":"f54f4d3f11ed7b9bcd3f279714c3cd5d736fabc7c908645d576ecc6a383aaad8"} Dec 05 18:04:38 crc kubenswrapper[4756]: I1205 18:04:38.825785 4756 generic.go:334] "Generic (PLEG): container finished" podID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerID="b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a" exitCode=0 Dec 05 18:04:38 crc kubenswrapper[4756]: I1205 18:04:38.825907 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerDied","Data":"b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a"} Dec 05 18:04:41 crc kubenswrapper[4756]: I1205 18:04:41.853949 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2421f4e9-7c0c-4600-bc22-54c6802397b2","Type":"ContainerStarted","Data":"0a319e410a31939044f567b404225630ff929982457a1f59a2145f8545c10616"} Dec 05 18:04:47 crc kubenswrapper[4756]: I1205 18:04:47.961480 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2421f4e9-7c0c-4600-bc22-54c6802397b2","Type":"ContainerStarted","Data":"6404d9523ec51a97184d205a719e8b3ac92112268d2661806640bd4f91679d13"} Dec 05 18:04:47 crc kubenswrapper[4756]: I1205 18:04:47.962373 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 05 18:04:47 crc kubenswrapper[4756]: I1205 18:04:47.966605 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 05 18:04:47 crc kubenswrapper[4756]: I1205 18:04:47.993410 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.656496833 podStartE2EDuration="27.993376034s" podCreationTimestamp="2025-12-05 18:04:20 +0000 UTC" firstStartedPulling="2025-12-05 18:04:21.939803427 +0000 UTC m=+6522.178476929" lastFinishedPulling="2025-12-05 18:04:41.276682618 +0000 UTC m=+6541.515356130" observedRunningTime="2025-12-05 18:04:47.986303835 +0000 UTC m=+6548.224977367" watchObservedRunningTime="2025-12-05 18:04:47.993376034 +0000 UTC m=+6548.232049576" Dec 05 18:04:49 crc kubenswrapper[4756]: I1205 18:04:49.992555 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerStarted","Data":"cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b"} Dec 05 18:04:52 crc kubenswrapper[4756]: I1205 18:04:52.295698 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:04:52 crc kubenswrapper[4756]: I1205 18:04:52.296121 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:04:52 crc kubenswrapper[4756]: I1205 18:04:52.296187 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 18:04:52 crc kubenswrapper[4756]: I1205 18:04:52.297450 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 18:04:52 crc kubenswrapper[4756]: I1205 18:04:52.297597 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" gracePeriod=600 Dec 05 18:04:54 crc kubenswrapper[4756]: E1205 18:04:54.581963 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:04:55 crc kubenswrapper[4756]: I1205 18:04:55.073970 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerStarted","Data":"fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a"} Dec 05 18:04:55 crc kubenswrapper[4756]: I1205 18:04:55.080045 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" exitCode=0 Dec 05 18:04:55 crc kubenswrapper[4756]: I1205 18:04:55.080118 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"} Dec 05 18:04:55 crc kubenswrapper[4756]: I1205 18:04:55.080219 4756 scope.go:117] "RemoveContainer" containerID="32339976c849fb944e24e4a905b3c0b31eadc92ee27ed11d626318af61e21b3a" 
Dec 05 18:04:55 crc kubenswrapper[4756]: I1205 18:04:55.081803 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:04:55 crc kubenswrapper[4756]: E1205 18:04:55.083112 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:04:57 crc kubenswrapper[4756]: I1205 18:04:57.111165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerStarted","Data":"4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3"}
Dec 05 18:04:57 crc kubenswrapper[4756]: I1205 18:04:57.133836 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:04:57 crc kubenswrapper[4756]: I1205 18:04:57.152218 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.169626619 podStartE2EDuration="37.152184968s" podCreationTimestamp="2025-12-05 18:04:20 +0000 UTC" firstStartedPulling="2025-12-05 18:04:22.669106339 +0000 UTC m=+6522.907779851" lastFinishedPulling="2025-12-05 18:04:56.651664688 +0000 UTC m=+6556.890338200" observedRunningTime="2025-12-05 18:04:57.138502664 +0000 UTC m=+6557.377176166" watchObservedRunningTime="2025-12-05 18:04:57.152184968 +0000 UTC m=+6557.390858550"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.550046 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 05 18:05:05 crc kubenswrapper[4756]: E1205 18:05:05.551168 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" containerName="extract-content"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.551189 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" containerName="extract-content"
Dec 05 18:05:05 crc kubenswrapper[4756]: E1205 18:05:05.551203 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" containerName="extract-utilities"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.551211 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" containerName="extract-utilities"
Dec 05 18:05:05 crc kubenswrapper[4756]: E1205 18:05:05.551242 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e140f75d-f032-431b-9cd8-3690958a676a" containerName="extract-utilities"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.551251 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e140f75d-f032-431b-9cd8-3690958a676a" containerName="extract-utilities"
Dec 05 18:05:05 crc kubenswrapper[4756]: E1205 18:05:05.551264 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e140f75d-f032-431b-9cd8-3690958a676a" containerName="registry-server"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.551271 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e140f75d-f032-431b-9cd8-3690958a676a" containerName="registry-server"
Dec 05 18:05:05 crc kubenswrapper[4756]: E1205 18:05:05.551296 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e140f75d-f032-431b-9cd8-3690958a676a" containerName="extract-content"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.551304 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e140f75d-f032-431b-9cd8-3690958a676a" containerName="extract-content"
Dec 05 18:05:05 crc kubenswrapper[4756]: E1205 18:05:05.551321 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" containerName="registry-server"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.551329 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" containerName="registry-server"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.551607 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="60095cb4-2d1a-4350-892a-701be6b61b87" containerName="registry-server"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.551634 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e140f75d-f032-431b-9cd8-3690958a676a" containerName="registry-server"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.554179 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.557904 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.559607 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.559715 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-run-httpd\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.560034 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-config-data\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.560126 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4plv4\" (UniqueName: \"kubernetes.io/projected/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-kube-api-access-4plv4\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.560171 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.560490 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-log-httpd\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.560647 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-scripts\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.569930 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.570402 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.662936 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.663392 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-run-httpd\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.663588 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-config-data\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.663632 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4plv4\" (UniqueName: \"kubernetes.io/projected/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-kube-api-access-4plv4\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.663667 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.663781 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-log-httpd\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.663850 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-scripts\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.663929 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-run-httpd\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.664241 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-log-httpd\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.669232 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.669248 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-scripts\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.669356 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-config-data\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.672292 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.679754 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4plv4\" (UniqueName: \"kubernetes.io/projected/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-kube-api-access-4plv4\") pod \"ceilometer-0\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " pod="openstack/ceilometer-0"
Dec 05 18:05:05 crc kubenswrapper[4756]: I1205 18:05:05.872849 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 18:05:06 crc kubenswrapper[4756]: I1205 18:05:06.443232 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 18:05:07 crc kubenswrapper[4756]: I1205 18:05:07.134469 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:07 crc kubenswrapper[4756]: I1205 18:05:07.136856 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:07 crc kubenswrapper[4756]: I1205 18:05:07.241374 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerStarted","Data":"a6e8c82980c57b5ad27bbed2ac3cf2356e26b3aa6c3680a8c805dbda942af4b8"}
Dec 05 18:05:07 crc kubenswrapper[4756]: I1205 18:05:07.243529 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:08 crc kubenswrapper[4756]: I1205 18:05:08.050297 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:05:08 crc kubenswrapper[4756]: E1205 18:05:08.051196 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:05:08 crc kubenswrapper[4756]: I1205 18:05:08.955198 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Dec 05 18:05:08 crc kubenswrapper[4756]: I1205 18:05:08.955939 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" containerName="openstackclient" containerID="cri-o://290708e9f9981fcc806436ab7c45c5672be3f19fb9ef6d73fd0ce0e49c2a683a" gracePeriod=2
Dec 05 18:05:08 crc kubenswrapper[4756]: I1205 18:05:08.969930 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.018380 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 05 18:05:09 crc kubenswrapper[4756]: E1205 18:05:09.019002 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" containerName="openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.019024 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" containerName="openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.020088 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" containerName="openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.022723 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.052559 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.069180 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" podUID="59b410c3-f27c-4aba-a63b-28285991c0f8"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.074144 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59b410c3-f27c-4aba-a63b-28285991c0f8-openstack-config\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.074247 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59b410c3-f27c-4aba-a63b-28285991c0f8-openstack-config-secret\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.074280 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhk7\" (UniqueName: \"kubernetes.io/projected/59b410c3-f27c-4aba-a63b-28285991c0f8-kube-api-access-sdhk7\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.074304 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b410c3-f27c-4aba-a63b-28285991c0f8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.178851 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59b410c3-f27c-4aba-a63b-28285991c0f8-openstack-config\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.178969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59b410c3-f27c-4aba-a63b-28285991c0f8-openstack-config-secret\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.179001 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhk7\" (UniqueName: \"kubernetes.io/projected/59b410c3-f27c-4aba-a63b-28285991c0f8-kube-api-access-sdhk7\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.179025 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b410c3-f27c-4aba-a63b-28285991c0f8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.180697 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59b410c3-f27c-4aba-a63b-28285991c0f8-openstack-config\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.189279 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b410c3-f27c-4aba-a63b-28285991c0f8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.211547 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59b410c3-f27c-4aba-a63b-28285991c0f8-openstack-config-secret\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.221054 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhk7\" (UniqueName: \"kubernetes.io/projected/59b410c3-f27c-4aba-a63b-28285991c0f8-kube-api-access-sdhk7\") pod \"openstackclient\" (UID: \"59b410c3-f27c-4aba-a63b-28285991c0f8\") " pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.286454 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerStarted","Data":"54cba42d0632eeaede6fe5fd55ac92a668f0d7393f3f4705fffac3b3ac4b5deb"}
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.353772 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 18:05:09 crc kubenswrapper[4756]: I1205 18:05:09.944964 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.010605 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.010918 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="prometheus" containerID="cri-o://cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b" gracePeriod=600
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.011621 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="thanos-sidecar" containerID="cri-o://4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3" gracePeriod=600
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.011717 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="config-reloader" containerID="cri-o://fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a" gracePeriod=600
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.307114 4756 generic.go:334] "Generic (PLEG): container finished" podID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerID="4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3" exitCode=0
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.307329 4756 generic.go:334] "Generic (PLEG): container finished" podID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerID="cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b" exitCode=0
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.307221 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerDied","Data":"4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3"}
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.307404 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerDied","Data":"cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b"}
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.312111 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"59b410c3-f27c-4aba-a63b-28285991c0f8","Type":"ContainerStarted","Data":"d68196ec381c1dc9627ce851aaf663d7c948cd1d54cba91494e5a6a81c1b4753"}
Dec 05 18:05:10 crc kubenswrapper[4756]: I1205 18:05:10.320190 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerStarted","Data":"d2eaff74ac8a4658ce440b063ee1ffcee9a5f7a4ede2b1d055ed76a8a23c26e4"}
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.141707 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.225751 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config-out\") pod \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.225842 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxslj\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-kube-api-access-rxslj\") pod \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.225950 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-web-config\") pod \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.226016 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config\") pod \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.226109 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-thanos-prometheus-http-client-file\") pod \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.226142 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-tls-assets\") pod \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.226354 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.226404 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69c44af2-cb75-43ba-9551-98cf81bcb6dd-prometheus-metric-storage-rulefiles-0\") pod \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\" (UID: \"69c44af2-cb75-43ba-9551-98cf81bcb6dd\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.228142 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69c44af2-cb75-43ba-9551-98cf81bcb6dd-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "69c44af2-cb75-43ba-9551-98cf81bcb6dd" (UID: "69c44af2-cb75-43ba-9551-98cf81bcb6dd"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.231383 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config-out" (OuterVolumeSpecName: "config-out") pod "69c44af2-cb75-43ba-9551-98cf81bcb6dd" (UID: "69c44af2-cb75-43ba-9551-98cf81bcb6dd"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.232367 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-kube-api-access-rxslj" (OuterVolumeSpecName: "kube-api-access-rxslj") pod "69c44af2-cb75-43ba-9551-98cf81bcb6dd" (UID: "69c44af2-cb75-43ba-9551-98cf81bcb6dd"). InnerVolumeSpecName "kube-api-access-rxslj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.232708 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "69c44af2-cb75-43ba-9551-98cf81bcb6dd" (UID: "69c44af2-cb75-43ba-9551-98cf81bcb6dd"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.241380 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config" (OuterVolumeSpecName: "config") pod "69c44af2-cb75-43ba-9551-98cf81bcb6dd" (UID: "69c44af2-cb75-43ba-9551-98cf81bcb6dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.246532 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "69c44af2-cb75-43ba-9551-98cf81bcb6dd" (UID: "69c44af2-cb75-43ba-9551-98cf81bcb6dd"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.285230 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "69c44af2-cb75-43ba-9551-98cf81bcb6dd" (UID: "69c44af2-cb75-43ba-9551-98cf81bcb6dd"). InnerVolumeSpecName "pvc-4f9ee8de-3300-43ef-887a-8e46472cd642". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.288447 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-web-config" (OuterVolumeSpecName: "web-config") pod "69c44af2-cb75-43ba-9551-98cf81bcb6dd" (UID: "69c44af2-cb75-43ba-9551-98cf81bcb6dd"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.329019 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxslj\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-kube-api-access-rxslj\") on node \"crc\" DevicePath \"\""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.329048 4756 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-web-config\") on node \"crc\" DevicePath \"\""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.329059 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config\") on node \"crc\" DevicePath \"\""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.329069 4756 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/69c44af2-cb75-43ba-9551-98cf81bcb6dd-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.329079 4756 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69c44af2-cb75-43ba-9551-98cf81bcb6dd-tls-assets\") on node \"crc\" DevicePath \"\""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.329136 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") on node \"crc\" "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.329152 4756 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69c44af2-cb75-43ba-9551-98cf81bcb6dd-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.329163 4756 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69c44af2-cb75-43ba-9551-98cf81bcb6dd-config-out\") on node \"crc\" DevicePath \"\""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.343736 4756 generic.go:334] "Generic (PLEG): container finished" podID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" containerID="290708e9f9981fcc806436ab7c45c5672be3f19fb9ef6d73fd0ce0e49c2a683a" exitCode=137
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.343831 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e2b1ab205919ea55d1c6f04abe91c107f8c162fd26203a4eb1a9f11b5bcefb"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.349502 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerStarted","Data":"0c805a4fc0987b3444fa59edecc166d5ea04d88eafc056338b31e9bb7322c175"}
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.354368 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"59b410c3-f27c-4aba-a63b-28285991c0f8","Type":"ContainerStarted","Data":"426970b017d2e2491a6376fb9ca4b8079849544013e9be746cf2cdad9061649b"}
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.364701 4756 generic.go:334] "Generic (PLEG): container finished" podID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerID="fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a" exitCode=0
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.364997 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerDied","Data":"fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a"}
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.365106 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"69c44af2-cb75-43ba-9551-98cf81bcb6dd","Type":"ContainerDied","Data":"525e3cd09ef49473ac2bd940ef9ea0d449128aa910310cd322c3b791b289cd23"}
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.365225 4756 scope.go:117] "RemoveContainer" containerID="4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.365564 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.375191 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.375390 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4f9ee8de-3300-43ef-887a-8e46472cd642" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642") on node "crc"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.382824 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.3828012210000002 podStartE2EDuration="3.382801221s" podCreationTimestamp="2025-12-05 18:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:05:11.375714572 +0000 UTC m=+6571.614388084" watchObservedRunningTime="2025-12-05 18:05:11.382801221 +0000 UTC m=+6571.621474733"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.432236 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") on node \"crc\" DevicePath \"\""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.462626 4756 scope.go:117] "RemoveContainer" containerID="fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.471162 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.496919 4756 scope.go:117] "RemoveContainer" containerID="cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.498140 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" podUID="59b410c3-f27c-4aba-a63b-28285991c0f8"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.515852 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.539814 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.543140 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config\") pod \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.543229 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config-secret\") pod \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.543267 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-combined-ca-bundle\") pod \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.543480 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-662pv\" (UniqueName: \"kubernetes.io/projected/9727cd88-b0ba-4d3c-a639-0ff7d7686194-kube-api-access-662pv\") pod \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\" (UID: \"9727cd88-b0ba-4d3c-a639-0ff7d7686194\") "
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.562780 4756 scope.go:117] "RemoveContainer" containerID="b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.565325 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9727cd88-b0ba-4d3c-a639-0ff7d7686194-kube-api-access-662pv" (OuterVolumeSpecName: "kube-api-access-662pv") pod "9727cd88-b0ba-4d3c-a639-0ff7d7686194" (UID: "9727cd88-b0ba-4d3c-a639-0ff7d7686194"). InnerVolumeSpecName "kube-api-access-662pv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.569011 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:05:11 crc kubenswrapper[4756]: E1205 18:05:11.569783 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="thanos-sidecar"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.569802 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="thanos-sidecar"
Dec 05 18:05:11 crc kubenswrapper[4756]: E1205 18:05:11.569825 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="prometheus"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.569832 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="prometheus"
Dec 05 18:05:11 crc kubenswrapper[4756]: E1205 18:05:11.569859 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="init-config-reloader"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.569865 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="init-config-reloader"
Dec 05 18:05:11 crc kubenswrapper[4756]: E1205 18:05:11.569896 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="config-reloader"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.569903 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="config-reloader"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.570266 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="thanos-sidecar"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.570303 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="config-reloader"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.570335 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" containerName="prometheus"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.573777 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.578941 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.579576 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-lm5dm"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.579966 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.580110 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.580288 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.580520 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.595795 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.607555 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.635736 4756 scope.go:117] "RemoveContainer" containerID="4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3"
Dec 05 18:05:11 crc kubenswrapper[4756]: E1205 18:05:11.637542 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3\": container with ID starting with 4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3 not found: ID does not exist" containerID="4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.637622 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3"} err="failed to get container status \"4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3\": rpc error: code = NotFound desc = could not find container \"4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3\": container with ID starting with 4fe8b4d5770f811a026a98835d5d590a272c24353ef8baf9e2c7b37d3ed6bee3 not found: ID does not exist"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.637649 4756 scope.go:117] "RemoveContainer" containerID="fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a"
Dec 05 18:05:11 crc kubenswrapper[4756]: E1205 18:05:11.638174 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a\": container with ID starting with fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a not found: ID does not exist" containerID="fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.638194 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a"} err="failed to get container status \"fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a\": rpc error: code = NotFound desc = could not find container \"fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a\": container with ID starting with fab3ab1e52e8cf54dc1474ed637c589211ce9d51b9adce744090ef8e60e1c46a not found: ID does not exist"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.638207 4756 scope.go:117] "RemoveContainer" containerID="cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b"
Dec 05 18:05:11 crc kubenswrapper[4756]: E1205 18:05:11.641264 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b\": container with ID starting with cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b not found: ID does not exist" containerID="cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.641304 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b"} err="failed to get container status \"cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b\": rpc error: code = NotFound desc = could not find container \"cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b\": container with ID starting with cebd6d6761495a0b5175c0c13832312100d13cdfcbe83e734a8b61c73331f25b not found: ID does not exist"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.641347 4756 scope.go:117] "RemoveContainer" containerID="b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a"
Dec 05 18:05:11 crc kubenswrapper[4756]: E1205 18:05:11.641953 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a\": container with ID starting with b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a not found: ID does not exist" containerID="b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.642549 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a"} err="failed to get container status \"b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a\": rpc error: code = NotFound desc = could not find container \"b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a\": container with ID starting with b7dae7a63666d7e6d407c53dc86de42b300430af8973f63d8dbe72070ffdc43a not found: ID does not exist"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646719 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646760 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646791 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646857 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646874 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/347732e6-8a29-4364-97a7-b6a44556fe88-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646900 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646931 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-config\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646960 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/347732e6-8a29-4364-97a7-b6a44556fe88-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.646982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/347732e6-8a29-4364-97a7-b6a44556fe88-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.647009 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.647025 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmh97\" (UniqueName: \"kubernetes.io/projected/347732e6-8a29-4364-97a7-b6a44556fe88-kube-api-access-vmh97\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.647090 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-662pv\" (UniqueName: \"kubernetes.io/projected/9727cd88-b0ba-4d3c-a639-0ff7d7686194-kube-api-access-662pv\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.650888 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9727cd88-b0ba-4d3c-a639-0ff7d7686194" (UID: "9727cd88-b0ba-4d3c-a639-0ff7d7686194"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.651355 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9727cd88-b0ba-4d3c-a639-0ff7d7686194" (UID: "9727cd88-b0ba-4d3c-a639-0ff7d7686194"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.674920 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9727cd88-b0ba-4d3c-a639-0ff7d7686194" (UID: "9727cd88-b0ba-4d3c-a639-0ff7d7686194"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.748995 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749061 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/347732e6-8a29-4364-97a7-b6a44556fe88-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749097 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749135 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-config\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749164 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/347732e6-8a29-4364-97a7-b6a44556fe88-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749189 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/347732e6-8a29-4364-97a7-b6a44556fe88-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749211 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749227 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmh97\" (UniqueName: \"kubernetes.io/projected/347732e6-8a29-4364-97a7-b6a44556fe88-kube-api-access-vmh97\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749311 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" 
(UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749329 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749358 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749439 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749451 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.749462 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9727cd88-b0ba-4d3c-a639-0ff7d7686194-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.751033 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/347732e6-8a29-4364-97a7-b6a44556fe88-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.759845 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.759888 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/90292c5c4a512e0007daec4c81258d0de0aa3d24f4fd9666135a0842aac2ff66/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.761511 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.761532 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/347732e6-8a29-4364-97a7-b6a44556fe88-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.765369 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.765481 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-config\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.765871 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.765905 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.766856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/347732e6-8a29-4364-97a7-b6a44556fe88-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.767081 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/347732e6-8a29-4364-97a7-b6a44556fe88-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.768632 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmh97\" (UniqueName: \"kubernetes.io/projected/347732e6-8a29-4364-97a7-b6a44556fe88-kube-api-access-vmh97\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.828680 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9ee8de-3300-43ef-887a-8e46472cd642\") pod \"prometheus-metric-storage-0\" (UID: \"347732e6-8a29-4364-97a7-b6a44556fe88\") " pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:11 crc kubenswrapper[4756]: I1205 18:05:11.922187 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:12 crc kubenswrapper[4756]: I1205 18:05:12.079146 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c44af2-cb75-43ba-9551-98cf81bcb6dd" path="/var/lib/kubelet/pods/69c44af2-cb75-43ba-9551-98cf81bcb6dd/volumes"
Dec 05 18:05:12 crc kubenswrapper[4756]: I1205 18:05:12.080569 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" path="/var/lib/kubelet/pods/9727cd88-b0ba-4d3c-a639-0ff7d7686194/volumes"
Dec 05 18:05:12 crc kubenswrapper[4756]: I1205 18:05:12.385978 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 05 18:05:12 crc kubenswrapper[4756]: I1205 18:05:12.386962 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerStarted","Data":"b32851486fe9beee21ca03d70531e370c222f31f52105e53e4b7f60280c2b971"}
Dec 05 18:05:12 crc kubenswrapper[4756]: I1205 18:05:12.386995 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 05 18:05:12 crc kubenswrapper[4756]: I1205 18:05:12.405689 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9727cd88-b0ba-4d3c-a639-0ff7d7686194" podUID="59b410c3-f27c-4aba-a63b-28285991c0f8"
Dec 05 18:05:12 crc kubenswrapper[4756]: I1205 18:05:12.407288 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.61206321 podStartE2EDuration="7.407263114s" podCreationTimestamp="2025-12-05 18:05:05 +0000 UTC" firstStartedPulling="2025-12-05 18:05:06.442893153 +0000 UTC m=+6566.681566675" lastFinishedPulling="2025-12-05 18:05:11.238093067 +0000 UTC m=+6571.476766579" observedRunningTime="2025-12-05 18:05:12.403252927 +0000 UTC m=+6572.641926449" watchObservedRunningTime="2025-12-05 18:05:12.407263114 +0000 UTC m=+6572.645936626"
Dec 05 18:05:12 crc kubenswrapper[4756]: I1205 18:05:12.443589 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 05 18:05:13 crc kubenswrapper[4756]: I1205 18:05:13.404591 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"347732e6-8a29-4364-97a7-b6a44556fe88","Type":"ContainerStarted","Data":"36d5d92d4aee2728ae880e484c6e6456d3905a115e5d1649679c0ba72cd77e07"}
Dec 05 18:05:15 crc kubenswrapper[4756]: I1205 18:05:15.419073 4756 scope.go:117] "RemoveContainer" containerID="abe730b63172ef2c155656c3a5b6ebdcaec877cac7d0d39f876e3f8bd7a2f1b7"
Dec 05 18:05:15 crc kubenswrapper[4756]: I1205 18:05:15.470352 4756 scope.go:117] "RemoveContainer" containerID="0da5d3bf22c2db9d97173b74dff5b59c93ac8555b85eb8c8f363fc5497d268a1"
Dec 05 18:05:15 crc kubenswrapper[4756]: I1205 18:05:15.507657 4756 scope.go:117] "RemoveContainer" containerID="d863f579dc79bf3abf3955d7a31847ce960064c361e86884a86bb8e159169706"
Dec 05 18:05:15 crc kubenswrapper[4756]: I1205 18:05:15.529926 4756 scope.go:117] "RemoveContainer" containerID="28b1806a8d327e614cf899cc40b4f686fd487104d7ecfd2f8e8e2ae7a832aee6"
Dec 05 18:05:15 crc kubenswrapper[4756]: I1205 18:05:15.693239 4756 scope.go:117] "RemoveContainer" containerID="8563fba2b165b0edb22d1ae88b9c43eb36f9194ef479f98c7b0ae181a6e2902d"
Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.148328 4756 scope.go:117] "RemoveContainer" containerID="9c554b92351ab4f2b3edf692ba55affc575d5bc6cb1888551b5702ee14704bee"
Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.175222 4756 scope.go:117] "RemoveContainer" containerID="3884131a2957aaffd06693e7875800762487a6618a48df9e30deba351f1f3bfd"
Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.813920 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-z2bjx"]
Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.817247 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-z2bjx"
Need to start a new one" pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.824108 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-z2bjx"] Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.870511 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzzp7\" (UniqueName: \"kubernetes.io/projected/0839cf88-fa43-437b-b982-11ceb1c10581-kube-api-access-vzzp7\") pod \"aodh-db-create-z2bjx\" (UID: \"0839cf88-fa43-437b-b982-11ceb1c10581\") " pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.870641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0839cf88-fa43-437b-b982-11ceb1c10581-operator-scripts\") pod \"aodh-db-create-z2bjx\" (UID: \"0839cf88-fa43-437b-b982-11ceb1c10581\") " pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.919841 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-1320-account-create-update-ngrp7"] Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.921300 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.923662 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.933796 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1320-account-create-update-ngrp7"] Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.972641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hft24\" (UniqueName: \"kubernetes.io/projected/b47b3cd5-11a6-414f-822e-941e222f3b44-kube-api-access-hft24\") pod \"aodh-1320-account-create-update-ngrp7\" (UID: \"b47b3cd5-11a6-414f-822e-941e222f3b44\") " pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.972710 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzzp7\" (UniqueName: \"kubernetes.io/projected/0839cf88-fa43-437b-b982-11ceb1c10581-kube-api-access-vzzp7\") pod \"aodh-db-create-z2bjx\" (UID: \"0839cf88-fa43-437b-b982-11ceb1c10581\") " pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.972869 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0839cf88-fa43-437b-b982-11ceb1c10581-operator-scripts\") pod \"aodh-db-create-z2bjx\" (UID: \"0839cf88-fa43-437b-b982-11ceb1c10581\") " pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.972963 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47b3cd5-11a6-414f-822e-941e222f3b44-operator-scripts\") pod \"aodh-1320-account-create-update-ngrp7\" (UID: \"b47b3cd5-11a6-414f-822e-941e222f3b44\") " pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.974245 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0839cf88-fa43-437b-b982-11ceb1c10581-operator-scripts\") pod \"aodh-db-create-z2bjx\" (UID: \"0839cf88-fa43-437b-b982-11ceb1c10581\") " pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:16 crc kubenswrapper[4756]: I1205 18:05:16.998841 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzzp7\" (UniqueName: \"kubernetes.io/projected/0839cf88-fa43-437b-b982-11ceb1c10581-kube-api-access-vzzp7\") pod \"aodh-db-create-z2bjx\" (UID: \"0839cf88-fa43-437b-b982-11ceb1c10581\") " pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.075115 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47b3cd5-11a6-414f-822e-941e222f3b44-operator-scripts\") pod \"aodh-1320-account-create-update-ngrp7\" (UID: \"b47b3cd5-11a6-414f-822e-941e222f3b44\") " pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.075644 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hft24\" (UniqueName: \"kubernetes.io/projected/b47b3cd5-11a6-414f-822e-941e222f3b44-kube-api-access-hft24\") pod \"aodh-1320-account-create-update-ngrp7\" (UID: \"b47b3cd5-11a6-414f-822e-941e222f3b44\") " pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.075818 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47b3cd5-11a6-414f-822e-941e222f3b44-operator-scripts\") pod \"aodh-1320-account-create-update-ngrp7\" (UID: \"b47b3cd5-11a6-414f-822e-941e222f3b44\") " pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.093503 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hft24\" (UniqueName: \"kubernetes.io/projected/b47b3cd5-11a6-414f-822e-941e222f3b44-kube-api-access-hft24\") pod \"aodh-1320-account-create-update-ngrp7\" (UID: \"b47b3cd5-11a6-414f-822e-941e222f3b44\") " pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.149481 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.252765 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.488920 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"347732e6-8a29-4364-97a7-b6a44556fe88","Type":"ContainerStarted","Data":"33de89c98f7358b984e8a9c7524a9bcdea7633d78c8511dd19333676171c632c"} Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.727286 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-z2bjx"] Dec 05 18:05:17 crc kubenswrapper[4756]: W1205 18:05:17.850570 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb47b3cd5_11a6_414f_822e_941e222f3b44.slice/crio-2d749c93ce9d30457065a6a0f197d7b988cf3f681f4878eaa48bc4c0c03a21e6 WatchSource:0}: Error finding container 2d749c93ce9d30457065a6a0f197d7b988cf3f681f4878eaa48bc4c0c03a21e6: Status 404 returned error can't find the container with id 2d749c93ce9d30457065a6a0f197d7b988cf3f681f4878eaa48bc4c0c03a21e6 Dec 05 18:05:17 crc kubenswrapper[4756]: I1205 18:05:17.852823 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1320-account-create-update-ngrp7"] Dec 05 18:05:18 crc kubenswrapper[4756]: I1205 18:05:18.498776 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1320-account-create-update-ngrp7" event={"ID":"b47b3cd5-11a6-414f-822e-941e222f3b44","Type":"ContainerStarted","Data":"5b4c714c78cccf810542a7c208b283ff5de91b2669f18cf66b2445b89188aa48"} Dec 05 18:05:18 crc kubenswrapper[4756]: I1205 18:05:18.499140 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1320-account-create-update-ngrp7" event={"ID":"b47b3cd5-11a6-414f-822e-941e222f3b44","Type":"ContainerStarted","Data":"2d749c93ce9d30457065a6a0f197d7b988cf3f681f4878eaa48bc4c0c03a21e6"} Dec 05 18:05:18 crc kubenswrapper[4756]: I1205 18:05:18.504428 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z2bjx" event={"ID":"0839cf88-fa43-437b-b982-11ceb1c10581","Type":"ContainerStarted","Data":"45f2829d6cc4266cb29ec3188d3be04fae0197efef27e58fa6eeecb130fc71a2"} Dec 05 18:05:18 crc kubenswrapper[4756]: I1205 18:05:18.504466 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z2bjx" event={"ID":"0839cf88-fa43-437b-b982-11ceb1c10581","Type":"ContainerStarted","Data":"672e4b8b1e19ab69c164cee5346abdde50499849c036b6f016aae98cc1d50264"} Dec 05 18:05:18 crc kubenswrapper[4756]: I1205 18:05:18.525102 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-1320-account-create-update-ngrp7" podStartSLOduration=2.525084181 podStartE2EDuration="2.525084181s" podCreationTimestamp="2025-12-05 18:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:05:18.520594071 +0000 UTC m=+6578.759267603" watchObservedRunningTime="2025-12-05 18:05:18.525084181 +0000 UTC m=+6578.763757703" Dec 05 18:05:18 crc kubenswrapper[4756]: I1205 18:05:18.541359 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-z2bjx" podStartSLOduration=2.541340174 podStartE2EDuration="2.541340174s" podCreationTimestamp="2025-12-05 18:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:05:18.53555523 
Dec 05 18:05:20 crc kubenswrapper[4756]: I1205 18:05:20.530464 4756 generic.go:334] "Generic (PLEG): container finished" podID="b47b3cd5-11a6-414f-822e-941e222f3b44" containerID="5b4c714c78cccf810542a7c208b283ff5de91b2669f18cf66b2445b89188aa48" exitCode=0
Dec 05 18:05:20 crc kubenswrapper[4756]: I1205 18:05:20.530546 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1320-account-create-update-ngrp7" event={"ID":"b47b3cd5-11a6-414f-822e-941e222f3b44","Type":"ContainerDied","Data":"5b4c714c78cccf810542a7c208b283ff5de91b2669f18cf66b2445b89188aa48"}
Dec 05 18:05:20 crc kubenswrapper[4756]: I1205 18:05:20.535170 4756 generic.go:334] "Generic (PLEG): container finished" podID="0839cf88-fa43-437b-b982-11ceb1c10581" containerID="45f2829d6cc4266cb29ec3188d3be04fae0197efef27e58fa6eeecb130fc71a2" exitCode=0
Dec 05 18:05:20 crc kubenswrapper[4756]: I1205 18:05:20.535226 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z2bjx" event={"ID":"0839cf88-fa43-437b-b982-11ceb1c10581","Type":"ContainerDied","Data":"45f2829d6cc4266cb29ec3188d3be04fae0197efef27e58fa6eeecb130fc71a2"}
Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.073143 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:05:22 crc kubenswrapper[4756]: E1205 18:05:22.074115 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.136465 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-z2bjx"
Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.144790 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-1320-account-create-update-ngrp7"
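The machine-config-daemon errors show a restart back-off already saturated at 5m0s: kubelet doubles the CrashLoopBackOff delay per failed restart from a 10s base up to a 5-minute cap. An illustrative sketch of that schedule; the base and cap match kubelet defaults, but the helper itself is not kubelet code:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay doubles the restart back-off per failed restart, capped at
// five minutes, which is the schedule behind "back-off 5m0s restarting
// failed container" above.
func crashLoopDelay(restarts int) time.Duration {
	const (
		base     = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	d := base
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for _, r := range []int{0, 1, 2, 3, 4, 5} {
		fmt.Printf("restart %d -> wait %v\n", r, crashLoopDelay(r))
	}
	// 10s, 20s, 40s, 1m20s, 2m40s, 5m0s; from then on every attempt waits
	// the full 5m0s, as in the log.
}
```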
Need to start a new one" pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.204234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47b3cd5-11a6-414f-822e-941e222f3b44-operator-scripts\") pod \"b47b3cd5-11a6-414f-822e-941e222f3b44\" (UID: \"b47b3cd5-11a6-414f-822e-941e222f3b44\") " Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.204450 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0839cf88-fa43-437b-b982-11ceb1c10581-operator-scripts\") pod \"0839cf88-fa43-437b-b982-11ceb1c10581\" (UID: \"0839cf88-fa43-437b-b982-11ceb1c10581\") " Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.204529 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hft24\" (UniqueName: \"kubernetes.io/projected/b47b3cd5-11a6-414f-822e-941e222f3b44-kube-api-access-hft24\") pod \"b47b3cd5-11a6-414f-822e-941e222f3b44\" (UID: \"b47b3cd5-11a6-414f-822e-941e222f3b44\") " Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.204629 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzzp7\" (UniqueName: \"kubernetes.io/projected/0839cf88-fa43-437b-b982-11ceb1c10581-kube-api-access-vzzp7\") pod \"0839cf88-fa43-437b-b982-11ceb1c10581\" (UID: \"0839cf88-fa43-437b-b982-11ceb1c10581\") " Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.206564 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0839cf88-fa43-437b-b982-11ceb1c10581-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0839cf88-fa43-437b-b982-11ceb1c10581" (UID: "0839cf88-fa43-437b-b982-11ceb1c10581"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.206587 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b47b3cd5-11a6-414f-822e-941e222f3b44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b47b3cd5-11a6-414f-822e-941e222f3b44" (UID: "b47b3cd5-11a6-414f-822e-941e222f3b44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.210925 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b47b3cd5-11a6-414f-822e-941e222f3b44-kube-api-access-hft24" (OuterVolumeSpecName: "kube-api-access-hft24") pod "b47b3cd5-11a6-414f-822e-941e222f3b44" (UID: "b47b3cd5-11a6-414f-822e-941e222f3b44"). InnerVolumeSpecName "kube-api-access-hft24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.212001 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0839cf88-fa43-437b-b982-11ceb1c10581-kube-api-access-vzzp7" (OuterVolumeSpecName: "kube-api-access-vzzp7") pod "0839cf88-fa43-437b-b982-11ceb1c10581" (UID: "0839cf88-fa43-437b-b982-11ceb1c10581"). InnerVolumeSpecName "kube-api-access-vzzp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.307623 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0839cf88-fa43-437b-b982-11ceb1c10581-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.307663 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hft24\" (UniqueName: \"kubernetes.io/projected/b47b3cd5-11a6-414f-822e-941e222f3b44-kube-api-access-hft24\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.307679 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzzp7\" (UniqueName: \"kubernetes.io/projected/0839cf88-fa43-437b-b982-11ceb1c10581-kube-api-access-vzzp7\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.307694 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47b3cd5-11a6-414f-822e-941e222f3b44-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.562731 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-1320-account-create-update-ngrp7" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.562741 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1320-account-create-update-ngrp7" event={"ID":"b47b3cd5-11a6-414f-822e-941e222f3b44","Type":"ContainerDied","Data":"2d749c93ce9d30457065a6a0f197d7b988cf3f681f4878eaa48bc4c0c03a21e6"} Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.562805 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d749c93ce9d30457065a6a0f197d7b988cf3f681f4878eaa48bc4c0c03a21e6" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.566559 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z2bjx" event={"ID":"0839cf88-fa43-437b-b982-11ceb1c10581","Type":"ContainerDied","Data":"672e4b8b1e19ab69c164cee5346abdde50499849c036b6f016aae98cc1d50264"} Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.566605 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="672e4b8b1e19ab69c164cee5346abdde50499849c036b6f016aae98cc1d50264" Dec 05 18:05:22 crc kubenswrapper[4756]: I1205 18:05:22.566635 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-z2bjx" Dec 05 18:05:24 crc kubenswrapper[4756]: I1205 18:05:24.585079 4756 generic.go:334] "Generic (PLEG): container finished" podID="347732e6-8a29-4364-97a7-b6a44556fe88" containerID="33de89c98f7358b984e8a9c7524a9bcdea7633d78c8511dd19333676171c632c" exitCode=0 Dec 05 18:05:24 crc kubenswrapper[4756]: I1205 18:05:24.585149 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"347732e6-8a29-4364-97a7-b6a44556fe88","Type":"ContainerDied","Data":"33de89c98f7358b984e8a9c7524a9bcdea7633d78c8511dd19333676171c632c"} Dec 05 18:05:25 crc kubenswrapper[4756]: I1205 18:05:25.598115 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"347732e6-8a29-4364-97a7-b6a44556fe88","Type":"ContainerStarted","Data":"9cfc988bf4913d0c9d75606ec68757d0d126ac76b13e970324632b0256c99a6b"} Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.081145 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-lncvr"] Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.093621 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-t2z9m"] Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.105503 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-lncvr"] Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.115593 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-t2z9m"] Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.200943 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-k9mdr"] Dec 05 18:05:27 crc kubenswrapper[4756]: E1205 18:05:27.201658 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0839cf88-fa43-437b-b982-11ceb1c10581" containerName="mariadb-database-create" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.201684 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0839cf88-fa43-437b-b982-11ceb1c10581" containerName="mariadb-database-create" Dec 05 18:05:27 crc kubenswrapper[4756]: E1205 18:05:27.201714 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47b3cd5-11a6-414f-822e-941e222f3b44" containerName="mariadb-account-create-update" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.201724 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47b3cd5-11a6-414f-822e-941e222f3b44" containerName="mariadb-account-create-update" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.202006 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0839cf88-fa43-437b-b982-11ceb1c10581" containerName="mariadb-database-create" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.202041 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b47b3cd5-11a6-414f-822e-941e222f3b44" containerName="mariadb-account-create-update" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.203040 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.205915 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.205940 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.205949 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-87z99" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.206182 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.232927 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-k9mdr"] Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.236943 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-combined-ca-bundle\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.237057 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-config-data\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.237430 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-scripts\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.237528 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6g8h\" (UniqueName: \"kubernetes.io/projected/fbbe16c2-80cb-411f-b3b9-fd45a440c956-kube-api-access-t6g8h\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.339465 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-scripts\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.339778 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6g8h\" (UniqueName: \"kubernetes.io/projected/fbbe16c2-80cb-411f-b3b9-fd45a440c956-kube-api-access-t6g8h\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.339899 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-combined-ca-bundle\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc 
kubenswrapper[4756]: I1205 18:05:27.340000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-config-data\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.429092 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-scripts\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.430591 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-combined-ca-bundle\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.431432 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-config-data\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.431537 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6g8h\" (UniqueName: \"kubernetes.io/projected/fbbe16c2-80cb-411f-b3b9-fd45a440c956-kube-api-access-t6g8h\") pod \"aodh-db-sync-k9mdr\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:27 crc kubenswrapper[4756]: I1205 18:05:27.530229 4756 util.go:30] "No sandbox for pod can be found. 
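Before admitting aodh-db-sync-k9mdr, the CPU and memory managers drop per-container assignments left by the just-deleted create jobs (the RemoveStaleState / "Deleted CPUSet assignment" pairs above). A toy Go sketch of stale-state cleanup keyed by podUID and container name; all names and the assignment values are illustrative:

```go
package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops resource-manager assignments for containers whose
// pods are no longer active, a toy version of cpu_manager's RemoveStaleState.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(assignments, k) // the "Deleted CPUSet assignment" step
		}
	}
}

func main() {
	assignments := map[key]string{
		{"0839cf88-fa43-437b-b982-11ceb1c10581", "mariadb-database-create"}:       "cpus 0-1",
		{"b47b3cd5-11a6-414f-822e-941e222f3b44", "mariadb-account-create-update"}: "cpus 2-3",
	}
	removeStaleState(assignments, map[string]bool{ /* neither pod is active */ })
}
```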
Need to start a new one" pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.042141 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1d5a-account-create-update-fpm7x"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.088042 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbd2a51b-4e22-45f7-b95a-5f95837d5a11" path="/var/lib/kubelet/pods/bbd2a51b-4e22-45f7-b95a-5f95837d5a11/volumes" Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.089131 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ead6de-7881-4fe3-810e-6812c8a0669d" path="/var/lib/kubelet/pods/f7ead6de-7881-4fe3-810e-6812c8a0669d/volumes" Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.089769 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-bv2sm"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.089813 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4a2e-account-create-update-hd9cg"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.105501 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1d5a-account-create-update-fpm7x"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.155536 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4a2e-account-create-update-hd9cg"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.196299 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bv2sm"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.240783 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-35b0-account-create-update-rjv9j"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.273634 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-35b0-account-create-update-rjv9j"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.286629 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-k9mdr"] Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.632023 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-k9mdr" event={"ID":"fbbe16c2-80cb-411f-b3b9-fd45a440c956","Type":"ContainerStarted","Data":"328632c24da905affb9bb3a086e27b116224644d193d94be21a9aac984f5c839"} Dec 05 18:05:28 crc kubenswrapper[4756]: I1205 18:05:28.635115 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"347732e6-8a29-4364-97a7-b6a44556fe88","Type":"ContainerStarted","Data":"56fd43fe5f97a8057f71b0011cf6b90c43c8b234240552c6ce86c3c3f092fe6d"} Dec 05 18:05:29 crc kubenswrapper[4756]: I1205 18:05:29.661730 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"347732e6-8a29-4364-97a7-b6a44556fe88","Type":"ContainerStarted","Data":"3024f01abbd8b03d4083fe89f35d6ab391f2efb3128a96119d5591c8908578a0"} Dec 05 18:05:29 crc kubenswrapper[4756]: I1205 18:05:29.696591 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.696575595 podStartE2EDuration="18.696575595s" podCreationTimestamp="2025-12-05 18:05:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:05:29.686682551 +0000 UTC m=+6589.925356073" 
Dec 05 18:05:30 crc kubenswrapper[4756]: I1205 18:05:30.076609 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e04f6fd-6dd5-4eb3-8af2-ba12964173fc" path="/var/lib/kubelet/pods/4e04f6fd-6dd5-4eb3-8af2-ba12964173fc/volumes"
Dec 05 18:05:30 crc kubenswrapper[4756]: I1205 18:05:30.077879 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75698abd-8581-4a45-b861-f397a191a09e" path="/var/lib/kubelet/pods/75698abd-8581-4a45-b861-f397a191a09e/volumes"
Dec 05 18:05:30 crc kubenswrapper[4756]: I1205 18:05:30.078911 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7790cbf8-4ff7-4061-a33a-8468d51abe63" path="/var/lib/kubelet/pods/7790cbf8-4ff7-4061-a33a-8468d51abe63/volumes"
Dec 05 18:05:30 crc kubenswrapper[4756]: I1205 18:05:30.080234 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8cfbbe8-ec99-4a0c-a960-7faf5c545935" path="/var/lib/kubelet/pods/e8cfbbe8-ec99-4a0c-a960-7faf5c545935/volumes"
Dec 05 18:05:31 crc kubenswrapper[4756]: I1205 18:05:31.922902 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 05 18:05:33 crc kubenswrapper[4756]: I1205 18:05:33.705526 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-k9mdr" event={"ID":"fbbe16c2-80cb-411f-b3b9-fd45a440c956","Type":"ContainerStarted","Data":"95f65d4a2e40626845a8ce515e74c1ca1f5bd50def7427a4828f33a12a2a29d3"}
Dec 05 18:05:33 crc kubenswrapper[4756]: I1205 18:05:33.725166 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-k9mdr" podStartSLOduration=2.316176234 podStartE2EDuration="6.725150892s" podCreationTimestamp="2025-12-05 18:05:27 +0000 UTC" firstStartedPulling="2025-12-05 18:05:28.161344449 +0000 UTC m=+6588.400017961" lastFinishedPulling="2025-12-05 18:05:32.570319107 +0000 UTC m=+6592.808992619" observedRunningTime="2025-12-05 18:05:33.719492732 +0000 UTC m=+6593.958166274" watchObservedRunningTime="2025-12-05 18:05:33.725150892 +0000 UTC m=+6593.963824394"
Dec 05 18:05:35 crc kubenswrapper[4756]: I1205 18:05:35.731242 4756 generic.go:334] "Generic (PLEG): container finished" podID="fbbe16c2-80cb-411f-b3b9-fd45a440c956" containerID="95f65d4a2e40626845a8ce515e74c1ca1f5bd50def7427a4828f33a12a2a29d3" exitCode=0
Dec 05 18:05:35 crc kubenswrapper[4756]: I1205 18:05:35.731352 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-k9mdr" event={"ID":"fbbe16c2-80cb-411f-b3b9-fd45a440c956","Type":"ContainerDied","Data":"95f65d4a2e40626845a8ce515e74c1ca1f5bd50def7427a4828f33a12a2a29d3"}
Dec 05 18:05:35 crc kubenswrapper[4756]: I1205 18:05:35.878694 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 05 18:05:36 crc kubenswrapper[4756]: I1205 18:05:36.050364 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:05:36 crc kubenswrapper[4756]: E1205 18:05:36.050740 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
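The E1205 record above is the kubelet declining to restart machine-config-daemon while its CrashLoopBackOff delay runs down. The delay doubles on each failed restart until it reaches a cap, which is the "back-off 5m0s" in the message; the 10s base and 5m cap are the kubelet defaults. A sketch of the doubling, not the kubelet's implementation:

package main

import (
	"fmt"
	"time"
)

// backoffDelays lists the restart delays in the spirit of CrashLoopBackOff:
// the delay doubles after every failed restart until it hits the cap
// (the "back-off 5m0s" seen in the log).
func backoffDelays(base, limit time.Duration, restarts int) []time.Duration {
	delays := make([]time.Duration, 0, restarts)
	d := base
	for i := 0; i < restarts; i++ {
		delays = append(delays, d)
		if d *= 2; d > limit {
			d = limit
		}
	}
	return delays
}

func main() {
	fmt.Println(backoffDelays(10*time.Second, 5*time.Minute, 7))
	// [10s 20s 40s 1m20s 2m40s 5m0s 5m0s]
}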
podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.050782 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cc548"] Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.064534 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cc548"] Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.265943 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.390245 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-scripts\") pod \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.390344 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6g8h\" (UniqueName: \"kubernetes.io/projected/fbbe16c2-80cb-411f-b3b9-fd45a440c956-kube-api-access-t6g8h\") pod \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.390390 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-combined-ca-bundle\") pod \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.390611 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-config-data\") pod \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\" (UID: \"fbbe16c2-80cb-411f-b3b9-fd45a440c956\") " Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.400689 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbbe16c2-80cb-411f-b3b9-fd45a440c956-kube-api-access-t6g8h" (OuterVolumeSpecName: "kube-api-access-t6g8h") pod "fbbe16c2-80cb-411f-b3b9-fd45a440c956" (UID: "fbbe16c2-80cb-411f-b3b9-fd45a440c956"). InnerVolumeSpecName "kube-api-access-t6g8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.400821 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-scripts" (OuterVolumeSpecName: "scripts") pod "fbbe16c2-80cb-411f-b3b9-fd45a440c956" (UID: "fbbe16c2-80cb-411f-b3b9-fd45a440c956"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.444802 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-config-data" (OuterVolumeSpecName: "config-data") pod "fbbe16c2-80cb-411f-b3b9-fd45a440c956" (UID: "fbbe16c2-80cb-411f-b3b9-fd45a440c956"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.445075 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbbe16c2-80cb-411f-b3b9-fd45a440c956" (UID: "fbbe16c2-80cb-411f-b3b9-fd45a440c956"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.494274 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.494326 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.494359 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6g8h\" (UniqueName: \"kubernetes.io/projected/fbbe16c2-80cb-411f-b3b9-fd45a440c956-kube-api-access-t6g8h\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.494373 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbe16c2-80cb-411f-b3b9-fd45a440c956-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.766715 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-k9mdr" event={"ID":"fbbe16c2-80cb-411f-b3b9-fd45a440c956","Type":"ContainerDied","Data":"328632c24da905affb9bb3a086e27b116224644d193d94be21a9aac984f5c839"} Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.766753 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328632c24da905affb9bb3a086e27b116224644d193d94be21a9aac984f5c839" Dec 05 18:05:37 crc kubenswrapper[4756]: I1205 18:05:37.766780 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-k9mdr" Dec 05 18:05:38 crc kubenswrapper[4756]: I1205 18:05:38.061401 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c361f5-fc03-4aa7-94a0-0a3e23537626" path="/var/lib/kubelet/pods/44c361f5-fc03-4aa7-94a0-0a3e23537626/volumes" Dec 05 18:05:39 crc kubenswrapper[4756]: I1205 18:05:39.883186 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 18:05:39 crc kubenswrapper[4756]: I1205 18:05:39.883792 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="73bd33f1-e4bf-454f-9a7c-bff0564e02dc" containerName="kube-state-metrics" containerID="cri-o://389e071702a7ccd2f7993ba8dc8f6cc4010a164ed836a46b4a94df90e182ecca" gracePeriod=30 Dec 05 18:05:40 crc kubenswrapper[4756]: I1205 18:05:40.515265 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="73bd33f1-e4bf-454f-9a7c-bff0564e02dc" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.1.145:8081/readyz\": dial tcp 10.217.1.145:8081: connect: connection refused" Dec 05 18:05:40 crc kubenswrapper[4756]: I1205 18:05:40.822108 4756 generic.go:334] "Generic (PLEG): container finished" podID="73bd33f1-e4bf-454f-9a7c-bff0564e02dc" containerID="389e071702a7ccd2f7993ba8dc8f6cc4010a164ed836a46b4a94df90e182ecca" exitCode=2 Dec 05 18:05:40 crc kubenswrapper[4756]: I1205 18:05:40.822349 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"73bd33f1-e4bf-454f-9a7c-bff0564e02dc","Type":"ContainerDied","Data":"389e071702a7ccd2f7993ba8dc8f6cc4010a164ed836a46b4a94df90e182ecca"} Dec 05 18:05:40 crc kubenswrapper[4756]: I1205 18:05:40.948994 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.079304 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9bjh\" (UniqueName: \"kubernetes.io/projected/73bd33f1-e4bf-454f-9a7c-bff0564e02dc-kube-api-access-f9bjh\") pod \"73bd33f1-e4bf-454f-9a7c-bff0564e02dc\" (UID: \"73bd33f1-e4bf-454f-9a7c-bff0564e02dc\") " Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.087888 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73bd33f1-e4bf-454f-9a7c-bff0564e02dc-kube-api-access-f9bjh" (OuterVolumeSpecName: "kube-api-access-f9bjh") pod "73bd33f1-e4bf-454f-9a7c-bff0564e02dc" (UID: "73bd33f1-e4bf-454f-9a7c-bff0564e02dc"). InnerVolumeSpecName "kube-api-access-f9bjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.181996 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9bjh\" (UniqueName: \"kubernetes.io/projected/73bd33f1-e4bf-454f-9a7c-bff0564e02dc-kube-api-access-f9bjh\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.833031 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"73bd33f1-e4bf-454f-9a7c-bff0564e02dc","Type":"ContainerDied","Data":"e03fce4436860dd992dbc42961763a17494084adb95aa2d25ab6f4cf0959acac"} Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.833309 4756 scope.go:117] "RemoveContainer" containerID="389e071702a7ccd2f7993ba8dc8f6cc4010a164ed836a46b4a94df90e182ecca" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.833142 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.876596 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.901869 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.911538 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 18:05:41 crc kubenswrapper[4756]: E1205 18:05:41.912183 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbbe16c2-80cb-411f-b3b9-fd45a440c956" containerName="aodh-db-sync" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.912200 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbe16c2-80cb-411f-b3b9-fd45a440c956" containerName="aodh-db-sync" Dec 05 18:05:41 crc kubenswrapper[4756]: E1205 18:05:41.912227 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bd33f1-e4bf-454f-9a7c-bff0564e02dc" containerName="kube-state-metrics" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.912234 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bd33f1-e4bf-454f-9a7c-bff0564e02dc" containerName="kube-state-metrics" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.912554 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbbe16c2-80cb-411f-b3b9-fd45a440c956" containerName="aodh-db-sync" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.912589 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bd33f1-e4bf-454f-9a7c-bff0564e02dc" containerName="kube-state-metrics" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.915850 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.918890 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.919463 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.920446 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.922343 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:41 crc kubenswrapper[4756]: I1205 18:05:41.937887 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.006504 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.006848 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="ceilometer-central-agent" containerID="cri-o://54cba42d0632eeaede6fe5fd55ac92a668f0d7393f3f4705fffac3b3ac4b5deb" gracePeriod=30 Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.007433 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="proxy-httpd" containerID="cri-o://b32851486fe9beee21ca03d70531e370c222f31f52105e53e4b7f60280c2b971" gracePeriod=30 Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.007509 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="sg-core" containerID="cri-o://0c805a4fc0987b3444fa59edecc166d5ea04d88eafc056338b31e9bb7322c175" gracePeriod=30 Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.007558 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="ceilometer-notification-agent" containerID="cri-o://d2eaff74ac8a4658ce440b063ee1ffcee9a5f7a4ede2b1d055ed76a8a23c26e4" gracePeriod=30 Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.065625 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73bd33f1-e4bf-454f-9a7c-bff0564e02dc" path="/var/lib/kubelet/pods/73bd33f1-e4bf-454f-9a7c-bff0564e02dc/volumes" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.066534 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.074643 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.084502 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-87z99" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.084865 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.084923 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.110126 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.110242 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.110466 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcqnd\" (UniqueName: \"kubernetes.io/projected/468d379d-4396-4212-8a9b-20e95ec098dd-kube-api-access-rcqnd\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.110720 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.165592 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.213109 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.213188 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-config-data\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.213235 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jjx7\" (UniqueName: \"kubernetes.io/projected/67d16b4f-2886-43a0-b25a-19cd3058505a-kube-api-access-6jjx7\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.213278 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.213300 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.213333 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.213386 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-scripts\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.213409 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcqnd\" (UniqueName: \"kubernetes.io/projected/468d379d-4396-4212-8a9b-20e95ec098dd-kube-api-access-rcqnd\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.219919 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.229926 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.233024 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468d379d-4396-4212-8a9b-20e95ec098dd-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.235170 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcqnd\" (UniqueName: \"kubernetes.io/projected/468d379d-4396-4212-8a9b-20e95ec098dd-kube-api-access-rcqnd\") pod \"kube-state-metrics-0\" (UID: \"468d379d-4396-4212-8a9b-20e95ec098dd\") " pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.247971 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.314935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-config-data\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.315261 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jjx7\" (UniqueName: \"kubernetes.io/projected/67d16b4f-2886-43a0-b25a-19cd3058505a-kube-api-access-6jjx7\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.315305 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.315387 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-scripts\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.326981 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-scripts\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.334869 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.339564 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-config-data\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.343064 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jjx7\" (UniqueName: \"kubernetes.io/projected/67d16b4f-2886-43a0-b25a-19cd3058505a-kube-api-access-6jjx7\") pod \"aodh-0\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.403160 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.765362 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.846742 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"468d379d-4396-4212-8a9b-20e95ec098dd","Type":"ContainerStarted","Data":"7424f74db369dbcc748139aeedb6bbdd5c99c900312a6f5a775b8a87f7cf7503"} Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.850331 4756 generic.go:334] "Generic (PLEG): container finished" podID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerID="b32851486fe9beee21ca03d70531e370c222f31f52105e53e4b7f60280c2b971" exitCode=0 Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.850376 4756 generic.go:334] "Generic (PLEG): container finished" podID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerID="0c805a4fc0987b3444fa59edecc166d5ea04d88eafc056338b31e9bb7322c175" exitCode=2 Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.850406 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerDied","Data":"b32851486fe9beee21ca03d70531e370c222f31f52105e53e4b7f60280c2b971"} Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.850471 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerDied","Data":"0c805a4fc0987b3444fa59edecc166d5ea04d88eafc056338b31e9bb7322c175"} Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.860679 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 18:05:42 crc kubenswrapper[4756]: I1205 18:05:42.926779 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 18:05:42 crc kubenswrapper[4756]: W1205 18:05:42.941915 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67d16b4f_2886_43a0_b25a_19cd3058505a.slice/crio-2406704d31d96830d552b9841b46bd5004133ce5dd4b467e6ca02008bd4e108b WatchSource:0}: Error finding container 2406704d31d96830d552b9841b46bd5004133ce5dd4b467e6ca02008bd4e108b: Status 404 returned error can't find the container with id 2406704d31d96830d552b9841b46bd5004133ce5dd4b467e6ca02008bd4e108b Dec 05 18:05:43 crc kubenswrapper[4756]: I1205 18:05:43.861707 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"67d16b4f-2886-43a0-b25a-19cd3058505a","Type":"ContainerStarted","Data":"2406704d31d96830d552b9841b46bd5004133ce5dd4b467e6ca02008bd4e108b"} Dec 05 18:05:43 crc kubenswrapper[4756]: I1205 18:05:43.864432 4756 generic.go:334] "Generic (PLEG): container finished" podID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerID="d2eaff74ac8a4658ce440b063ee1ffcee9a5f7a4ede2b1d055ed76a8a23c26e4" exitCode=0 Dec 05 18:05:43 crc kubenswrapper[4756]: I1205 18:05:43.864543 4756 generic.go:334] "Generic (PLEG): container finished" podID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerID="54cba42d0632eeaede6fe5fd55ac92a668f0d7393f3f4705fffac3b3ac4b5deb" exitCode=0 Dec 05 18:05:43 crc kubenswrapper[4756]: I1205 18:05:43.865943 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerDied","Data":"d2eaff74ac8a4658ce440b063ee1ffcee9a5f7a4ede2b1d055ed76a8a23c26e4"} Dec 05 18:05:43 crc kubenswrapper[4756]: I1205 18:05:43.866079 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerDied","Data":"54cba42d0632eeaede6fe5fd55ac92a668f0d7393f3f4705fffac3b3ac4b5deb"} Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.488044 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.560610 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-config-data\") pod \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.562281 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-scripts\") pod \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.562511 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4plv4\" (UniqueName: \"kubernetes.io/projected/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-kube-api-access-4plv4\") pod \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.562695 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-sg-core-conf-yaml\") pod \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.562931 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-log-httpd\") pod \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.563514 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-combined-ca-bundle\") pod \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.564266 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-run-httpd\") pod \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\" (UID: \"3b47c6a6-a6a9-4753-89eb-6c34d061cd71\") " Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.564555 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3b47c6a6-a6a9-4753-89eb-6c34d061cd71" (UID: "3b47c6a6-a6a9-4753-89eb-6c34d061cd71"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.564968 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3b47c6a6-a6a9-4753-89eb-6c34d061cd71" (UID: "3b47c6a6-a6a9-4753-89eb-6c34d061cd71"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.571739 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.584096 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.573575 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-kube-api-access-4plv4" (OuterVolumeSpecName: "kube-api-access-4plv4") pod "3b47c6a6-a6a9-4753-89eb-6c34d061cd71" (UID: "3b47c6a6-a6a9-4753-89eb-6c34d061cd71"). InnerVolumeSpecName "kube-api-access-4plv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.582571 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-scripts" (OuterVolumeSpecName: "scripts") pod "3b47c6a6-a6a9-4753-89eb-6c34d061cd71" (UID: "3b47c6a6-a6a9-4753-89eb-6c34d061cd71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.634476 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3b47c6a6-a6a9-4753-89eb-6c34d061cd71" (UID: "3b47c6a6-a6a9-4753-89eb-6c34d061cd71"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.686360 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.686385 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4plv4\" (UniqueName: \"kubernetes.io/projected/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-kube-api-access-4plv4\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.686395 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.691134 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b47c6a6-a6a9-4753-89eb-6c34d061cd71" (UID: "3b47c6a6-a6a9-4753-89eb-6c34d061cd71"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.739396 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-config-data" (OuterVolumeSpecName: "config-data") pod "3b47c6a6-a6a9-4753-89eb-6c34d061cd71" (UID: "3b47c6a6-a6a9-4753-89eb-6c34d061cd71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.788562 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.788593 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b47c6a6-a6a9-4753-89eb-6c34d061cd71-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.878994 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"468d379d-4396-4212-8a9b-20e95ec098dd","Type":"ContainerStarted","Data":"b1e9d792c196f31f0aa860a9a47d3b0d327afdef354475243d9f4d14ba03c8da"} Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.879534 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.881746 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b47c6a6-a6a9-4753-89eb-6c34d061cd71","Type":"ContainerDied","Data":"a6e8c82980c57b5ad27bbed2ac3cf2356e26b3aa6c3680a8c805dbda942af4b8"} Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.881791 4756 scope.go:117] "RemoveContainer" containerID="b32851486fe9beee21ca03d70531e370c222f31f52105e53e4b7f60280c2b971" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.881864 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.896297 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.869734579 podStartE2EDuration="3.896282747s" podCreationTimestamp="2025-12-05 18:05:41 +0000 UTC" firstStartedPulling="2025-12-05 18:05:42.790297651 +0000 UTC m=+6603.028971163" lastFinishedPulling="2025-12-05 18:05:43.816845819 +0000 UTC m=+6604.055519331" observedRunningTime="2025-12-05 18:05:44.891865799 +0000 UTC m=+6605.130539311" watchObservedRunningTime="2025-12-05 18:05:44.896282747 +0000 UTC m=+6605.134956259" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.921268 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.943697 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.958618 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:05:44 crc kubenswrapper[4756]: E1205 18:05:44.974891 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="ceilometer-notification-agent" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.975107 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="ceilometer-notification-agent" Dec 05 18:05:44 crc kubenswrapper[4756]: E1205 18:05:44.975173 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="proxy-httpd" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.975245 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="proxy-httpd" Dec 05 18:05:44 crc kubenswrapper[4756]: E1205 18:05:44.975310 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="ceilometer-central-agent" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.975366 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="ceilometer-central-agent" Dec 05 18:05:44 crc kubenswrapper[4756]: E1205 18:05:44.975470 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="sg-core" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.975524 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="sg-core" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.976665 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="proxy-httpd" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.976755 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="ceilometer-notification-agent" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.976844 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="sg-core" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.976929 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" containerName="ceilometer-central-agent" Dec 05 
18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.983320 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.986267 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.986538 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.987525 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 18:05:44 crc kubenswrapper[4756]: I1205 18:05:44.997363 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.082518 4756 scope.go:117] "RemoveContainer" containerID="0c805a4fc0987b3444fa59edecc166d5ea04d88eafc056338b31e9bb7322c175" Dec 05 18:05:45 crc kubenswrapper[4756]: E1205 18:05:45.095854 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b47c6a6_a6a9_4753_89eb_6c34d061cd71.slice/crio-a6e8c82980c57b5ad27bbed2ac3cf2356e26b3aa6c3680a8c805dbda942af4b8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b47c6a6_a6a9_4753_89eb_6c34d061cd71.slice\": RecentStats: unable to find data in memory cache]" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.097083 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-scripts\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.097130 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.097347 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.097544 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-run-httpd\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.097776 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-log-httpd\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.097846 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8dqs\" (UniqueName: \"kubernetes.io/projected/48ed367f-7f29-48d7-87ab-feef220c7c42-kube-api-access-k8dqs\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.097938 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.097990 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-config-data\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.109826 4756 scope.go:117] "RemoveContainer" containerID="d2eaff74ac8a4658ce440b063ee1ffcee9a5f7a4ede2b1d055ed76a8a23c26e4" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.144019 4756 scope.go:117] "RemoveContainer" containerID="54cba42d0632eeaede6fe5fd55ac92a668f0d7393f3f4705fffac3b3ac4b5deb" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.199569 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.199727 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-config-data\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.199804 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-scripts\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.199826 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.199898 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.199949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-run-httpd\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc 
kubenswrapper[4756]: I1205 18:05:45.200053 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-log-httpd\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.200080 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8dqs\" (UniqueName: \"kubernetes.io/projected/48ed367f-7f29-48d7-87ab-feef220c7c42-kube-api-access-k8dqs\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.200916 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-log-httpd\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.201798 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-run-httpd\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.203469 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.204307 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-config-data\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.204749 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.205604 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.219052 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-scripts\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.219874 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8dqs\" (UniqueName: \"kubernetes.io/projected/48ed367f-7f29-48d7-87ab-feef220c7c42-kube-api-access-k8dqs\") pod \"ceilometer-0\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.300099 4756 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.391951 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 05 18:05:45 crc kubenswrapper[4756]: W1205 18:05:45.882226 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ed367f_7f29_48d7_87ab_feef220c7c42.slice/crio-172cc3abb5ea6fc6cbca358e5a6e480ab7fc080b4e60db24b7daf9ec0988bd39 WatchSource:0}: Error finding container 172cc3abb5ea6fc6cbca358e5a6e480ab7fc080b4e60db24b7daf9ec0988bd39: Status 404 returned error can't find the container with id 172cc3abb5ea6fc6cbca358e5a6e480ab7fc080b4e60db24b7daf9ec0988bd39 Dec 05 18:05:45 crc kubenswrapper[4756]: I1205 18:05:45.888548 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:05:46 crc kubenswrapper[4756]: I1205 18:05:46.063371 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b47c6a6-a6a9-4753-89eb-6c34d061cd71" path="/var/lib/kubelet/pods/3b47c6a6-a6a9-4753-89eb-6c34d061cd71/volumes" Dec 05 18:05:46 crc kubenswrapper[4756]: I1205 18:05:46.799012 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:05:46 crc kubenswrapper[4756]: I1205 18:05:46.904400 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerStarted","Data":"73f5efaeca3a5a2813cb59bf8eef6ced854a1301c39626bdfef0103860f213b5"} Dec 05 18:05:46 crc kubenswrapper[4756]: I1205 18:05:46.904480 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerStarted","Data":"172cc3abb5ea6fc6cbca358e5a6e480ab7fc080b4e60db24b7daf9ec0988bd39"} Dec 05 18:05:46 crc kubenswrapper[4756]: I1205 18:05:46.906472 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"67d16b4f-2886-43a0-b25a-19cd3058505a","Type":"ContainerStarted","Data":"215dd00325594b477041bd7759619891b0d3a777616aa3cba20d894e6663486b"} Dec 05 18:05:48 crc kubenswrapper[4756]: I1205 18:05:48.051366 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:05:48 crc kubenswrapper[4756]: E1205 18:05:48.052733 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:05:50 crc kubenswrapper[4756]: I1205 18:05:50.074034 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f6x8"] Dec 05 18:05:50 crc kubenswrapper[4756]: I1205 18:05:50.074077 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8f6x8"] Dec 05 18:05:54 crc kubenswrapper[4756]: I1205 18:05:54.989764 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-f65kp" podUID="ced54d49-8a17-4bba-a783-c0f69ed32a7a" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.77:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 18:05:55 crc kubenswrapper[4756]: I1205 18:05:55.219926 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b408fd08-c950-4490-800d-4167c055ed5a" path="/var/lib/kubelet/pods/b408fd08-c950-4490-800d-4167c055ed5a/volumes" Dec 05 18:05:55 crc kubenswrapper[4756]: E1205 18:05:55.222336 4756 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.173s" Dec 05 18:05:55 crc kubenswrapper[4756]: I1205 18:05:55.222524 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cqrvr"] Dec 05 18:05:55 crc kubenswrapper[4756]: I1205 18:05:55.222561 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cqrvr"] Dec 05 18:05:55 crc kubenswrapper[4756]: I1205 18:05:55.264543 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3aeab9a-21a1-4186-856a-a04db31f11e7" path="/var/lib/kubelet/pods/e3aeab9a-21a1-4186-856a-a04db31f11e7/volumes" Dec 05 18:05:55 crc kubenswrapper[4756]: I1205 18:05:55.446615 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 18:05:56 crc kubenswrapper[4756]: I1205 18:05:56.017243 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerStarted","Data":"74672667367e1836d91c8e663b1c11f6657b5467c020eecce43cbd69197ae670"} Dec 05 18:05:59 crc kubenswrapper[4756]: E1205 18:05:59.377652 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:3473a5f5c914f9ba397ffc5ea9d8eeedd85d31a3c9244df7457f3c3e74eaefc4" Dec 05 18:05:59 crc kubenswrapper[4756]: E1205 18:05:59.378797 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:aodh-evaluator,Image:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:3473a5f5c914f9ba397ffc5ea9d8eeedd85d31a3c9244df7457f3c3e74eaefc4,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n696h8ch5fbh68fh58fh658h87h549h666h569h7h7fh68chd9h89h7bh585hc7h56fh597h57dh5b4h54dh558h598h697h678h5c8h688h88h5c4h8fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:aodh-evaluator-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jjx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod aodh-0_openstack(67d16b4f-2886-43a0-b25a-19cd3058505a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 18:06:00 crc kubenswrapper[4756]: I1205 18:06:00.087138 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerStarted","Data":"0838f6659f5482031241d722602d546139e285e6d3359f6ecd05be4384350a49"} Dec 05 18:06:02 crc kubenswrapper[4756]: I1205 18:06:02.050449 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:06:02 crc kubenswrapper[4756]: E1205 18:06:02.052152 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:06:10 crc kubenswrapper[4756]: I1205 18:06:10.080996 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-84g29"] Dec 05 18:06:10 crc kubenswrapper[4756]: I1205 18:06:10.084933 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-84g29"] Dec 05 18:06:12 crc kubenswrapper[4756]: I1205 18:06:12.069056 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="78450a2e-9807-4e4c-8575-75926a469387" path="/var/lib/kubelet/pods/78450a2e-9807-4e4c-8575-75926a469387/volumes" Dec 05 18:06:14 crc kubenswrapper[4756]: I1205 18:06:14.052060 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:06:14 crc kubenswrapper[4756]: E1205 18:06:14.052884 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:06:15 crc kubenswrapper[4756]: I1205 18:06:15.278159 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerStarted","Data":"a7490642f7482f1bb0a8f4592053566f15eeaf0d46a2dfaa9647ce93785d5034"} Dec 05 18:06:15 crc kubenswrapper[4756]: I1205 18:06:15.279897 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="ceilometer-central-agent" containerID="cri-o://73f5efaeca3a5a2813cb59bf8eef6ced854a1301c39626bdfef0103860f213b5" gracePeriod=30 Dec 05 18:06:15 crc kubenswrapper[4756]: I1205 18:06:15.280135 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 18:06:15 crc kubenswrapper[4756]: I1205 18:06:15.280652 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="proxy-httpd" containerID="cri-o://a7490642f7482f1bb0a8f4592053566f15eeaf0d46a2dfaa9647ce93785d5034" gracePeriod=30 Dec 05 18:06:15 crc kubenswrapper[4756]: I1205 18:06:15.280828 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="sg-core" containerID="cri-o://0838f6659f5482031241d722602d546139e285e6d3359f6ecd05be4384350a49" gracePeriod=30 Dec 05 18:06:15 crc kubenswrapper[4756]: I1205 18:06:15.280980 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="ceilometer-notification-agent" containerID="cri-o://74672667367e1836d91c8e663b1c11f6657b5467c020eecce43cbd69197ae670" gracePeriod=30 Dec 05 18:06:15 crc kubenswrapper[4756]: I1205 18:06:15.291329 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"67d16b4f-2886-43a0-b25a-19cd3058505a","Type":"ContainerStarted","Data":"22991e8f719cf6c84e09dc9a17c8cfc7d465676f11bfeae7130be1ad607010ef"} Dec 05 18:06:15 crc kubenswrapper[4756]: I1205 18:06:15.306774 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.215047138 podStartE2EDuration="31.306758294s" podCreationTimestamp="2025-12-05 18:05:44 +0000 UTC" firstStartedPulling="2025-12-05 18:05:45.884894184 +0000 UTC m=+6606.123567686" lastFinishedPulling="2025-12-05 18:06:13.97660533 +0000 UTC m=+6634.215278842" observedRunningTime="2025-12-05 18:06:15.303672172 +0000 UTC m=+6635.542345944" watchObservedRunningTime="2025-12-05 18:06:15.306758294 +0000 UTC m=+6635.545431816" 
Dec 05 18:06:15 crc kubenswrapper[4756]: E1205 18:06:15.987890 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ed367f_7f29_48d7_87ab_feef220c7c42.slice/crio-conmon-73f5efaeca3a5a2813cb59bf8eef6ced854a1301c39626bdfef0103860f213b5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ed367f_7f29_48d7_87ab_feef220c7c42.slice/crio-conmon-0838f6659f5482031241d722602d546139e285e6d3359f6ecd05be4384350a49.scope\": RecentStats: unable to find data in memory cache]" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.302291 4756 generic.go:334] "Generic (PLEG): container finished" podID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerID="a7490642f7482f1bb0a8f4592053566f15eeaf0d46a2dfaa9647ce93785d5034" exitCode=0 Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.302349 4756 generic.go:334] "Generic (PLEG): container finished" podID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerID="0838f6659f5482031241d722602d546139e285e6d3359f6ecd05be4384350a49" exitCode=2 Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.302361 4756 generic.go:334] "Generic (PLEG): container finished" podID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerID="73f5efaeca3a5a2813cb59bf8eef6ced854a1301c39626bdfef0103860f213b5" exitCode=0 Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.302384 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerDied","Data":"a7490642f7482f1bb0a8f4592053566f15eeaf0d46a2dfaa9647ce93785d5034"} Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.302504 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerDied","Data":"0838f6659f5482031241d722602d546139e285e6d3359f6ecd05be4384350a49"} Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.302520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerDied","Data":"73f5efaeca3a5a2813cb59bf8eef6ced854a1301c39626bdfef0103860f213b5"} Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.484263 4756 scope.go:117] "RemoveContainer" containerID="2145246b616e84f1d8b93e824c59619d1a9721be5fb28a199ede99666dd321ca" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.517683 4756 scope.go:117] "RemoveContainer" containerID="f9ad32dd323df73b2b5c6ca4c082d12426c8e7a9df13805713d1b3729f2816b7" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.590221 4756 scope.go:117] "RemoveContainer" containerID="539ccc1002f311157024a76e2ad32d96e8bf4c542bd474ba228df54a3e992049" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.653113 4756 scope.go:117] "RemoveContainer" containerID="bc0ffb0f85716f7ace17617226442dbacc2a04251cb570cbba06ba02bd834aaa" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.721132 4756 scope.go:117] "RemoveContainer" containerID="7bc4caa0f637823e9d8a1ae823b3d0cdf78830820a3bc1abcbea9dc8c269d92b" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.749362 4756 scope.go:117] "RemoveContainer" containerID="2096f8d3137aea9e3ac87d65dd4a98d9318d3a8e034372ee465958b51029fcad" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.798122 4756 scope.go:117] "RemoveContainer" 
containerID="e2bd8c4dea1f9017a0a23145fabace2f0947a1c1d65c2709bb7864a2ae6a2c50" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.828512 4756 scope.go:117] "RemoveContainer" containerID="99f940b808d324a18dbae92f7c4a8cebcb8fe7a4aedb4c6c1221e450076618c9" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.852658 4756 scope.go:117] "RemoveContainer" containerID="fd7fff6a580f35fd53cbc8830b129d4787bd28a52bf8cfd666ebe25d217791da" Dec 05 18:06:16 crc kubenswrapper[4756]: I1205 18:06:16.878061 4756 scope.go:117] "RemoveContainer" containerID="0a520722fdae61242c11e7a24284b9cef58f1c5bcedd5c3460b72c55cee0776b" Dec 05 18:06:19 crc kubenswrapper[4756]: I1205 18:06:19.335836 4756 generic.go:334] "Generic (PLEG): container finished" podID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerID="74672667367e1836d91c8e663b1c11f6657b5467c020eecce43cbd69197ae670" exitCode=0 Dec 05 18:06:19 crc kubenswrapper[4756]: I1205 18:06:19.335915 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerDied","Data":"74672667367e1836d91c8e663b1c11f6657b5467c020eecce43cbd69197ae670"} Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.103352 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.278482 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-run-httpd\") pod \"48ed367f-7f29-48d7-87ab-feef220c7c42\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.278858 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8dqs\" (UniqueName: \"kubernetes.io/projected/48ed367f-7f29-48d7-87ab-feef220c7c42-kube-api-access-k8dqs\") pod \"48ed367f-7f29-48d7-87ab-feef220c7c42\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.278902 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-log-httpd\") pod \"48ed367f-7f29-48d7-87ab-feef220c7c42\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.278998 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-ceilometer-tls-certs\") pod \"48ed367f-7f29-48d7-87ab-feef220c7c42\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.279034 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-config-data\") pod \"48ed367f-7f29-48d7-87ab-feef220c7c42\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.279127 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-combined-ca-bundle\") pod \"48ed367f-7f29-48d7-87ab-feef220c7c42\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.279149 4756 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-sg-core-conf-yaml\") pod \"48ed367f-7f29-48d7-87ab-feef220c7c42\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.279195 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-scripts\") pod \"48ed367f-7f29-48d7-87ab-feef220c7c42\" (UID: \"48ed367f-7f29-48d7-87ab-feef220c7c42\") " Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.279409 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "48ed367f-7f29-48d7-87ab-feef220c7c42" (UID: "48ed367f-7f29-48d7-87ab-feef220c7c42"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.280118 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.281783 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "48ed367f-7f29-48d7-87ab-feef220c7c42" (UID: "48ed367f-7f29-48d7-87ab-feef220c7c42"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.285892 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-scripts" (OuterVolumeSpecName: "scripts") pod "48ed367f-7f29-48d7-87ab-feef220c7c42" (UID: "48ed367f-7f29-48d7-87ab-feef220c7c42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.289480 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ed367f-7f29-48d7-87ab-feef220c7c42-kube-api-access-k8dqs" (OuterVolumeSpecName: "kube-api-access-k8dqs") pod "48ed367f-7f29-48d7-87ab-feef220c7c42" (UID: "48ed367f-7f29-48d7-87ab-feef220c7c42"). InnerVolumeSpecName "kube-api-access-k8dqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.320566 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "48ed367f-7f29-48d7-87ab-feef220c7c42" (UID: "48ed367f-7f29-48d7-87ab-feef220c7c42"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.350211 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48ed367f-7f29-48d7-87ab-feef220c7c42","Type":"ContainerDied","Data":"172cc3abb5ea6fc6cbca358e5a6e480ab7fc080b4e60db24b7daf9ec0988bd39"} Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.350260 4756 scope.go:117] "RemoveContainer" containerID="a7490642f7482f1bb0a8f4592053566f15eeaf0d46a2dfaa9647ce93785d5034" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.350476 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.375098 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48ed367f-7f29-48d7-87ab-feef220c7c42" (UID: "48ed367f-7f29-48d7-87ab-feef220c7c42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.382734 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8dqs\" (UniqueName: \"kubernetes.io/projected/48ed367f-7f29-48d7-87ab-feef220c7c42-kube-api-access-k8dqs\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.382789 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48ed367f-7f29-48d7-87ab-feef220c7c42-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.382801 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.382810 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.382820 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.387684 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "48ed367f-7f29-48d7-87ab-feef220c7c42" (UID: "48ed367f-7f29-48d7-87ab-feef220c7c42"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.400497 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-config-data" (OuterVolumeSpecName: "config-data") pod "48ed367f-7f29-48d7-87ab-feef220c7c42" (UID: "48ed367f-7f29-48d7-87ab-feef220c7c42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.485678 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.485713 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ed367f-7f29-48d7-87ab-feef220c7c42-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.699094 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.714884 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.735119 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:06:20 crc kubenswrapper[4756]: E1205 18:06:20.735653 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="proxy-httpd" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.735673 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="proxy-httpd" Dec 05 18:06:20 crc kubenswrapper[4756]: E1205 18:06:20.735689 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="ceilometer-central-agent" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.735698 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="ceilometer-central-agent" Dec 05 18:06:20 crc kubenswrapper[4756]: E1205 18:06:20.735717 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="ceilometer-notification-agent" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.735727 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="ceilometer-notification-agent" Dec 05 18:06:20 crc kubenswrapper[4756]: E1205 18:06:20.735745 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="sg-core" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.735754 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="sg-core" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.736099 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="proxy-httpd" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.736134 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="sg-core" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.736155 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="ceilometer-notification-agent" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.736182 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" containerName="ceilometer-central-agent" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.738633 4756 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.744354 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.744362 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.748951 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.753330 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.894242 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40ae1a02-0032-4316-8fa4-826e47163b78-run-httpd\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.894316 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.894346 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-scripts\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.894406 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.894462 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40ae1a02-0032-4316-8fa4-826e47163b78-log-httpd\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.894484 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-config-data\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.894538 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.894565 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zg46\" (UniqueName: 
\"kubernetes.io/projected/40ae1a02-0032-4316-8fa4-826e47163b78-kube-api-access-5zg46\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.996244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.996300 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zg46\" (UniqueName: \"kubernetes.io/projected/40ae1a02-0032-4316-8fa4-826e47163b78-kube-api-access-5zg46\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.996347 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40ae1a02-0032-4316-8fa4-826e47163b78-run-httpd\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.996386 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.996426 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-scripts\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.996481 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.996516 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40ae1a02-0032-4316-8fa4-826e47163b78-log-httpd\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.996535 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-config-data\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.997961 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40ae1a02-0032-4316-8fa4-826e47163b78-run-httpd\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:20 crc kubenswrapper[4756]: I1205 18:06:20.998192 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/40ae1a02-0032-4316-8fa4-826e47163b78-log-httpd\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.000376 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.000968 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.001963 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-scripts\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.002530 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.003392 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ae1a02-0032-4316-8fa4-826e47163b78-config-data\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.013645 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zg46\" (UniqueName: \"kubernetes.io/projected/40ae1a02-0032-4316-8fa4-826e47163b78-kube-api-access-5zg46\") pod \"ceilometer-0\" (UID: \"40ae1a02-0032-4316-8fa4-826e47163b78\") " pod="openstack/ceilometer-0" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.069395 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.609158 4756 scope.go:117] "RemoveContainer" containerID="0838f6659f5482031241d722602d546139e285e6d3359f6ecd05be4384350a49" Dec 05 18:06:21 crc kubenswrapper[4756]: I1205 18:06:21.920800 4756 scope.go:117] "RemoveContainer" containerID="74672667367e1836d91c8e663b1c11f6657b5467c020eecce43cbd69197ae670" Dec 05 18:06:22 crc kubenswrapper[4756]: I1205 18:06:22.073520 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ed367f-7f29-48d7-87ab-feef220c7c42" path="/var/lib/kubelet/pods/48ed367f-7f29-48d7-87ab-feef220c7c42/volumes" Dec 05 18:06:22 crc kubenswrapper[4756]: I1205 18:06:22.116011 4756 scope.go:117] "RemoveContainer" containerID="73f5efaeca3a5a2813cb59bf8eef6ced854a1301c39626bdfef0103860f213b5" Dec 05 18:06:22 crc kubenswrapper[4756]: E1205 18:06:22.534462 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"aodh-evaluator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/aodh-0" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" Dec 05 18:06:22 crc kubenswrapper[4756]: I1205 18:06:22.534549 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 18:06:22 crc kubenswrapper[4756]: W1205 18:06:22.541551 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40ae1a02_0032_4316_8fa4_826e47163b78.slice/crio-70cb05cbb52712cfcf67b524920553acadc55791e6af3dab4858385cddcbfb34 WatchSource:0}: Error finding container 70cb05cbb52712cfcf67b524920553acadc55791e6af3dab4858385cddcbfb34: Status 404 returned error can't find the container with id 70cb05cbb52712cfcf67b524920553acadc55791e6af3dab4858385cddcbfb34 Dec 05 18:06:23 crc kubenswrapper[4756]: I1205 18:06:23.383752 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"67d16b4f-2886-43a0-b25a-19cd3058505a","Type":"ContainerStarted","Data":"82a7a99ef6cab6e16564236d6637f2d694ee210e8fc1ca93e04045818eb938bd"} Dec 05 18:06:23 crc kubenswrapper[4756]: I1205 18:06:23.384005 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-api" containerID="cri-o://215dd00325594b477041bd7759619891b0d3a777616aa3cba20d894e6663486b" gracePeriod=30 Dec 05 18:06:23 crc kubenswrapper[4756]: I1205 18:06:23.385141 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-listener" containerID="cri-o://82a7a99ef6cab6e16564236d6637f2d694ee210e8fc1ca93e04045818eb938bd" gracePeriod=30 Dec 05 18:06:23 crc kubenswrapper[4756]: I1205 18:06:23.385257 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-notifier" containerID="cri-o://22991e8f719cf6c84e09dc9a17c8cfc7d465676f11bfeae7130be1ad607010ef" gracePeriod=30 Dec 05 18:06:23 crc kubenswrapper[4756]: I1205 18:06:23.386650 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40ae1a02-0032-4316-8fa4-826e47163b78","Type":"ContainerStarted","Data":"70cb05cbb52712cfcf67b524920553acadc55791e6af3dab4858385cddcbfb34"} Dec 05 18:06:24 crc kubenswrapper[4756]: I1205 18:06:24.404438 4756 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40ae1a02-0032-4316-8fa4-826e47163b78","Type":"ContainerStarted","Data":"10d9b33fd3c830f9d205be6e92e68ca8375ebffd60298a61e293fe3c84875d06"} Dec 05 18:06:24 crc kubenswrapper[4756]: I1205 18:06:24.408253 4756 generic.go:334] "Generic (PLEG): container finished" podID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerID="215dd00325594b477041bd7759619891b0d3a777616aa3cba20d894e6663486b" exitCode=0 Dec 05 18:06:24 crc kubenswrapper[4756]: I1205 18:06:24.408289 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"67d16b4f-2886-43a0-b25a-19cd3058505a","Type":"ContainerDied","Data":"215dd00325594b477041bd7759619891b0d3a777616aa3cba20d894e6663486b"} Dec 05 18:06:25 crc kubenswrapper[4756]: I1205 18:06:25.420342 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40ae1a02-0032-4316-8fa4-826e47163b78","Type":"ContainerStarted","Data":"bd10bddf44476d4ae99f7b44d1e4dde5e5e68ada4aa22ef01cc31c4eca965f90"} Dec 05 18:06:27 crc kubenswrapper[4756]: I1205 18:06:27.049701 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:06:27 crc kubenswrapper[4756]: E1205 18:06:27.050347 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:06:29 crc kubenswrapper[4756]: I1205 18:06:29.989623 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kzhn8" podUID="c64b6bd4-0f30-4b95-8bac-8c302e466540" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.50:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 18:06:35 crc kubenswrapper[4756]: I1205 18:06:35.539747 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40ae1a02-0032-4316-8fa4-826e47163b78","Type":"ContainerStarted","Data":"46745cb77bac2b418bd524b9d2684bbaf1d226a19dcecabe3e5254ada9105bb3"} Dec 05 18:06:37 crc kubenswrapper[4756]: I1205 18:06:37.567898 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40ae1a02-0032-4316-8fa4-826e47163b78","Type":"ContainerStarted","Data":"4e81adf3b534d2a7469bcec38d1f5dcd9d1209f0321d0823b3f351db3b37b33e"} Dec 05 18:06:37 crc kubenswrapper[4756]: I1205 18:06:37.568762 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 18:06:37 crc kubenswrapper[4756]: I1205 18:06:37.598628 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.8853096049999998 podStartE2EDuration="17.598609701s" podCreationTimestamp="2025-12-05 18:06:20 +0000 UTC" firstStartedPulling="2025-12-05 18:06:22.544960518 +0000 UTC m=+6642.783634040" lastFinishedPulling="2025-12-05 18:06:36.258260624 +0000 UTC m=+6656.496934136" observedRunningTime="2025-12-05 18:06:37.591543673 +0000 UTC m=+6657.830217225" watchObservedRunningTime="2025-12-05 18:06:37.598609701 +0000 UTC m=+6657.837283203" Dec 05 18:06:41 crc 
kubenswrapper[4756]: I1205 18:06:41.051099 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:06:41 crc kubenswrapper[4756]: E1205 18:06:41.054577 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:06:51 crc kubenswrapper[4756]: I1205 18:06:51.261917 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 18:06:52 crc kubenswrapper[4756]: I1205 18:06:52.050079 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:06:52 crc kubenswrapper[4756]: E1205 18:06:52.051065 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.783185 4756 generic.go:334] "Generic (PLEG): container finished" podID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerID="82a7a99ef6cab6e16564236d6637f2d694ee210e8fc1ca93e04045818eb938bd" exitCode=137 Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.784234 4756 generic.go:334] "Generic (PLEG): container finished" podID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerID="22991e8f719cf6c84e09dc9a17c8cfc7d465676f11bfeae7130be1ad607010ef" exitCode=137 Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.783276 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"67d16b4f-2886-43a0-b25a-19cd3058505a","Type":"ContainerDied","Data":"82a7a99ef6cab6e16564236d6637f2d694ee210e8fc1ca93e04045818eb938bd"} Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.784298 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"67d16b4f-2886-43a0-b25a-19cd3058505a","Type":"ContainerDied","Data":"22991e8f719cf6c84e09dc9a17c8cfc7d465676f11bfeae7130be1ad607010ef"} Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.919637 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.941272 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-scripts\") pod \"67d16b4f-2886-43a0-b25a-19cd3058505a\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.941484 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-config-data\") pod \"67d16b4f-2886-43a0-b25a-19cd3058505a\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.941610 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-combined-ca-bundle\") pod \"67d16b4f-2886-43a0-b25a-19cd3058505a\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.941681 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jjx7\" (UniqueName: \"kubernetes.io/projected/67d16b4f-2886-43a0-b25a-19cd3058505a-kube-api-access-6jjx7\") pod \"67d16b4f-2886-43a0-b25a-19cd3058505a\" (UID: \"67d16b4f-2886-43a0-b25a-19cd3058505a\") " Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.951547 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-scripts" (OuterVolumeSpecName: "scripts") pod "67d16b4f-2886-43a0-b25a-19cd3058505a" (UID: "67d16b4f-2886-43a0-b25a-19cd3058505a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:06:54 crc kubenswrapper[4756]: I1205 18:06:54.951705 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d16b4f-2886-43a0-b25a-19cd3058505a-kube-api-access-6jjx7" (OuterVolumeSpecName: "kube-api-access-6jjx7") pod "67d16b4f-2886-43a0-b25a-19cd3058505a" (UID: "67d16b4f-2886-43a0-b25a-19cd3058505a"). InnerVolumeSpecName "kube-api-access-6jjx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.044363 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jjx7\" (UniqueName: \"kubernetes.io/projected/67d16b4f-2886-43a0-b25a-19cd3058505a-kube-api-access-6jjx7\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.044445 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.053740 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-config-data" (OuterVolumeSpecName: "config-data") pod "67d16b4f-2886-43a0-b25a-19cd3058505a" (UID: "67d16b4f-2886-43a0-b25a-19cd3058505a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.063353 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67d16b4f-2886-43a0-b25a-19cd3058505a" (UID: "67d16b4f-2886-43a0-b25a-19cd3058505a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.146693 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.146731 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d16b4f-2886-43a0-b25a-19cd3058505a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.802133 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"67d16b4f-2886-43a0-b25a-19cd3058505a","Type":"ContainerDied","Data":"2406704d31d96830d552b9841b46bd5004133ce5dd4b467e6ca02008bd4e108b"} Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.802499 4756 scope.go:117] "RemoveContainer" containerID="82a7a99ef6cab6e16564236d6637f2d694ee210e8fc1ca93e04045818eb938bd" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.802228 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.836832 4756 scope.go:117] "RemoveContainer" containerID="22991e8f719cf6c84e09dc9a17c8cfc7d465676f11bfeae7130be1ad607010ef" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.879556 4756 scope.go:117] "RemoveContainer" containerID="215dd00325594b477041bd7759619891b0d3a777616aa3cba20d894e6663486b" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.889754 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.901072 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.963347 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 05 18:06:55 crc kubenswrapper[4756]: E1205 18:06:55.965413 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-api" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.965477 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-api" Dec 05 18:06:55 crc kubenswrapper[4756]: E1205 18:06:55.965549 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-listener" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.965559 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-listener" Dec 05 18:06:55 crc kubenswrapper[4756]: E1205 18:06:55.965633 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-notifier" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.965641 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-notifier" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.966460 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-listener" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.966493 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-notifier" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.966530 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" containerName="aodh-api" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.970100 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.978033 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.978344 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.978658 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.980458 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-87z99" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.980851 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.980914 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.983250 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-public-tls-certs\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.983335 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-config-data\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.983405 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-scripts\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.983581 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.984012 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbbnv\" (UniqueName: \"kubernetes.io/projected/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-kube-api-access-vbbnv\") pod \"aodh-0\" 
(UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:55 crc kubenswrapper[4756]: I1205 18:06:55.984162 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-internal-tls-certs\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.062492 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d16b4f-2886-43a0-b25a-19cd3058505a" path="/var/lib/kubelet/pods/67d16b4f-2886-43a0-b25a-19cd3058505a/volumes" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.085634 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-public-tls-certs\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.085707 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-config-data\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.085763 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-scripts\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.085822 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.085882 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-internal-tls-certs\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.085934 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbbnv\" (UniqueName: \"kubernetes.io/projected/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-kube-api-access-vbbnv\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.090974 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-scripts\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.091035 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.091323 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-internal-tls-certs\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.093367 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-config-data\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.094128 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-public-tls-certs\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.106285 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbbnv\" (UniqueName: \"kubernetes.io/projected/46fbc0c7-e619-4b43-bd9f-1af28c6321b5-kube-api-access-vbbnv\") pod \"aodh-0\" (UID: \"46fbc0c7-e619-4b43-bd9f-1af28c6321b5\") " pod="openstack/aodh-0" Dec 05 18:06:56 crc kubenswrapper[4756]: I1205 18:06:56.290748 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 18:06:57 crc kubenswrapper[4756]: I1205 18:06:57.034584 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 18:06:57 crc kubenswrapper[4756]: W1205 18:06:57.037914 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46fbc0c7_e619_4b43_bd9f_1af28c6321b5.slice/crio-5f70d7dc9e275fd5f4ee6458ad80637f2a1c40067eaf9b2a83254c3058a94b87 WatchSource:0}: Error finding container 5f70d7dc9e275fd5f4ee6458ad80637f2a1c40067eaf9b2a83254c3058a94b87: Status 404 returned error can't find the container with id 5f70d7dc9e275fd5f4ee6458ad80637f2a1c40067eaf9b2a83254c3058a94b87 Dec 05 18:06:57 crc kubenswrapper[4756]: I1205 18:06:57.835878 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46fbc0c7-e619-4b43-bd9f-1af28c6321b5","Type":"ContainerStarted","Data":"5f70d7dc9e275fd5f4ee6458ad80637f2a1c40067eaf9b2a83254c3058a94b87"} Dec 05 18:06:59 crc kubenswrapper[4756]: I1205 18:06:59.862771 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46fbc0c7-e619-4b43-bd9f-1af28c6321b5","Type":"ContainerStarted","Data":"d8455c0f073cdfd71898adde3bea44443b3f10c0a9c45f306f7ad2db9920ed39"} Dec 05 18:07:02 crc kubenswrapper[4756]: I1205 18:07:02.904195 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46fbc0c7-e619-4b43-bd9f-1af28c6321b5","Type":"ContainerStarted","Data":"30c36e1554e380413bc5b82cb0acc483b9f0406688169892f1fcb56bfb435fd6"} Dec 05 18:07:05 crc kubenswrapper[4756]: I1205 18:07:05.938055 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46fbc0c7-e619-4b43-bd9f-1af28c6321b5","Type":"ContainerStarted","Data":"5a009bca3041cd707181b03accafe4cfee7827ffa27fc7e9989ceb1ad949319a"} Dec 05 18:07:06 crc kubenswrapper[4756]: I1205 18:07:06.988011 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46fbc0c7-e619-4b43-bd9f-1af28c6321b5","Type":"ContainerStarted","Data":"7a2ac5f84eca895487e0264c49bdfcda52ff8ddf6f32afa6dafa2069e6333c16"} Dec 05 
Dec 05 18:07:07 crc kubenswrapper[4756]: I1205 18:07:07.055048 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:07:07 crc kubenswrapper[4756]: E1205 18:07:07.055741 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:07:08 crc kubenswrapper[4756]: I1205 18:07:08.035646 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.442048503 podStartE2EDuration="13.035626173s" podCreationTimestamp="2025-12-05 18:06:55 +0000 UTC" firstStartedPulling="2025-12-05 18:06:57.040343344 +0000 UTC m=+6677.279016856" lastFinishedPulling="2025-12-05 18:07:06.633921014 +0000 UTC m=+6686.872594526" observedRunningTime="2025-12-05 18:07:08.029277275 +0000 UTC m=+6688.267950817" watchObservedRunningTime="2025-12-05 18:07:08.035626173 +0000 UTC m=+6688.274299695"
Dec 05 18:07:18 crc kubenswrapper[4756]: I1205 18:07:18.050973 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:07:18 crc kubenswrapper[4756]: E1205 18:07:18.051921 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:07:33 crc kubenswrapper[4756]: I1205 18:07:33.051033 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:07:33 crc kubenswrapper[4756]: E1205 18:07:33.054773 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:07:46 crc kubenswrapper[4756]: I1205 18:07:46.041153 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-gzb66"]
Dec 05 18:07:46 crc kubenswrapper[4756]: I1205 18:07:46.049729 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:07:46 crc kubenswrapper[4756]: E1205 18:07:46.050168 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
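
The pod_startup_latency_tracker entry at 18:07:08 reports podStartSLOduration=3.442048503 against podStartE2EDuration="13.035626173s" for aodh-0. The difference is exactly the image-pull window (firstStartedPulling 18:06:57.04 to lastFinishedPulling 18:07:06.63, about 9.59s), which suggests the SLO figure is the end-to-end start time with pull time excluded. A quick check of that arithmetic using the timestamps copied from the entry; the layout string is an assumption that matches Go's default time formatting as it appears here:

    package main

    import (
        "fmt"
        "time"
    )

    // Recompute the aodh-0 startup numbers from the log entry above, on
    // the reading that podStartSLOduration = E2E duration minus the
    // image-pull window.
    func main() {
        parse := func(s string) time.Time {
            t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-12-05 18:06:55 +0000 UTC")
        pullStart := parse("2025-12-05 18:06:57.040343344 +0000 UTC")
        pullEnd := parse("2025-12-05 18:07:06.633921014 +0000 UTC")
        running := parse("2025-12-05 18:07:08.035626173 +0000 UTC")

        e2e := running.Sub(created)
        slo := e2e - pullEnd.Sub(pullStart)
        fmt.Println("E2E:", e2e) // 13.035626173s, matching podStartE2EDuration
        fmt.Println("SLO:", slo) // ~3.442048503s, matching podStartSLOduration
    }

Consistent with this reading, the two dnsmasq pods below report firstStartedPulling/lastFinishedPulling of "0001-01-01 00:00:00 +0000 UTC" (the zero time, i.e. no pull happened), and for them the SLO and E2E durations are equal.
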
source="api" pods=["openstack/octavia-f6ca-account-create-update-jxn5g"] Dec 05 18:07:46 crc kubenswrapper[4756]: I1205 18:07:46.076301 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-gzb66"] Dec 05 18:07:46 crc kubenswrapper[4756]: I1205 18:07:46.083841 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-f6ca-account-create-update-jxn5g"] Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.095199 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b9a5ca-5ea6-4080-a7bb-24bcca2ff889" path="/var/lib/kubelet/pods/21b9a5ca-5ea6-4080-a7bb-24bcca2ff889/volumes" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.098135 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b297ed-9607-4bd5-9ec1-20125ee78453" path="/var/lib/kubelet/pods/d1b297ed-9607-4bd5-9ec1-20125ee78453/volumes" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.608237 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbd7c5f7-z2bsf"] Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.619862 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.622582 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.624985 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbd7c5f7-z2bsf"] Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.696900 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-nb\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.697001 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-config\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.697029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-sb\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.697637 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-openstack-cell1\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.697772 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-dns-svc\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " 
pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.697866 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl8vl\" (UniqueName: \"kubernetes.io/projected/29b93260-f32d-4fa7-aff4-b88007615c64-kube-api-access-bl8vl\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.799640 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-openstack-cell1\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.799716 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-dns-svc\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.799767 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl8vl\" (UniqueName: \"kubernetes.io/projected/29b93260-f32d-4fa7-aff4-b88007615c64-kube-api-access-bl8vl\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.799890 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-nb\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.799958 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-config\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.799999 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-sb\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.800637 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-dns-svc\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.800815 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-nb\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 
18:07:48.801022 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-config\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.801538 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-openstack-cell1\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.801772 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-sb\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.837037 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl8vl\" (UniqueName: \"kubernetes.io/projected/29b93260-f32d-4fa7-aff4-b88007615c64-kube-api-access-bl8vl\") pod \"dnsmasq-dns-bbd7c5f7-z2bsf\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:48 crc kubenswrapper[4756]: I1205 18:07:48.947998 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:49 crc kubenswrapper[4756]: I1205 18:07:49.480385 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbd7c5f7-z2bsf"] Dec 05 18:07:49 crc kubenswrapper[4756]: I1205 18:07:49.580232 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" event={"ID":"29b93260-f32d-4fa7-aff4-b88007615c64","Type":"ContainerStarted","Data":"658c56779f39569f0e1d0e2d3452d37597c000617e4a39a3e48f5aa15d5cc854"} Dec 05 18:07:50 crc kubenswrapper[4756]: I1205 18:07:50.591238 4756 generic.go:334] "Generic (PLEG): container finished" podID="29b93260-f32d-4fa7-aff4-b88007615c64" containerID="38fbe843d0c07aa44beb7133e26b8614baa83f6ddeb2fbe7a92282de3cbce023" exitCode=0 Dec 05 18:07:50 crc kubenswrapper[4756]: I1205 18:07:50.591337 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" event={"ID":"29b93260-f32d-4fa7-aff4-b88007615c64","Type":"ContainerDied","Data":"38fbe843d0c07aa44beb7133e26b8614baa83f6ddeb2fbe7a92282de3cbce023"} Dec 05 18:07:51 crc kubenswrapper[4756]: I1205 18:07:51.607845 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" event={"ID":"29b93260-f32d-4fa7-aff4-b88007615c64","Type":"ContainerStarted","Data":"af577d50c0b70f187e598851627b6909d927971b6b3154f699f9cab158c8dab4"} Dec 05 18:07:51 crc kubenswrapper[4756]: I1205 18:07:51.608237 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:51 crc kubenswrapper[4756]: I1205 18:07:51.661598 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" podStartSLOduration=3.661562576 podStartE2EDuration="3.661562576s" podCreationTimestamp="2025-12-05 18:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:07:51.636887313 +0000 UTC m=+6731.875560895" watchObservedRunningTime="2025-12-05 18:07:51.661562576 +0000 UTC m=+6731.900236168" Dec 05 18:07:53 crc kubenswrapper[4756]: I1205 18:07:53.051825 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-9wdht"] Dec 05 18:07:53 crc kubenswrapper[4756]: I1205 18:07:53.068640 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-9wdht"] Dec 05 18:07:54 crc kubenswrapper[4756]: I1205 18:07:54.042984 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-dab0-account-create-update-mb5rs"] Dec 05 18:07:54 crc kubenswrapper[4756]: I1205 18:07:54.072084 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e86958a-c21c-450c-ab3d-9c1c6ab884c0" path="/var/lib/kubelet/pods/2e86958a-c21c-450c-ab3d-9c1c6ab884c0/volumes" Dec 05 18:07:54 crc kubenswrapper[4756]: I1205 18:07:54.073641 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-dab0-account-create-update-mb5rs"] Dec 05 18:07:56 crc kubenswrapper[4756]: I1205 18:07:56.063782 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd70a73-ff81-45bc-848a-97f2c4b83c6c" path="/var/lib/kubelet/pods/fbd70a73-ff81-45bc-848a-97f2c4b83c6c/volumes" Dec 05 18:07:58 crc kubenswrapper[4756]: I1205 18:07:58.949734 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.056629 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6f4996c9-dtntv"] Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.056897 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" podUID="d3184588-cae4-4f59-9cbc-fa85211c15ef" containerName="dnsmasq-dns" containerID="cri-o://537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d" gracePeriod=10 Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.158907 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c97695b55-7nkqx"] Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.160673 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.182559 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c97695b55-7nkqx"] Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.347606 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-dns-svc\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.347914 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784sn\" (UniqueName: \"kubernetes.io/projected/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-kube-api-access-784sn\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.347974 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.347998 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-openstack-cell1\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.348075 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-config\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.348186 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.451046 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.451163 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-dns-svc\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.451854 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-784sn\" (UniqueName: \"kubernetes.io/projected/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-kube-api-access-784sn\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.451968 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.452256 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.453040 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.453892 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-dns-svc\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.454292 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-openstack-cell1\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.454601 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-openstack-cell1\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.455037 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-config\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.456534 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-config\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.478775 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784sn\" (UniqueName: 
\"kubernetes.io/projected/a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e-kube-api-access-784sn\") pod \"dnsmasq-dns-7c97695b55-7nkqx\" (UID: \"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e\") " pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.547480 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.626908 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.763159 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-nb\") pod \"d3184588-cae4-4f59-9cbc-fa85211c15ef\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.763305 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-dns-svc\") pod \"d3184588-cae4-4f59-9cbc-fa85211c15ef\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.763361 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-sb\") pod \"d3184588-cae4-4f59-9cbc-fa85211c15ef\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.763490 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd7fp\" (UniqueName: \"kubernetes.io/projected/d3184588-cae4-4f59-9cbc-fa85211c15ef-kube-api-access-rd7fp\") pod \"d3184588-cae4-4f59-9cbc-fa85211c15ef\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.763537 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-config\") pod \"d3184588-cae4-4f59-9cbc-fa85211c15ef\" (UID: \"d3184588-cae4-4f59-9cbc-fa85211c15ef\") " Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.800666 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3184588-cae4-4f59-9cbc-fa85211c15ef-kube-api-access-rd7fp" (OuterVolumeSpecName: "kube-api-access-rd7fp") pod "d3184588-cae4-4f59-9cbc-fa85211c15ef" (UID: "d3184588-cae4-4f59-9cbc-fa85211c15ef"). InnerVolumeSpecName "kube-api-access-rd7fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.814938 4756 generic.go:334] "Generic (PLEG): container finished" podID="d3184588-cae4-4f59-9cbc-fa85211c15ef" containerID="537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d" exitCode=0 Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.815022 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" event={"ID":"d3184588-cae4-4f59-9cbc-fa85211c15ef","Type":"ContainerDied","Data":"537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d"} Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.815050 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" event={"ID":"d3184588-cae4-4f59-9cbc-fa85211c15ef","Type":"ContainerDied","Data":"d5f0ebfac33def1a52a3aed0876ea6f59c0a0e149b6d87f57e82aaa582014cc8"} Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.815067 4756 scope.go:117] "RemoveContainer" containerID="537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.815232 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6f4996c9-dtntv" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.865797 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd7fp\" (UniqueName: \"kubernetes.io/projected/d3184588-cae4-4f59-9cbc-fa85211c15ef-kube-api-access-rd7fp\") on node \"crc\" DevicePath \"\"" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.868931 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3184588-cae4-4f59-9cbc-fa85211c15ef" (UID: "d3184588-cae4-4f59-9cbc-fa85211c15ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.872685 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-config" (OuterVolumeSpecName: "config") pod "d3184588-cae4-4f59-9cbc-fa85211c15ef" (UID: "d3184588-cae4-4f59-9cbc-fa85211c15ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.872842 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3184588-cae4-4f59-9cbc-fa85211c15ef" (UID: "d3184588-cae4-4f59-9cbc-fa85211c15ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.880006 4756 scope.go:117] "RemoveContainer" containerID="a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.884669 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3184588-cae4-4f59-9cbc-fa85211c15ef" (UID: "d3184588-cae4-4f59-9cbc-fa85211c15ef"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.914748 4756 scope.go:117] "RemoveContainer" containerID="537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d" Dec 05 18:07:59 crc kubenswrapper[4756]: E1205 18:07:59.917109 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d\": container with ID starting with 537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d not found: ID does not exist" containerID="537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.917146 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d"} err="failed to get container status \"537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d\": rpc error: code = NotFound desc = could not find container \"537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d\": container with ID starting with 537707e9aa45774427bf93b9a0a3492d4fd58ff52ab08eeb9ed42f51100ea61d not found: ID does not exist" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.917165 4756 scope.go:117] "RemoveContainer" containerID="a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7" Dec 05 18:07:59 crc kubenswrapper[4756]: E1205 18:07:59.919027 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7\": container with ID starting with a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7 not found: ID does not exist" containerID="a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.919047 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7"} err="failed to get container status \"a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7\": rpc error: code = NotFound desc = could not find container \"a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7\": container with ID starting with a3c45982eea9173cb99e2492e2f09a21a9ce2e3b3a4c9b215d542937787c68e7 not found: ID does not exist" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.968892 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.968984 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.969001 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-config\") on node \"crc\" DevicePath \"\"" Dec 05 18:07:59 crc kubenswrapper[4756]: I1205 18:07:59.969011 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3184588-cae4-4f59-9cbc-fa85211c15ef-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Dec 05 18:08:00 crc kubenswrapper[4756]: I1205 18:08:00.138017 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6f4996c9-dtntv"] Dec 05 18:08:00 crc kubenswrapper[4756]: I1205 18:08:00.152700 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b6f4996c9-dtntv"] Dec 05 18:08:00 crc kubenswrapper[4756]: I1205 18:08:00.271396 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c97695b55-7nkqx"] Dec 05 18:08:00 crc kubenswrapper[4756]: I1205 18:08:00.826285 4756 generic.go:334] "Generic (PLEG): container finished" podID="a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e" containerID="44384357f879a9ca6144452c060e8e60c38e4f2451d239a393874d76e8e56e18" exitCode=0 Dec 05 18:08:00 crc kubenswrapper[4756]: I1205 18:08:00.826358 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" event={"ID":"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e","Type":"ContainerDied","Data":"44384357f879a9ca6144452c060e8e60c38e4f2451d239a393874d76e8e56e18"} Dec 05 18:08:00 crc kubenswrapper[4756]: I1205 18:08:00.826729 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" event={"ID":"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e","Type":"ContainerStarted","Data":"b918602752878f944aca0dab76f19a5612d3b46efde67a22aba0adb165ff9fa5"} Dec 05 18:08:01 crc kubenswrapper[4756]: I1205 18:08:01.050025 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:08:01 crc kubenswrapper[4756]: E1205 18:08:01.050305 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:08:01 crc kubenswrapper[4756]: I1205 18:08:01.841893 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" event={"ID":"a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e","Type":"ContainerStarted","Data":"7644a0cc372f44c2ac0476f169a6e3d73c62e5b15f15141b95475ea8b77773bd"} Dec 05 18:08:01 crc kubenswrapper[4756]: I1205 18:08:01.842045 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:08:01 crc kubenswrapper[4756]: I1205 18:08:01.870483 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" podStartSLOduration=2.870458026 podStartE2EDuration="2.870458026s" podCreationTimestamp="2025-12-05 18:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:08:01.865986778 +0000 UTC m=+6742.104660290" watchObservedRunningTime="2025-12-05 18:08:01.870458026 +0000 UTC m=+6742.109131548" Dec 05 18:08:02 crc kubenswrapper[4756]: I1205 18:08:02.068215 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3184588-cae4-4f59-9cbc-fa85211c15ef" path="/var/lib/kubelet/pods/d3184588-cae4-4f59-9cbc-fa85211c15ef/volumes" Dec 05 18:08:09 crc kubenswrapper[4756]: I1205 18:08:09.550371 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-7c97695b55-7nkqx" Dec 05 18:08:09 crc kubenswrapper[4756]: I1205 18:08:09.627084 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbd7c5f7-z2bsf"] Dec 05 18:08:09 crc kubenswrapper[4756]: I1205 18:08:09.627348 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" podUID="29b93260-f32d-4fa7-aff4-b88007615c64" containerName="dnsmasq-dns" containerID="cri-o://af577d50c0b70f187e598851627b6909d927971b6b3154f699f9cab158c8dab4" gracePeriod=10 Dec 05 18:08:09 crc kubenswrapper[4756]: I1205 18:08:09.948732 4756 generic.go:334] "Generic (PLEG): container finished" podID="29b93260-f32d-4fa7-aff4-b88007615c64" containerID="af577d50c0b70f187e598851627b6909d927971b6b3154f699f9cab158c8dab4" exitCode=0 Dec 05 18:08:09 crc kubenswrapper[4756]: I1205 18:08:09.948845 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" event={"ID":"29b93260-f32d-4fa7-aff4-b88007615c64","Type":"ContainerDied","Data":"af577d50c0b70f187e598851627b6909d927971b6b3154f699f9cab158c8dab4"} Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.122669 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.229312 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-openstack-cell1\") pod \"29b93260-f32d-4fa7-aff4-b88007615c64\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.229400 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-dns-svc\") pod \"29b93260-f32d-4fa7-aff4-b88007615c64\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.229559 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl8vl\" (UniqueName: \"kubernetes.io/projected/29b93260-f32d-4fa7-aff4-b88007615c64-kube-api-access-bl8vl\") pod \"29b93260-f32d-4fa7-aff4-b88007615c64\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.229622 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-config\") pod \"29b93260-f32d-4fa7-aff4-b88007615c64\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.229651 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-nb\") pod \"29b93260-f32d-4fa7-aff4-b88007615c64\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.229771 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-sb\") pod \"29b93260-f32d-4fa7-aff4-b88007615c64\" (UID: \"29b93260-f32d-4fa7-aff4-b88007615c64\") " Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.247670 4756 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/29b93260-f32d-4fa7-aff4-b88007615c64-kube-api-access-bl8vl" (OuterVolumeSpecName: "kube-api-access-bl8vl") pod "29b93260-f32d-4fa7-aff4-b88007615c64" (UID: "29b93260-f32d-4fa7-aff4-b88007615c64"). InnerVolumeSpecName "kube-api-access-bl8vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.294518 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29b93260-f32d-4fa7-aff4-b88007615c64" (UID: "29b93260-f32d-4fa7-aff4-b88007615c64"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.302142 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "29b93260-f32d-4fa7-aff4-b88007615c64" (UID: "29b93260-f32d-4fa7-aff4-b88007615c64"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.305932 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-config" (OuterVolumeSpecName: "config") pod "29b93260-f32d-4fa7-aff4-b88007615c64" (UID: "29b93260-f32d-4fa7-aff4-b88007615c64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.307733 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29b93260-f32d-4fa7-aff4-b88007615c64" (UID: "29b93260-f32d-4fa7-aff4-b88007615c64"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.309034 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29b93260-f32d-4fa7-aff4-b88007615c64" (UID: "29b93260-f32d-4fa7-aff4-b88007615c64"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.332178 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.332229 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.332245 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl8vl\" (UniqueName: \"kubernetes.io/projected/29b93260-f32d-4fa7-aff4-b88007615c64-kube-api-access-bl8vl\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.332259 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-config\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.332270 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.332280 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b93260-f32d-4fa7-aff4-b88007615c64-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.966649 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" event={"ID":"29b93260-f32d-4fa7-aff4-b88007615c64","Type":"ContainerDied","Data":"658c56779f39569f0e1d0e2d3452d37597c000617e4a39a3e48f5aa15d5cc854"} Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.966746 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbd7c5f7-z2bsf" Dec 05 18:08:10 crc kubenswrapper[4756]: I1205 18:08:10.967029 4756 scope.go:117] "RemoveContainer" containerID="af577d50c0b70f187e598851627b6909d927971b6b3154f699f9cab158c8dab4" Dec 05 18:08:11 crc kubenswrapper[4756]: I1205 18:08:11.005830 4756 scope.go:117] "RemoveContainer" containerID="38fbe843d0c07aa44beb7133e26b8614baa83f6ddeb2fbe7a92282de3cbce023" Dec 05 18:08:11 crc kubenswrapper[4756]: I1205 18:08:11.027243 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbd7c5f7-z2bsf"] Dec 05 18:08:11 crc kubenswrapper[4756]: I1205 18:08:11.041495 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbd7c5f7-z2bsf"] Dec 05 18:08:12 crc kubenswrapper[4756]: I1205 18:08:12.068916 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b93260-f32d-4fa7-aff4-b88007615c64" path="/var/lib/kubelet/pods/29b93260-f32d-4fa7-aff4-b88007615c64/volumes" Dec 05 18:08:13 crc kubenswrapper[4756]: I1205 18:08:13.051340 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:08:13 crc kubenswrapper[4756]: E1205 18:08:13.051908 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:08:17 crc kubenswrapper[4756]: I1205 18:08:17.312714 4756 scope.go:117] "RemoveContainer" containerID="b972f8c9ac3047a70673317b0b8920e6ec7a4e3b5fda4aa995531b36874923fb" Dec 05 18:08:17 crc kubenswrapper[4756]: I1205 18:08:17.348879 4756 scope.go:117] "RemoveContainer" containerID="4a420e8f95023e1b60e4eb8c84c3186f08ba9d5fbb1df363ffccf87a2f9817ee" Dec 05 18:08:17 crc kubenswrapper[4756]: I1205 18:08:17.421633 4756 scope.go:117] "RemoveContainer" containerID="1af54360917122aebce3a0e235a19758c38c8ac755c1af3946ee5439e66049ff" Dec 05 18:08:17 crc kubenswrapper[4756]: I1205 18:08:17.477993 4756 scope.go:117] "RemoveContainer" containerID="b405674568a18fce3eedd450c0c4e743461d865d46ad238926bae7cf000fc9bb" Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.862494 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"] Dec 05 18:08:19 crc kubenswrapper[4756]: E1205 18:08:19.863640 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3184588-cae4-4f59-9cbc-fa85211c15ef" containerName="dnsmasq-dns" Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.863659 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3184588-cae4-4f59-9cbc-fa85211c15ef" containerName="dnsmasq-dns" Dec 05 18:08:19 crc kubenswrapper[4756]: E1205 18:08:19.863682 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b93260-f32d-4fa7-aff4-b88007615c64" containerName="init" Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.863690 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b93260-f32d-4fa7-aff4-b88007615c64" containerName="init" Dec 05 18:08:19 crc kubenswrapper[4756]: E1205 18:08:19.863713 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b93260-f32d-4fa7-aff4-b88007615c64" containerName="dnsmasq-dns" Dec 
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.863722 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b93260-f32d-4fa7-aff4-b88007615c64" containerName="dnsmasq-dns"
Dec 05 18:08:19 crc kubenswrapper[4756]: E1205 18:08:19.863761 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3184588-cae4-4f59-9cbc-fa85211c15ef" containerName="init"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.863770 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3184588-cae4-4f59-9cbc-fa85211c15ef" containerName="init"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.864063 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3184588-cae4-4f59-9cbc-fa85211c15ef" containerName="dnsmasq-dns"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.864085 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b93260-f32d-4fa7-aff4-b88007615c64" containerName="dnsmasq-dns"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.865007 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.868224 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.868575 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.868697 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.868870 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.922744 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"]
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.997895 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmwj5\" (UniqueName: \"kubernetes.io/projected/5870dc85-1351-4505-b996-5d5b65eee0ee-kube-api-access-mmwj5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.998038 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.998085 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:19 crc kubenswrapper[4756]: I1205 18:08:19.998188 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.099806 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmwj5\" (UniqueName: \"kubernetes.io/projected/5870dc85-1351-4505-b996-5d5b65eee0ee-kube-api-access-mmwj5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.100142 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.100340 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.100488 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.110152 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.120576 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.121895 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.136519 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmwj5\" (UniqueName: \"kubernetes.io/projected/5870dc85-1351-4505-b996-5d5b65eee0ee-kube-api-access-mmwj5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.207512 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"
Dec 05 18:08:20 crc kubenswrapper[4756]: I1205 18:08:20.881743 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4"]
Dec 05 18:08:20 crc kubenswrapper[4756]: W1205 18:08:20.884240 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5870dc85_1351_4505_b996_5d5b65eee0ee.slice/crio-34c7fd29a434498d4cbc5ffdfa232865f05425cc207b4749ed29d753177bc8da WatchSource:0}: Error finding container 34c7fd29a434498d4cbc5ffdfa232865f05425cc207b4749ed29d753177bc8da: Status 404 returned error can't find the container with id 34c7fd29a434498d4cbc5ffdfa232865f05425cc207b4749ed29d753177bc8da
Dec 05 18:08:21 crc kubenswrapper[4756]: I1205 18:08:21.143566 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4" event={"ID":"5870dc85-1351-4505-b996-5d5b65eee0ee","Type":"ContainerStarted","Data":"34c7fd29a434498d4cbc5ffdfa232865f05425cc207b4749ed29d753177bc8da"}
Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.190288 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zlf4b"]
Need to start a new one" pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.199398 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zlf4b"] Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.289449 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-utilities\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.289523 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-catalog-content\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.289564 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795v8\" (UniqueName: \"kubernetes.io/projected/fbe94a6f-7b68-458a-9f21-a3986682a5f5-kube-api-access-795v8\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.391839 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-utilities\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.391922 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-catalog-content\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.391964 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795v8\" (UniqueName: \"kubernetes.io/projected/fbe94a6f-7b68-458a-9f21-a3986682a5f5-kube-api-access-795v8\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.392297 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-utilities\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.392383 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-catalog-content\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.422163 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-795v8\" (UniqueName: \"kubernetes.io/projected/fbe94a6f-7b68-458a-9f21-a3986682a5f5-kube-api-access-795v8\") pod \"certified-operators-zlf4b\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:23 crc kubenswrapper[4756]: I1205 18:08:23.522024 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:24 crc kubenswrapper[4756]: I1205 18:08:24.384214 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zlf4b"] Dec 05 18:08:25 crc kubenswrapper[4756]: I1205 18:08:25.181557 4756 generic.go:334] "Generic (PLEG): container finished" podID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerID="5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0" exitCode=0 Dec 05 18:08:25 crc kubenswrapper[4756]: I1205 18:08:25.181681 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlf4b" event={"ID":"fbe94a6f-7b68-458a-9f21-a3986682a5f5","Type":"ContainerDied","Data":"5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0"} Dec 05 18:08:25 crc kubenswrapper[4756]: I1205 18:08:25.181836 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlf4b" event={"ID":"fbe94a6f-7b68-458a-9f21-a3986682a5f5","Type":"ContainerStarted","Data":"4b8201d4a26fc1d4988b6fc4272e4c03bee30f7709674307d4af2d1bb5287d04"} Dec 05 18:08:27 crc kubenswrapper[4756]: I1205 18:08:27.049832 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:08:27 crc kubenswrapper[4756]: E1205 18:08:27.052073 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:08:33 crc kubenswrapper[4756]: I1205 18:08:33.318790 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4" event={"ID":"5870dc85-1351-4505-b996-5d5b65eee0ee","Type":"ContainerStarted","Data":"4effb154453de3038891cd28478f3d676df39c134eee75e3f595b643a487cfc1"} Dec 05 18:08:33 crc kubenswrapper[4756]: I1205 18:08:33.322078 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlf4b" event={"ID":"fbe94a6f-7b68-458a-9f21-a3986682a5f5","Type":"ContainerStarted","Data":"d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952"} Dec 05 18:08:33 crc kubenswrapper[4756]: I1205 18:08:33.339679 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4" podStartSLOduration=2.9037337770000002 podStartE2EDuration="14.338490851s" podCreationTimestamp="2025-12-05 18:08:19 +0000 UTC" firstStartedPulling="2025-12-05 18:08:20.887870253 +0000 UTC m=+6761.126543805" lastFinishedPulling="2025-12-05 18:08:32.322627327 +0000 UTC m=+6772.561300879" observedRunningTime="2025-12-05 18:08:33.337199957 +0000 UTC m=+6773.575873489" watchObservedRunningTime="2025-12-05 
18:08:33.338490851 +0000 UTC m=+6773.577164363" Dec 05 18:08:36 crc kubenswrapper[4756]: I1205 18:08:36.366538 4756 generic.go:334] "Generic (PLEG): container finished" podID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerID="d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952" exitCode=0 Dec 05 18:08:36 crc kubenswrapper[4756]: I1205 18:08:36.367077 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlf4b" event={"ID":"fbe94a6f-7b68-458a-9f21-a3986682a5f5","Type":"ContainerDied","Data":"d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952"} Dec 05 18:08:37 crc kubenswrapper[4756]: I1205 18:08:37.395083 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlf4b" event={"ID":"fbe94a6f-7b68-458a-9f21-a3986682a5f5","Type":"ContainerStarted","Data":"ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232"} Dec 05 18:08:37 crc kubenswrapper[4756]: I1205 18:08:37.422965 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zlf4b" podStartSLOduration=9.815372611 podStartE2EDuration="14.422944253s" podCreationTimestamp="2025-12-05 18:08:23 +0000 UTC" firstStartedPulling="2025-12-05 18:08:32.200678238 +0000 UTC m=+6772.439351790" lastFinishedPulling="2025-12-05 18:08:36.80824992 +0000 UTC m=+6777.046923432" observedRunningTime="2025-12-05 18:08:37.417057917 +0000 UTC m=+6777.655731449" watchObservedRunningTime="2025-12-05 18:08:37.422944253 +0000 UTC m=+6777.661617765" Dec 05 18:08:38 crc kubenswrapper[4756]: I1205 18:08:38.050848 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:08:38 crc kubenswrapper[4756]: E1205 18:08:38.051737 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:08:43 crc kubenswrapper[4756]: I1205 18:08:43.522572 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:43 crc kubenswrapper[4756]: I1205 18:08:43.523672 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:43 crc kubenswrapper[4756]: I1205 18:08:43.588248 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:44 crc kubenswrapper[4756]: I1205 18:08:44.540139 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:44 crc kubenswrapper[4756]: I1205 18:08:44.605021 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zlf4b"] Dec 05 18:08:46 crc kubenswrapper[4756]: I1205 18:08:46.500211 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zlf4b" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerName="registry-server" containerID="cri-o://ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232" 
gracePeriod=2 Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.024777 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.103812 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-catalog-content\") pod \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.103977 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-utilities\") pod \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.104018 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-795v8\" (UniqueName: \"kubernetes.io/projected/fbe94a6f-7b68-458a-9f21-a3986682a5f5-kube-api-access-795v8\") pod \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\" (UID: \"fbe94a6f-7b68-458a-9f21-a3986682a5f5\") " Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.105396 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-utilities" (OuterVolumeSpecName: "utilities") pod "fbe94a6f-7b68-458a-9f21-a3986682a5f5" (UID: "fbe94a6f-7b68-458a-9f21-a3986682a5f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.110151 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe94a6f-7b68-458a-9f21-a3986682a5f5-kube-api-access-795v8" (OuterVolumeSpecName: "kube-api-access-795v8") pod "fbe94a6f-7b68-458a-9f21-a3986682a5f5" (UID: "fbe94a6f-7b68-458a-9f21-a3986682a5f5"). InnerVolumeSpecName "kube-api-access-795v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.156776 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbe94a6f-7b68-458a-9f21-a3986682a5f5" (UID: "fbe94a6f-7b68-458a-9f21-a3986682a5f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.206384 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.206443 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe94a6f-7b68-458a-9f21-a3986682a5f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.206459 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-795v8\" (UniqueName: \"kubernetes.io/projected/fbe94a6f-7b68-458a-9f21-a3986682a5f5-kube-api-access-795v8\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.511943 4756 generic.go:334] "Generic (PLEG): container finished" podID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerID="ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232" exitCode=0 Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.511993 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zlf4b" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.511982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlf4b" event={"ID":"fbe94a6f-7b68-458a-9f21-a3986682a5f5","Type":"ContainerDied","Data":"ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232"} Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.512094 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlf4b" event={"ID":"fbe94a6f-7b68-458a-9f21-a3986682a5f5","Type":"ContainerDied","Data":"4b8201d4a26fc1d4988b6fc4272e4c03bee30f7709674307d4af2d1bb5287d04"} Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.512116 4756 scope.go:117] "RemoveContainer" containerID="ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.551036 4756 scope.go:117] "RemoveContainer" containerID="d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.555117 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zlf4b"] Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.568718 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zlf4b"] Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.590596 4756 scope.go:117] "RemoveContainer" containerID="5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.655846 4756 scope.go:117] "RemoveContainer" containerID="ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232" Dec 05 18:08:47 crc kubenswrapper[4756]: E1205 18:08:47.656499 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232\": container with ID starting with ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232 not found: ID does not exist" containerID="ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.656543 
4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232"} err="failed to get container status \"ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232\": rpc error: code = NotFound desc = could not find container \"ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232\": container with ID starting with ca50ce3aae54106df018475ae07d30fed212fff24969c432eee3da78f979c232 not found: ID does not exist" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.656570 4756 scope.go:117] "RemoveContainer" containerID="d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952" Dec 05 18:08:47 crc kubenswrapper[4756]: E1205 18:08:47.657027 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952\": container with ID starting with d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952 not found: ID does not exist" containerID="d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.657096 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952"} err="failed to get container status \"d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952\": rpc error: code = NotFound desc = could not find container \"d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952\": container with ID starting with d421db56bb70b294a339e7ff439e3303d24303bcc0ff3f97c53d36523639c952 not found: ID does not exist" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.657134 4756 scope.go:117] "RemoveContainer" containerID="5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0" Dec 05 18:08:47 crc kubenswrapper[4756]: E1205 18:08:47.657560 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0\": container with ID starting with 5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0 not found: ID does not exist" containerID="5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0" Dec 05 18:08:47 crc kubenswrapper[4756]: I1205 18:08:47.657585 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0"} err="failed to get container status \"5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0\": rpc error: code = NotFound desc = could not find container \"5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0\": container with ID starting with 5939fb65451b729d76b85308cd572846bad509baa139269573de7930a66218b0 not found: ID does not exist" Dec 05 18:08:48 crc kubenswrapper[4756]: I1205 18:08:48.073283 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" path="/var/lib/kubelet/pods/fbe94a6f-7b68-458a-9f21-a3986682a5f5/volumes" Dec 05 18:08:49 crc kubenswrapper[4756]: I1205 18:08:49.049801 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-cbqdw"] Dec 05 18:08:49 crc kubenswrapper[4756]: I1205 18:08:49.050669 4756 scope.go:117] "RemoveContainer" 
containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:08:49 crc kubenswrapper[4756]: E1205 18:08:49.050918 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:08:49 crc kubenswrapper[4756]: I1205 18:08:49.066507 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-cbqdw"] Dec 05 18:08:50 crc kubenswrapper[4756]: I1205 18:08:50.067529 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb86990-349d-45b0-83a0-866c6332f6b0" path="/var/lib/kubelet/pods/afb86990-349d-45b0-83a0-866c6332f6b0/volumes" Dec 05 18:08:55 crc kubenswrapper[4756]: I1205 18:08:55.591910 4756 generic.go:334] "Generic (PLEG): container finished" podID="5870dc85-1351-4505-b996-5d5b65eee0ee" containerID="4effb154453de3038891cd28478f3d676df39c134eee75e3f595b643a487cfc1" exitCode=0 Dec 05 18:08:55 crc kubenswrapper[4756]: I1205 18:08:55.592005 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4" event={"ID":"5870dc85-1351-4505-b996-5d5b65eee0ee","Type":"ContainerDied","Data":"4effb154453de3038891cd28478f3d676df39c134eee75e3f595b643a487cfc1"} Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.243707 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.313077 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmwj5\" (UniqueName: \"kubernetes.io/projected/5870dc85-1351-4505-b996-5d5b65eee0ee-kube-api-access-mmwj5\") pod \"5870dc85-1351-4505-b996-5d5b65eee0ee\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.313131 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-inventory\") pod \"5870dc85-1351-4505-b996-5d5b65eee0ee\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.313461 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-pre-adoption-validation-combined-ca-bundle\") pod \"5870dc85-1351-4505-b996-5d5b65eee0ee\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.313505 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-ssh-key\") pod \"5870dc85-1351-4505-b996-5d5b65eee0ee\" (UID: \"5870dc85-1351-4505-b996-5d5b65eee0ee\") " Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.319687 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: 
"pre-adoption-validation-combined-ca-bundle") pod "5870dc85-1351-4505-b996-5d5b65eee0ee" (UID: "5870dc85-1351-4505-b996-5d5b65eee0ee"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.320624 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5870dc85-1351-4505-b996-5d5b65eee0ee-kube-api-access-mmwj5" (OuterVolumeSpecName: "kube-api-access-mmwj5") pod "5870dc85-1351-4505-b996-5d5b65eee0ee" (UID: "5870dc85-1351-4505-b996-5d5b65eee0ee"). InnerVolumeSpecName "kube-api-access-mmwj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.348757 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-inventory" (OuterVolumeSpecName: "inventory") pod "5870dc85-1351-4505-b996-5d5b65eee0ee" (UID: "5870dc85-1351-4505-b996-5d5b65eee0ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.353624 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5870dc85-1351-4505-b996-5d5b65eee0ee" (UID: "5870dc85-1351-4505-b996-5d5b65eee0ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.415641 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmwj5\" (UniqueName: \"kubernetes.io/projected/5870dc85-1351-4505-b996-5d5b65eee0ee-kube-api-access-mmwj5\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.415907 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.415924 4756 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.415938 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5870dc85-1351-4505-b996-5d5b65eee0ee-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.618811 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4" event={"ID":"5870dc85-1351-4505-b996-5d5b65eee0ee","Type":"ContainerDied","Data":"34c7fd29a434498d4cbc5ffdfa232865f05425cc207b4749ed29d753177bc8da"} Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.618863 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4" Dec 05 18:08:57 crc kubenswrapper[4756]: I1205 18:08:57.618873 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34c7fd29a434498d4cbc5ffdfa232865f05425cc207b4749ed29d753177bc8da" Dec 05 18:09:02 crc kubenswrapper[4756]: I1205 18:09:02.050120 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:09:02 crc kubenswrapper[4756]: E1205 18:09:02.051105 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.444539 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v"] Dec 05 18:09:03 crc kubenswrapper[4756]: E1205 18:09:03.445354 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerName="registry-server" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.445370 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerName="registry-server" Dec 05 18:09:03 crc kubenswrapper[4756]: E1205 18:09:03.445387 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerName="extract-content" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.445397 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerName="extract-content" Dec 05 18:09:03 crc kubenswrapper[4756]: E1205 18:09:03.445446 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerName="extract-utilities" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.445458 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerName="extract-utilities" Dec 05 18:09:03 crc kubenswrapper[4756]: E1205 18:09:03.445470 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5870dc85-1351-4505-b996-5d5b65eee0ee" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.445480 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5870dc85-1351-4505-b996-5d5b65eee0ee" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.445759 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5870dc85-1351-4505-b996-5d5b65eee0ee" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.445774 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe94a6f-7b68-458a-9f21-a3986682a5f5" containerName="registry-server" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.446725 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.449076 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.449276 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.449794 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.450330 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.454141 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v"] Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.595657 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.595944 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h729t\" (UniqueName: \"kubernetes.io/projected/9f633ab6-c240-4b57-8941-d8dc399f4a84-kube-api-access-h729t\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.596043 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.596102 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.697504 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h729t\" (UniqueName: \"kubernetes.io/projected/9f633ab6-c240-4b57-8941-d8dc399f4a84-kube-api-access-h729t\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.697566 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-ssh-key\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.697627 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.697682 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.704817 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.705140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.708778 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.719510 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h729t\" (UniqueName: \"kubernetes.io/projected/9f633ab6-c240-4b57-8941-d8dc399f4a84-kube-api-access-h729t\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:03 crc kubenswrapper[4756]: I1205 18:09:03.822579 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:09:04 crc kubenswrapper[4756]: I1205 18:09:04.404400 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v"] Dec 05 18:09:04 crc kubenswrapper[4756]: W1205 18:09:04.413295 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f633ab6_c240_4b57_8941_d8dc399f4a84.slice/crio-4b7315e07424c91e2b8578870176e721784115a12e17d08f4c2634bd66841080 WatchSource:0}: Error finding container 4b7315e07424c91e2b8578870176e721784115a12e17d08f4c2634bd66841080: Status 404 returned error can't find the container with id 4b7315e07424c91e2b8578870176e721784115a12e17d08f4c2634bd66841080 Dec 05 18:09:04 crc kubenswrapper[4756]: I1205 18:09:04.417469 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 18:09:04 crc kubenswrapper[4756]: I1205 18:09:04.690280 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" event={"ID":"9f633ab6-c240-4b57-8941-d8dc399f4a84","Type":"ContainerStarted","Data":"4b7315e07424c91e2b8578870176e721784115a12e17d08f4c2634bd66841080"} Dec 05 18:09:05 crc kubenswrapper[4756]: I1205 18:09:05.704024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" event={"ID":"9f633ab6-c240-4b57-8941-d8dc399f4a84","Type":"ContainerStarted","Data":"4dadf9dbaa2cacf865fe38d1047c092a6ace3ae8472f1a5c43c0c90fdc72f101"} Dec 05 18:09:05 crc kubenswrapper[4756]: I1205 18:09:05.743005 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" podStartSLOduration=2.312155131 podStartE2EDuration="2.742974616s" podCreationTimestamp="2025-12-05 18:09:03 +0000 UTC" firstStartedPulling="2025-12-05 18:09:04.417209888 +0000 UTC m=+6804.655883400" lastFinishedPulling="2025-12-05 18:09:04.848029373 +0000 UTC m=+6805.086702885" observedRunningTime="2025-12-05 18:09:05.736536656 +0000 UTC m=+6805.975210208" watchObservedRunningTime="2025-12-05 18:09:05.742974616 +0000 UTC m=+6805.981648148" Dec 05 18:09:15 crc kubenswrapper[4756]: I1205 18:09:15.050620 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:09:15 crc kubenswrapper[4756]: E1205 18:09:15.051602 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:09:17 crc kubenswrapper[4756]: I1205 18:09:17.688385 4756 scope.go:117] "RemoveContainer" containerID="2d951ae761902714016151d66a905a1495077017502c3430f0831f7b18dfe92c" Dec 05 18:09:17 crc kubenswrapper[4756]: I1205 18:09:17.730172 4756 scope.go:117] "RemoveContainer" containerID="9b785dab74a5334616b6f862f1c6192d2b90225ada895ee551e113ca02e6374c" Dec 05 18:09:26 crc kubenswrapper[4756]: I1205 18:09:26.050290 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:09:26 crc 
kubenswrapper[4756]: E1205 18:09:26.051225 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:09:38 crc kubenswrapper[4756]: I1205 18:09:38.050650 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:09:38 crc kubenswrapper[4756]: E1205 18:09:38.052909 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:09:52 crc kubenswrapper[4756]: I1205 18:09:52.050310 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:09:52 crc kubenswrapper[4756]: E1205 18:09:52.051193 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:10:07 crc kubenswrapper[4756]: I1205 18:10:07.049505 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326" Dec 05 18:10:08 crc kubenswrapper[4756]: I1205 18:10:08.491165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"5afb3dc90120f4e9e53bde4a5bc50b142a7b2e00719706caa44fbf69f6e97832"} Dec 05 18:11:17 crc kubenswrapper[4756]: I1205 18:11:17.859771 4756 scope.go:117] "RemoveContainer" containerID="290708e9f9981fcc806436ab7c45c5672be3f19fb9ef6d73fd0ce0e49c2a683a" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.358091 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j7jgk"] Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.398990 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.419255 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7jgk"] Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.483056 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-catalog-content\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.483107 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvphk\" (UniqueName: \"kubernetes.io/projected/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-kube-api-access-rvphk\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.483176 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-utilities\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.585810 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-catalog-content\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.585873 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvphk\" (UniqueName: \"kubernetes.io/projected/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-kube-api-access-rvphk\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.585985 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-utilities\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.586454 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-utilities\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.586493 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-catalog-content\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.605189 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rvphk\" (UniqueName: \"kubernetes.io/projected/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-kube-api-access-rvphk\") pod \"redhat-operators-j7jgk\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") " pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:00 crc kubenswrapper[4756]: I1205 18:12:00.726017 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 18:12:01 crc kubenswrapper[4756]: I1205 18:12:01.204083 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7jgk"] Dec 05 18:12:01 crc kubenswrapper[4756]: I1205 18:12:01.893121 4756 generic.go:334] "Generic (PLEG): container finished" podID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerID="f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526" exitCode=0 Dec 05 18:12:01 crc kubenswrapper[4756]: I1205 18:12:01.893193 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7jgk" event={"ID":"4ed44f2d-6b8b-4c1e-8950-fe5783feb876","Type":"ContainerDied","Data":"f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526"} Dec 05 18:12:01 crc kubenswrapper[4756]: I1205 18:12:01.893529 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7jgk" event={"ID":"4ed44f2d-6b8b-4c1e-8950-fe5783feb876","Type":"ContainerStarted","Data":"9c7064cebe337e9e4b8efccfbd1f7c33a5b793376b4806b172c6841bb0d28dfe"} Dec 05 18:12:02 crc kubenswrapper[4756]: I1205 18:12:02.908062 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7jgk" event={"ID":"4ed44f2d-6b8b-4c1e-8950-fe5783feb876","Type":"ContainerStarted","Data":"2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba"} Dec 05 18:12:06 crc kubenswrapper[4756]: I1205 18:12:06.957517 4756 generic.go:334] "Generic (PLEG): container finished" podID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerID="2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba" exitCode=0 Dec 05 18:12:06 crc kubenswrapper[4756]: I1205 18:12:06.958236 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7jgk" event={"ID":"4ed44f2d-6b8b-4c1e-8950-fe5783feb876","Type":"ContainerDied","Data":"2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba"} Dec 05 18:12:07 crc kubenswrapper[4756]: I1205 18:12:07.973810 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7jgk" event={"ID":"4ed44f2d-6b8b-4c1e-8950-fe5783feb876","Type":"ContainerStarted","Data":"133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d"} Dec 05 18:12:08 crc kubenswrapper[4756]: I1205 18:12:08.018412 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j7jgk" podStartSLOduration=2.494392221 podStartE2EDuration="8.018381764s" podCreationTimestamp="2025-12-05 18:12:00 +0000 UTC" firstStartedPulling="2025-12-05 18:12:01.895697681 +0000 UTC m=+6982.134371233" lastFinishedPulling="2025-12-05 18:12:07.419687264 +0000 UTC m=+6987.658360776" observedRunningTime="2025-12-05 18:12:08.004970899 +0000 UTC m=+6988.243644501" watchObservedRunningTime="2025-12-05 18:12:08.018381764 +0000 UTC m=+6988.257055356" Dec 05 18:12:10 crc kubenswrapper[4756]: I1205 18:12:10.727187 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j7jgk" Dec 05 
18:12:10 crc kubenswrapper[4756]: I1205 18:12:10.727782 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j7jgk"
Dec 05 18:12:11 crc kubenswrapper[4756]: I1205 18:12:11.780928 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j7jgk" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="registry-server" probeResult="failure" output=<
Dec 05 18:12:11 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s
Dec 05 18:12:11 crc kubenswrapper[4756]: >
Dec 05 18:12:21 crc kubenswrapper[4756]: I1205 18:12:21.813326 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j7jgk" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="registry-server" probeResult="failure" output=<
Dec 05 18:12:21 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s
Dec 05 18:12:21 crc kubenswrapper[4756]: >
Dec 05 18:12:22 crc kubenswrapper[4756]: I1205 18:12:22.296014 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:12:22 crc kubenswrapper[4756]: I1205 18:12:22.296127 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:12:26 crc kubenswrapper[4756]: I1205 18:12:26.075524 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-9c83-account-create-update-zvw9n"]
Dec 05 18:12:26 crc kubenswrapper[4756]: I1205 18:12:26.099817 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-d8tqq"]
Dec 05 18:12:26 crc kubenswrapper[4756]: I1205 18:12:26.111930 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-d8tqq"]
Dec 05 18:12:26 crc kubenswrapper[4756]: I1205 18:12:26.124106 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-9c83-account-create-update-zvw9n"]
Dec 05 18:12:28 crc kubenswrapper[4756]: I1205 18:12:28.066543 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb4dac2-605b-4204-b359-96783ae01e91" path="/var/lib/kubelet/pods/1cb4dac2-605b-4204-b359-96783ae01e91/volumes"
Dec 05 18:12:28 crc kubenswrapper[4756]: I1205 18:12:28.068267 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4f52fb-d424-47ef-9320-98036fde3926" path="/var/lib/kubelet/pods/6f4f52fb-d424-47ef-9320-98036fde3926/volumes"
Dec 05 18:12:30 crc kubenswrapper[4756]: I1205 18:12:30.808709 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j7jgk"
Dec 05 18:12:30 crc kubenswrapper[4756]: I1205 18:12:30.873603 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j7jgk"
Dec 05 18:12:31 crc kubenswrapper[4756]: I1205 18:12:31.549333 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7jgk"]
Dec 05 18:12:32 crc kubenswrapper[4756]: I1205 18:12:32.280377 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j7jgk" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="registry-server" containerID="cri-o://133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d" gracePeriod=2
Dec 05 18:12:32 crc kubenswrapper[4756]: I1205 18:12:32.798712 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7jgk"
Dec 05 18:12:32 crc kubenswrapper[4756]: I1205 18:12:32.922482 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvphk\" (UniqueName: \"kubernetes.io/projected/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-kube-api-access-rvphk\") pod \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") "
Dec 05 18:12:32 crc kubenswrapper[4756]: I1205 18:12:32.922741 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-catalog-content\") pod \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") "
Dec 05 18:12:32 crc kubenswrapper[4756]: I1205 18:12:32.922891 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-utilities\") pod \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\" (UID: \"4ed44f2d-6b8b-4c1e-8950-fe5783feb876\") "
Dec 05 18:12:32 crc kubenswrapper[4756]: I1205 18:12:32.924416 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-utilities" (OuterVolumeSpecName: "utilities") pod "4ed44f2d-6b8b-4c1e-8950-fe5783feb876" (UID: "4ed44f2d-6b8b-4c1e-8950-fe5783feb876"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:12:32 crc kubenswrapper[4756]: I1205 18:12:32.933754 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-kube-api-access-rvphk" (OuterVolumeSpecName: "kube-api-access-rvphk") pod "4ed44f2d-6b8b-4c1e-8950-fe5783feb876" (UID: "4ed44f2d-6b8b-4c1e-8950-fe5783feb876"). InnerVolumeSpecName "kube-api-access-rvphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.027153 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvphk\" (UniqueName: \"kubernetes.io/projected/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-kube-api-access-rvphk\") on node \"crc\" DevicePath \"\""
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.028705 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.042247 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ed44f2d-6b8b-4c1e-8950-fe5783feb876" (UID: "4ed44f2d-6b8b-4c1e-8950-fe5783feb876"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.131568 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed44f2d-6b8b-4c1e-8950-fe5783feb876-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.290285 4756 generic.go:334] "Generic (PLEG): container finished" podID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerID="133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d" exitCode=0
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.290323 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7jgk" event={"ID":"4ed44f2d-6b8b-4c1e-8950-fe5783feb876","Type":"ContainerDied","Data":"133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d"}
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.290348 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7jgk" event={"ID":"4ed44f2d-6b8b-4c1e-8950-fe5783feb876","Type":"ContainerDied","Data":"9c7064cebe337e9e4b8efccfbd1f7c33a5b793376b4806b172c6841bb0d28dfe"}
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.290368 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7jgk"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.290384 4756 scope.go:117] "RemoveContainer" containerID="133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.309401 4756 scope.go:117] "RemoveContainer" containerID="2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.331392 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7jgk"]
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.341439 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j7jgk"]
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.378172 4756 scope.go:117] "RemoveContainer" containerID="f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.410193 4756 scope.go:117] "RemoveContainer" containerID="133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d"
Dec 05 18:12:33 crc kubenswrapper[4756]: E1205 18:12:33.410797 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d\": container with ID starting with 133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d not found: ID does not exist" containerID="133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.410837 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d"} err="failed to get container status \"133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d\": rpc error: code = NotFound desc = could not find container \"133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d\": container with ID starting with 133693356ce86ae6480fc771e90c43f061cc8be87e82b4f6c09b2c16c84f6c3d not found: ID does not exist"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.410860 4756 scope.go:117] "RemoveContainer" containerID="2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba"
Dec 05 18:12:33 crc kubenswrapper[4756]: E1205 18:12:33.411186 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba\": container with ID starting with 2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba not found: ID does not exist" containerID="2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.411207 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba"} err="failed to get container status \"2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba\": rpc error: code = NotFound desc = could not find container \"2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba\": container with ID starting with 2569b3d9aa78b02ed29d730fd9582e71484fbc85547f77f138b9226c0c4465ba not found: ID does not exist"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.411219 4756 scope.go:117] "RemoveContainer" containerID="f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526"
Dec 05 18:12:33 crc kubenswrapper[4756]: E1205 18:12:33.414663 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526\": container with ID starting with f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526 not found: ID does not exist" containerID="f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526"
Dec 05 18:12:33 crc kubenswrapper[4756]: I1205 18:12:33.414692 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526"} err="failed to get container status \"f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526\": rpc error: code = NotFound desc = could not find container \"f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526\": container with ID starting with f79585be8b4d30fd0745f2705603b47aad27ffde0c2fcde43930754a1226c526 not found: ID does not exist"
Dec 05 18:12:34 crc kubenswrapper[4756]: I1205 18:12:34.077775 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" path="/var/lib/kubelet/pods/4ed44f2d-6b8b-4c1e-8950-fe5783feb876/volumes"
Dec 05 18:12:40 crc kubenswrapper[4756]: I1205 18:12:40.174008 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-bh7pp"]
Dec 05 18:12:40 crc kubenswrapper[4756]: I1205 18:12:40.187070 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-bh7pp"]
Dec 05 18:12:42 crc kubenswrapper[4756]: I1205 18:12:42.071167 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94fa0a94-d869-4426-8a74-8a96196d3772" path="/var/lib/kubelet/pods/94fa0a94-d869-4426-8a74-8a96196d3772/volumes"
Dec 05 18:12:52 crc kubenswrapper[4756]: I1205 18:12:52.295364 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:12:52 crc kubenswrapper[4756]: I1205 18:12:52.296235 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:13:17 crc kubenswrapper[4756]: I1205 18:13:17.935595 4756 scope.go:117] "RemoveContainer" containerID="928830a8f658bbca2e7acc46db3e6620982abf02692e8228a1a78d4abc63a205"
Dec 05 18:13:17 crc kubenswrapper[4756]: I1205 18:13:17.980504 4756 scope.go:117] "RemoveContainer" containerID="c75b73a0e4088d60d58a2cb4173d7ffedffb47ef484a97ca6341656f45596422"
Dec 05 18:13:18 crc kubenswrapper[4756]: I1205 18:13:18.054029 4756 scope.go:117] "RemoveContainer" containerID="5b89a727354a70c4b0ab2230315f95b7308b65fcd141b4fd732f293ce064bea6"
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.295241 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.296129 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.296199 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.297396 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5afb3dc90120f4e9e53bde4a5bc50b142a7b2e00719706caa44fbf69f6e97832"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.297519 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://5afb3dc90120f4e9e53bde4a5bc50b142a7b2e00719706caa44fbf69f6e97832" gracePeriod=600
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.910390 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="5afb3dc90120f4e9e53bde4a5bc50b142a7b2e00719706caa44fbf69f6e97832" exitCode=0
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.910463 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"5afb3dc90120f4e9e53bde4a5bc50b142a7b2e00719706caa44fbf69f6e97832"}
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.910997 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19"}
Dec 05 18:13:22 crc kubenswrapper[4756]: I1205 18:13:22.911034 4756 scope.go:117] "RemoveContainer" containerID="c96aba89be9c26589a033e3983d91a9360df7d07bdfa52fbd5d6f23e02eda326"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.275003 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vj92"]
Dec 05 18:14:01 crc kubenswrapper[4756]: E1205 18:14:01.280167 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="extract-utilities"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.280472 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="extract-utilities"
Dec 05 18:14:01 crc kubenswrapper[4756]: E1205 18:14:01.280563 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="registry-server"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.280578 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="registry-server"
Dec 05 18:14:01 crc kubenswrapper[4756]: E1205 18:14:01.280617 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="extract-content"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.280631 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="extract-content"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.281002 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed44f2d-6b8b-4c1e-8950-fe5783feb876" containerName="registry-server"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.283804 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.290468 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vj92"]
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.336128 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-catalog-content\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.336246 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr6bq\" (UniqueName: \"kubernetes.io/projected/f830196a-2a13-4b70-97b6-db8ff60fa2e7-kube-api-access-hr6bq\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.336499 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-utilities\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.438932 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-catalog-content\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.439061 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr6bq\" (UniqueName: \"kubernetes.io/projected/f830196a-2a13-4b70-97b6-db8ff60fa2e7-kube-api-access-hr6bq\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.439237 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-utilities\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.439457 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-catalog-content\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.439778 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-utilities\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.467486 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr6bq\" (UniqueName: \"kubernetes.io/projected/f830196a-2a13-4b70-97b6-db8ff60fa2e7-kube-api-access-hr6bq\") pod \"community-operators-8vj92\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") " pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:01 crc kubenswrapper[4756]: I1205 18:14:01.615306 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:02 crc kubenswrapper[4756]: I1205 18:14:02.206164 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vj92"]
Dec 05 18:14:02 crc kubenswrapper[4756]: I1205 18:14:02.402384 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vj92" event={"ID":"f830196a-2a13-4b70-97b6-db8ff60fa2e7","Type":"ContainerStarted","Data":"3eb90f2ae637973cafd2334024bc4f5feea95744af0236133b867c28ad94f6de"}
Dec 05 18:14:03 crc kubenswrapper[4756]: I1205 18:14:03.414996 4756 generic.go:334] "Generic (PLEG): container finished" podID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerID="ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27" exitCode=0
Dec 05 18:14:03 crc kubenswrapper[4756]: I1205 18:14:03.415050 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vj92" event={"ID":"f830196a-2a13-4b70-97b6-db8ff60fa2e7","Type":"ContainerDied","Data":"ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27"}
Dec 05 18:14:04 crc kubenswrapper[4756]: I1205 18:14:04.425975 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vj92" event={"ID":"f830196a-2a13-4b70-97b6-db8ff60fa2e7","Type":"ContainerStarted","Data":"c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441"}
Dec 05 18:14:05 crc kubenswrapper[4756]: I1205 18:14:05.438922 4756 generic.go:334] "Generic (PLEG): container finished" podID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerID="c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441" exitCode=0
Dec 05 18:14:05 crc kubenswrapper[4756]: I1205 18:14:05.438968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vj92" event={"ID":"f830196a-2a13-4b70-97b6-db8ff60fa2e7","Type":"ContainerDied","Data":"c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441"}
Dec 05 18:14:05 crc kubenswrapper[4756]: I1205 18:14:05.441698 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 18:14:06 crc kubenswrapper[4756]: I1205 18:14:06.449363 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vj92" event={"ID":"f830196a-2a13-4b70-97b6-db8ff60fa2e7","Type":"ContainerStarted","Data":"4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71"}
Dec 05 18:14:06 crc kubenswrapper[4756]: I1205 18:14:06.478390 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vj92" podStartSLOduration=2.764420902 podStartE2EDuration="5.478371022s" podCreationTimestamp="2025-12-05 18:14:01 +0000 UTC" firstStartedPulling="2025-12-05 18:14:03.417795525 +0000 UTC m=+7103.656469037" lastFinishedPulling="2025-12-05 18:14:06.131745615 +0000 UTC m=+7106.370419157" observedRunningTime="2025-12-05 18:14:06.46657632 +0000 UTC m=+7106.705249842" watchObservedRunningTime="2025-12-05 18:14:06.478371022 +0000 UTC m=+7106.717044534"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.507974 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gp6lq"]
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.513024 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.540702 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gp6lq"]
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.616286 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.616335 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.625656 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-catalog-content\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.625730 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6f68\" (UniqueName: \"kubernetes.io/projected/2d8d47c0-a676-45eb-aece-9fde9a846f84-kube-api-access-d6f68\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.625931 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-utilities\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.682661 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.728296 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-catalog-content\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.728368 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6f68\" (UniqueName: \"kubernetes.io/projected/2d8d47c0-a676-45eb-aece-9fde9a846f84-kube-api-access-d6f68\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.728446 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-utilities\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.728993 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-utilities\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.729199 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-catalog-content\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.754360 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6f68\" (UniqueName: \"kubernetes.io/projected/2d8d47c0-a676-45eb-aece-9fde9a846f84-kube-api-access-d6f68\") pod \"redhat-marketplace-gp6lq\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") " pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:11 crc kubenswrapper[4756]: I1205 18:14:11.846924 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:12 crc kubenswrapper[4756]: I1205 18:14:12.388228 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gp6lq"]
Dec 05 18:14:12 crc kubenswrapper[4756]: W1205 18:14:12.407223 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8d47c0_a676_45eb_aece_9fde9a846f84.slice/crio-c14c4e41cce938b755189757ee0d1e45b3926cb21bd30db1cae3571bd89486b3 WatchSource:0}: Error finding container c14c4e41cce938b755189757ee0d1e45b3926cb21bd30db1cae3571bd89486b3: Status 404 returned error can't find the container with id c14c4e41cce938b755189757ee0d1e45b3926cb21bd30db1cae3571bd89486b3
Dec 05 18:14:12 crc kubenswrapper[4756]: I1205 18:14:12.538125 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gp6lq" event={"ID":"2d8d47c0-a676-45eb-aece-9fde9a846f84","Type":"ContainerStarted","Data":"c14c4e41cce938b755189757ee0d1e45b3926cb21bd30db1cae3571bd89486b3"}
Dec 05 18:14:12 crc kubenswrapper[4756]: I1205 18:14:12.617558 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:13 crc kubenswrapper[4756]: I1205 18:14:13.549715 4756 generic.go:334] "Generic (PLEG): container finished" podID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerID="25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1" exitCode=0
Dec 05 18:14:13 crc kubenswrapper[4756]: I1205 18:14:13.549789 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gp6lq" event={"ID":"2d8d47c0-a676-45eb-aece-9fde9a846f84","Type":"ContainerDied","Data":"25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1"}
Dec 05 18:14:14 crc kubenswrapper[4756]: I1205 18:14:14.065899 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vj92"]
Dec 05 18:14:14 crc kubenswrapper[4756]: I1205 18:14:14.563821 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8vj92" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerName="registry-server" containerID="cri-o://4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71" gracePeriod=2
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.171741 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.216241 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr6bq\" (UniqueName: \"kubernetes.io/projected/f830196a-2a13-4b70-97b6-db8ff60fa2e7-kube-api-access-hr6bq\") pod \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") "
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.216337 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-utilities\") pod \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") "
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.216481 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-catalog-content\") pod \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\" (UID: \"f830196a-2a13-4b70-97b6-db8ff60fa2e7\") "
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.223000 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-utilities" (OuterVolumeSpecName: "utilities") pod "f830196a-2a13-4b70-97b6-db8ff60fa2e7" (UID: "f830196a-2a13-4b70-97b6-db8ff60fa2e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.233477 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f830196a-2a13-4b70-97b6-db8ff60fa2e7-kube-api-access-hr6bq" (OuterVolumeSpecName: "kube-api-access-hr6bq") pod "f830196a-2a13-4b70-97b6-db8ff60fa2e7" (UID: "f830196a-2a13-4b70-97b6-db8ff60fa2e7"). InnerVolumeSpecName "kube-api-access-hr6bq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.282488 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f830196a-2a13-4b70-97b6-db8ff60fa2e7" (UID: "f830196a-2a13-4b70-97b6-db8ff60fa2e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.318629 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.318658 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr6bq\" (UniqueName: \"kubernetes.io/projected/f830196a-2a13-4b70-97b6-db8ff60fa2e7-kube-api-access-hr6bq\") on node \"crc\" DevicePath \"\""
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.318668 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f830196a-2a13-4b70-97b6-db8ff60fa2e7-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.577188 4756 generic.go:334] "Generic (PLEG): container finished" podID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerID="6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac" exitCode=0
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.577551 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gp6lq" event={"ID":"2d8d47c0-a676-45eb-aece-9fde9a846f84","Type":"ContainerDied","Data":"6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac"}
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.588608 4756 generic.go:334] "Generic (PLEG): container finished" podID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerID="4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71" exitCode=0
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.588654 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vj92" event={"ID":"f830196a-2a13-4b70-97b6-db8ff60fa2e7","Type":"ContainerDied","Data":"4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71"}
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.588682 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vj92" event={"ID":"f830196a-2a13-4b70-97b6-db8ff60fa2e7","Type":"ContainerDied","Data":"3eb90f2ae637973cafd2334024bc4f5feea95744af0236133b867c28ad94f6de"}
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.588702 4756 scope.go:117] "RemoveContainer" containerID="4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.588828 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vj92"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.624104 4756 scope.go:117] "RemoveContainer" containerID="c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.655194 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vj92"]
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.675010 4756 scope.go:117] "RemoveContainer" containerID="ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.675942 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vj92"]
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.736190 4756 scope.go:117] "RemoveContainer" containerID="4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71"
Dec 05 18:14:15 crc kubenswrapper[4756]: E1205 18:14:15.736896 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71\": container with ID starting with 4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71 not found: ID does not exist" containerID="4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.736934 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71"} err="failed to get container status \"4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71\": rpc error: code = NotFound desc = could not find container \"4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71\": container with ID starting with 4af129489126132441933cf7dfdfcdee645ce79d560a63a4022acc71f488ac71 not found: ID does not exist"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.736975 4756 scope.go:117] "RemoveContainer" containerID="c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441"
Dec 05 18:14:15 crc kubenswrapper[4756]: E1205 18:14:15.737251 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441\": container with ID starting with c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441 not found: ID does not exist" containerID="c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.737296 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441"} err="failed to get container status \"c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441\": rpc error: code = NotFound desc = could not find container \"c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441\": container with ID starting with c8354e3edce1f796d6919cab4e5212b159c77a7f3f13b008328d972847e97441 not found: ID does not exist"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.737324 4756 scope.go:117] "RemoveContainer" containerID="ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27"
Dec 05 18:14:15 crc kubenswrapper[4756]: E1205 18:14:15.737835 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27\": container with ID starting with ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27 not found: ID does not exist" containerID="ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27"
Dec 05 18:14:15 crc kubenswrapper[4756]: I1205 18:14:15.737859 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27"} err="failed to get container status \"ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27\": rpc error: code = NotFound desc = could not find container \"ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27\": container with ID starting with ef83c8cdb149fad03ff30d9895ad3934f588dc98157147030779e883868bde27 not found: ID does not exist"
Dec 05 18:14:16 crc kubenswrapper[4756]: I1205 18:14:16.062798 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" path="/var/lib/kubelet/pods/f830196a-2a13-4b70-97b6-db8ff60fa2e7/volumes"
Dec 05 18:14:16 crc kubenswrapper[4756]: I1205 18:14:16.603075 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gp6lq" event={"ID":"2d8d47c0-a676-45eb-aece-9fde9a846f84","Type":"ContainerStarted","Data":"6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb"}
Dec 05 18:14:16 crc kubenswrapper[4756]: I1205 18:14:16.633584 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gp6lq" podStartSLOduration=3.214690164 podStartE2EDuration="5.633565021s" podCreationTimestamp="2025-12-05 18:14:11 +0000 UTC" firstStartedPulling="2025-12-05 18:14:13.551979809 +0000 UTC m=+7113.790653331" lastFinishedPulling="2025-12-05 18:14:15.970854666 +0000 UTC m=+7116.209528188" observedRunningTime="2025-12-05 18:14:16.627301335 +0000 UTC m=+7116.865974847" watchObservedRunningTime="2025-12-05 18:14:16.633565021 +0000 UTC m=+7116.872238563"
Dec 05 18:14:21 crc kubenswrapper[4756]: I1205 18:14:21.848080 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:21 crc kubenswrapper[4756]: I1205 18:14:21.848690 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:21 crc kubenswrapper[4756]: I1205 18:14:21.906854 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:22 crc kubenswrapper[4756]: I1205 18:14:22.736097 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:22 crc kubenswrapper[4756]: I1205 18:14:22.804637 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gp6lq"]
Dec 05 18:14:24 crc kubenswrapper[4756]: I1205 18:14:24.693833 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gp6lq" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerName="registry-server" containerID="cri-o://6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb" gracePeriod=2
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.188458 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.261862 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-utilities\") pod \"2d8d47c0-a676-45eb-aece-9fde9a846f84\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") "
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.261955 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-catalog-content\") pod \"2d8d47c0-a676-45eb-aece-9fde9a846f84\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") "
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.262059 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6f68\" (UniqueName: \"kubernetes.io/projected/2d8d47c0-a676-45eb-aece-9fde9a846f84-kube-api-access-d6f68\") pod \"2d8d47c0-a676-45eb-aece-9fde9a846f84\" (UID: \"2d8d47c0-a676-45eb-aece-9fde9a846f84\") "
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.262871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-utilities" (OuterVolumeSpecName: "utilities") pod "2d8d47c0-a676-45eb-aece-9fde9a846f84" (UID: "2d8d47c0-a676-45eb-aece-9fde9a846f84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.267652 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8d47c0-a676-45eb-aece-9fde9a846f84-kube-api-access-d6f68" (OuterVolumeSpecName: "kube-api-access-d6f68") pod "2d8d47c0-a676-45eb-aece-9fde9a846f84" (UID: "2d8d47c0-a676-45eb-aece-9fde9a846f84"). InnerVolumeSpecName "kube-api-access-d6f68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.287018 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d8d47c0-a676-45eb-aece-9fde9a846f84" (UID: "2d8d47c0-a676-45eb-aece-9fde9a846f84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.366677 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.366737 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8d47c0-a676-45eb-aece-9fde9a846f84-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.366752 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6f68\" (UniqueName: \"kubernetes.io/projected/2d8d47c0-a676-45eb-aece-9fde9a846f84-kube-api-access-d6f68\") on node \"crc\" DevicePath \"\""
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.713211 4756 generic.go:334] "Generic (PLEG): container finished" podID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerID="6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb" exitCode=0
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.713292 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gp6lq" event={"ID":"2d8d47c0-a676-45eb-aece-9fde9a846f84","Type":"ContainerDied","Data":"6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb"}
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.713369 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gp6lq" event={"ID":"2d8d47c0-a676-45eb-aece-9fde9a846f84","Type":"ContainerDied","Data":"c14c4e41cce938b755189757ee0d1e45b3926cb21bd30db1cae3571bd89486b3"}
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.713405 4756 scope.go:117] "RemoveContainer" containerID="6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.713321 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gp6lq"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.755690 4756 scope.go:117] "RemoveContainer" containerID="6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.793376 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gp6lq"]
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.803689 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gp6lq"]
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.804063 4756 scope.go:117] "RemoveContainer" containerID="25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.844922 4756 scope.go:117] "RemoveContainer" containerID="6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb"
Dec 05 18:14:25 crc kubenswrapper[4756]: E1205 18:14:25.845869 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb\": container with ID starting with 6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb not found: ID does not exist" containerID="6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.845915 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb"} err="failed to get container status \"6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb\": rpc error: code = NotFound desc = could not find container \"6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb\": container with ID starting with 6a78119b7e843bf34100495a53df1b36eac707dc5b9175d75d012253ab57c6cb not found: ID does not exist"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.845942 4756 scope.go:117] "RemoveContainer" containerID="6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac"
Dec 05 18:14:25 crc kubenswrapper[4756]: E1205 18:14:25.847930 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac\": container with ID starting with 6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac not found: ID does not exist" containerID="6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.847963 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac"} err="failed to get container status \"6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac\": rpc error: code = NotFound desc = could not find container \"6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac\": container with ID starting with 6c32e5338106e8d8f845a7472ffece30b348daee85c8cfc21526557be1ea8cac not found: ID does not exist"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.847983 4756 scope.go:117] "RemoveContainer" containerID="25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1"
Dec 05 18:14:25 crc kubenswrapper[4756]: E1205 18:14:25.848262 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1\": container with ID starting with 25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1 not found: ID does not exist" containerID="25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1"
Dec 05 18:14:25 crc kubenswrapper[4756]: I1205 18:14:25.848307 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1"} err="failed to get container status \"25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1\": rpc error: code = NotFound desc = could not find container \"25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1\": container with ID starting with 25eb509308542d8cf40516aae7283a7b9da731be45b714cae156133e4700b5a1 not found: ID does not exist"
Dec 05 18:14:26 crc kubenswrapper[4756]: I1205 18:14:26.070884 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" path="/var/lib/kubelet/pods/2d8d47c0-a676-45eb-aece-9fde9a846f84/volumes"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.204122 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"]
Dec 05 18:15:00 crc kubenswrapper[4756]: E1205 18:15:00.205303 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerName="extract-content"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.205322 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerName="extract-content"
Dec 05 18:15:00 crc kubenswrapper[4756]: E1205 18:15:00.205351 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerName="extract-utilities"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.205360 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerName="extract-utilities"
Dec 05 18:15:00 crc kubenswrapper[4756]: E1205 18:15:00.205395 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerName="extract-content"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.205405 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerName="extract-content"
Dec 05 18:15:00 crc kubenswrapper[4756]: E1205 18:15:00.205441 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerName="registry-server"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.205449 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerName="registry-server"
Dec 05 18:15:00 crc kubenswrapper[4756]: E1205 18:15:00.205477 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerName="registry-server"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.205484 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerName="registry-server"
Dec 05 18:15:00 crc kubenswrapper[4756]: E1205 18:15:00.205501 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerName="extract-utilities"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.205510 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerName="extract-utilities"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.205817 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f830196a-2a13-4b70-97b6-db8ff60fa2e7" containerName="registry-server"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.205841 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8d47c0-a676-45eb-aece-9fde9a846f84" containerName="registry-server"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.206839 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.211267 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.211593 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.224091 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"]
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.364286 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f7b0c7b-38ad-488b-9561-7c44304160c2-config-volume\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.364438 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4hbk\" (UniqueName: \"kubernetes.io/projected/5f7b0c7b-38ad-488b-9561-7c44304160c2-kube-api-access-x4hbk\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.364844 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f7b0c7b-38ad-488b-9561-7c44304160c2-secret-volume\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.466501 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f7b0c7b-38ad-488b-9561-7c44304160c2-config-volume\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.466614 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4hbk\" (UniqueName: \"kubernetes.io/projected/5f7b0c7b-38ad-488b-9561-7c44304160c2-kube-api-access-x4hbk\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.466657 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f7b0c7b-38ad-488b-9561-7c44304160c2-secret-volume\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.467269 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f7b0c7b-38ad-488b-9561-7c44304160c2-config-volume\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.472992 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f7b0c7b-38ad-488b-9561-7c44304160c2-secret-volume\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.495628 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4hbk\" (UniqueName: \"kubernetes.io/projected/5f7b0c7b-38ad-488b-9561-7c44304160c2-kube-api-access-x4hbk\") pod \"collect-profiles-29415975-lng7z\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:00 crc kubenswrapper[4756]: I1205 18:15:00.538886 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:01 crc kubenswrapper[4756]: I1205 18:15:01.030172 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"]
Dec 05 18:15:01 crc kubenswrapper[4756]: W1205 18:15:01.045577 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f7b0c7b_38ad_488b_9561_7c44304160c2.slice/crio-bf0306ba61eb642bc215c9ab9edc09541582a9fedac0ec288514db1181f28cf0 WatchSource:0}: Error finding container bf0306ba61eb642bc215c9ab9edc09541582a9fedac0ec288514db1181f28cf0: Status 404 returned error can't find the container with id bf0306ba61eb642bc215c9ab9edc09541582a9fedac0ec288514db1181f28cf0
Dec 05 18:15:01 crc kubenswrapper[4756]: I1205 18:15:01.185937 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z" event={"ID":"5f7b0c7b-38ad-488b-9561-7c44304160c2","Type":"ContainerStarted","Data":"bf0306ba61eb642bc215c9ab9edc09541582a9fedac0ec288514db1181f28cf0"}
Dec 05 18:15:02 crc kubenswrapper[4756]: I1205 18:15:02.214920 4756 generic.go:334] "Generic (PLEG): container finished" podID="5f7b0c7b-38ad-488b-9561-7c44304160c2" containerID="20d997a5255c2ad47b02407528d6e6a7c5eeac8f8d564a6d13604cc590dd6f25" exitCode=0
Dec 05 18:15:02 crc kubenswrapper[4756]: I1205 18:15:02.214981 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z" event={"ID":"5f7b0c7b-38ad-488b-9561-7c44304160c2","Type":"ContainerDied","Data":"20d997a5255c2ad47b02407528d6e6a7c5eeac8f8d564a6d13604cc590dd6f25"}
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.623659 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.745456 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f7b0c7b-38ad-488b-9561-7c44304160c2-config-volume\") pod \"5f7b0c7b-38ad-488b-9561-7c44304160c2\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") "
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.745712 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f7b0c7b-38ad-488b-9561-7c44304160c2-secret-volume\") pod \"5f7b0c7b-38ad-488b-9561-7c44304160c2\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") "
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.745970 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4hbk\" (UniqueName: \"kubernetes.io/projected/5f7b0c7b-38ad-488b-9561-7c44304160c2-kube-api-access-x4hbk\") pod \"5f7b0c7b-38ad-488b-9561-7c44304160c2\" (UID: \"5f7b0c7b-38ad-488b-9561-7c44304160c2\") "
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.746440 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7b0c7b-38ad-488b-9561-7c44304160c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "5f7b0c7b-38ad-488b-9561-7c44304160c2" (UID: "5f7b0c7b-38ad-488b-9561-7c44304160c2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.746891 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f7b0c7b-38ad-488b-9561-7c44304160c2-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.752738 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7b0c7b-38ad-488b-9561-7c44304160c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5f7b0c7b-38ad-488b-9561-7c44304160c2" (UID: "5f7b0c7b-38ad-488b-9561-7c44304160c2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.752785 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7b0c7b-38ad-488b-9561-7c44304160c2-kube-api-access-x4hbk" (OuterVolumeSpecName: "kube-api-access-x4hbk") pod "5f7b0c7b-38ad-488b-9561-7c44304160c2" (UID: "5f7b0c7b-38ad-488b-9561-7c44304160c2"). InnerVolumeSpecName "kube-api-access-x4hbk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.849127 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f7b0c7b-38ad-488b-9561-7c44304160c2-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 18:15:03 crc kubenswrapper[4756]: I1205 18:15:03.849160 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4hbk\" (UniqueName: \"kubernetes.io/projected/5f7b0c7b-38ad-488b-9561-7c44304160c2-kube-api-access-x4hbk\") on node \"crc\" DevicePath \"\""
Dec 05 18:15:04 crc kubenswrapper[4756]: I1205 18:15:04.243107 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z" event={"ID":"5f7b0c7b-38ad-488b-9561-7c44304160c2","Type":"ContainerDied","Data":"bf0306ba61eb642bc215c9ab9edc09541582a9fedac0ec288514db1181f28cf0"}
Dec 05 18:15:04 crc kubenswrapper[4756]: I1205 18:15:04.243162 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf0306ba61eb642bc215c9ab9edc09541582a9fedac0ec288514db1181f28cf0"
Dec 05 18:15:04 crc kubenswrapper[4756]: I1205 18:15:04.243243 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"
Dec 05 18:15:04 crc kubenswrapper[4756]: I1205 18:15:04.712812 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp"]
Dec 05 18:15:04 crc kubenswrapper[4756]: I1205 18:15:04.721725 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415930-nzhkp"]
Dec 05 18:15:06 crc kubenswrapper[4756]: I1205 18:15:06.073993 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10262f71-e744-4bd2-9a0a-0645feca3a76" path="/var/lib/kubelet/pods/10262f71-e744-4bd2-9a0a-0645feca3a76/volumes"
Dec 05 18:15:18 crc kubenswrapper[4756]: I1205 18:15:18.276691 4756 scope.go:117] "RemoveContainer" containerID="19646591add41f3b387f29aa12e936e59ae3ed7296ea1fa7a249eb04af334055"
Dec 05 18:15:22 crc kubenswrapper[4756]: I1205 18:15:22.039773 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-1320-account-create-update-ngrp7"]
Dec 05 18:15:22 crc kubenswrapper[4756]: I1205 18:15:22.064501 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-z2bjx"]
Dec 05 18:15:22 crc kubenswrapper[4756]: I1205 18:15:22.064782 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-z2bjx"]
Dec 05 18:15:22 crc kubenswrapper[4756]: I1205 18:15:22.073615 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-1320-account-create-update-ngrp7"]
Dec 05 18:15:22 crc kubenswrapper[4756]: I1205 18:15:22.295311 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:15:22 crc kubenswrapper[4756]: I1205 18:15:22.295843 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:15:24 crc kubenswrapper[4756]: I1205 18:15:24.065696 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0839cf88-fa43-437b-b982-11ceb1c10581" path="/var/lib/kubelet/pods/0839cf88-fa43-437b-b982-11ceb1c10581/volumes"
Dec 05 18:15:24 crc kubenswrapper[4756]: I1205 18:15:24.066636 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b47b3cd5-11a6-414f-822e-941e222f3b44" path="/var/lib/kubelet/pods/b47b3cd5-11a6-414f-822e-941e222f3b44/volumes"
Dec 05 18:15:37 crc kubenswrapper[4756]: I1205 18:15:37.067303 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-k9mdr"]
Dec 05 18:15:37 crc kubenswrapper[4756]: I1205 18:15:37.084269 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-k9mdr"]
Dec 05 18:15:38 crc kubenswrapper[4756]: I1205 18:15:38.060844 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbbe16c2-80cb-411f-b3b9-fd45a440c956" path="/var/lib/kubelet/pods/fbbe16c2-80cb-411f-b3b9-fd45a440c956/volumes"
Dec 05 18:15:52 crc kubenswrapper[4756]: I1205 18:15:52.295569 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:15:52 crc kubenswrapper[4756]: I1205 18:15:52.296645 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:16:18 crc kubenswrapper[4756]: I1205 18:16:18.404949 4756 scope.go:117] "RemoveContainer" containerID="45f2829d6cc4266cb29ec3188d3be04fae0197efef27e58fa6eeecb130fc71a2" Dec 05 18:16:18 crc kubenswrapper[4756]: I1205 18:16:18.447370 4756 scope.go:117] "RemoveContainer" containerID="5b4c714c78cccf810542a7c208b283ff5de91b2669f18cf66b2445b89188aa48" Dec 05 18:16:18 crc kubenswrapper[4756]: I1205 18:16:18.551752 4756 scope.go:117] "RemoveContainer" containerID="95f65d4a2e40626845a8ce515e74c1ca1f5bd50def7427a4828f33a12a2a29d3" Dec 05 18:16:22 crc kubenswrapper[4756]: I1205 18:16:22.299096 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:16:22 crc kubenswrapper[4756]: I1205 18:16:22.299545 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:16:22 crc kubenswrapper[4756]: I1205 18:16:22.299592 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 18:16:22 crc kubenswrapper[4756]: I1205 18:16:22.300388 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 18:16:22 crc kubenswrapper[4756]: I1205 18:16:22.300450 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" gracePeriod=600 Dec 05 18:16:22 crc kubenswrapper[4756]: E1205 18:16:22.428766 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:16:23 crc kubenswrapper[4756]: I1205 18:16:23.185276 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" exitCode=0 Dec 05 18:16:23 crc kubenswrapper[4756]: I1205 18:16:23.185527 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19"} Dec 05 18:16:23 crc kubenswrapper[4756]: I1205 18:16:23.185821 4756 scope.go:117] "RemoveContainer" containerID="5afb3dc90120f4e9e53bde4a5bc50b142a7b2e00719706caa44fbf69f6e97832" Dec 05 18:16:23 crc kubenswrapper[4756]: I1205 18:16:23.186943 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:16:23 crc kubenswrapper[4756]: E1205 18:16:23.187411 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:16:37 crc kubenswrapper[4756]: I1205 18:16:37.049123 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:16:37 crc kubenswrapper[4756]: E1205 18:16:37.050653 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:16:49 crc kubenswrapper[4756]: I1205 18:16:49.049736 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:16:49 crc kubenswrapper[4756]: E1205 18:16:49.050510 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:17:00 crc kubenswrapper[4756]: I1205 18:17:00.070889 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:17:00 crc kubenswrapper[4756]: E1205 18:17:00.072084 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:17:13 crc kubenswrapper[4756]: I1205 18:17:13.050318 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:17:13 crc 
kubenswrapper[4756]: E1205 18:17:13.051440 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:17:25 crc kubenswrapper[4756]: I1205 18:17:25.050314 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:17:25 crc kubenswrapper[4756]: E1205 18:17:25.051102 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:17:38 crc kubenswrapper[4756]: I1205 18:17:38.051490 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:17:38 crc kubenswrapper[4756]: E1205 18:17:38.052381 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:17:50 crc kubenswrapper[4756]: I1205 18:17:50.073768 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:17:50 crc kubenswrapper[4756]: E1205 18:17:50.075553 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:18:04 crc kubenswrapper[4756]: I1205 18:18:04.050624 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:18:04 crc kubenswrapper[4756]: E1205 18:18:04.051265 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:18:19 crc kubenswrapper[4756]: I1205 18:18:19.050196 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:18:19 crc kubenswrapper[4756]: E1205 18:18:19.052823 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:18:30 crc kubenswrapper[4756]: I1205 18:18:30.060263 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:18:30 crc kubenswrapper[4756]: E1205 18:18:30.062517 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:18:41 crc kubenswrapper[4756]: I1205 18:18:41.049873 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:18:41 crc kubenswrapper[4756]: E1205 18:18:41.051310 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:18:55 crc kubenswrapper[4756]: I1205 18:18:55.050159 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:18:55 crc kubenswrapper[4756]: E1205 18:18:55.050972 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:19:09 crc kubenswrapper[4756]: I1205 18:19:09.050304 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:19:09 crc kubenswrapper[4756]: E1205 18:19:09.052057 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:19:21 crc kubenswrapper[4756]: I1205 18:19:21.051147 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:19:21 crc kubenswrapper[4756]: E1205 18:19:21.052280 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.421533 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gtdzc"] Dec 05 18:19:23 crc kubenswrapper[4756]: E1205 18:19:23.424090 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f7b0c7b-38ad-488b-9561-7c44304160c2" containerName="collect-profiles" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.424293 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7b0c7b-38ad-488b-9561-7c44304160c2" containerName="collect-profiles" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.424928 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f7b0c7b-38ad-488b-9561-7c44304160c2" containerName="collect-profiles" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.428279 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.437511 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gtdzc"] Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.620877 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-catalog-content\") pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.620962 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-utilities\") pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.621783 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22m5\" (UniqueName: \"kubernetes.io/projected/e5077268-05c3-4ad4-a230-1cc82b27fefa-kube-api-access-t22m5\") pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.724035 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-catalog-content\") pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.724190 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-utilities\") pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.724368 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22m5\" (UniqueName: \"kubernetes.io/projected/e5077268-05c3-4ad4-a230-1cc82b27fefa-kube-api-access-t22m5\") 
pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.724685 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-catalog-content\") pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.724735 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-utilities\") pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:23 crc kubenswrapper[4756]: I1205 18:19:23.762356 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22m5\" (UniqueName: \"kubernetes.io/projected/e5077268-05c3-4ad4-a230-1cc82b27fefa-kube-api-access-t22m5\") pod \"certified-operators-gtdzc\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:24 crc kubenswrapper[4756]: I1205 18:19:24.058587 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:24 crc kubenswrapper[4756]: I1205 18:19:24.554666 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gtdzc"] Dec 05 18:19:25 crc kubenswrapper[4756]: I1205 18:19:25.325799 4756 generic.go:334] "Generic (PLEG): container finished" podID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerID="13d5a7db838b7a069795a01daea5086cda5be2422825d0c369b85ad5b33056ca" exitCode=0 Dec 05 18:19:25 crc kubenswrapper[4756]: I1205 18:19:25.325884 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdzc" event={"ID":"e5077268-05c3-4ad4-a230-1cc82b27fefa","Type":"ContainerDied","Data":"13d5a7db838b7a069795a01daea5086cda5be2422825d0c369b85ad5b33056ca"} Dec 05 18:19:25 crc kubenswrapper[4756]: I1205 18:19:25.326164 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdzc" event={"ID":"e5077268-05c3-4ad4-a230-1cc82b27fefa","Type":"ContainerStarted","Data":"09e39c2bf89129517bce48aacfa6cd196e4dedafc07d3f09fceb0c9ea8ae1e06"} Dec 05 18:19:25 crc kubenswrapper[4756]: I1205 18:19:25.328098 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 18:19:27 crc kubenswrapper[4756]: I1205 18:19:27.348395 4756 generic.go:334] "Generic (PLEG): container finished" podID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerID="20c40e9b39e67924f836bf64bc55e0e4d3e801d0fb1d9daac6de9e5ec5969f45" exitCode=0 Dec 05 18:19:27 crc kubenswrapper[4756]: I1205 18:19:27.348980 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdzc" event={"ID":"e5077268-05c3-4ad4-a230-1cc82b27fefa","Type":"ContainerDied","Data":"20c40e9b39e67924f836bf64bc55e0e4d3e801d0fb1d9daac6de9e5ec5969f45"} Dec 05 18:19:29 crc kubenswrapper[4756]: I1205 18:19:29.378208 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdzc" 
event={"ID":"e5077268-05c3-4ad4-a230-1cc82b27fefa","Type":"ContainerStarted","Data":"c5bc24b0a6aa4d36027e0ac1ccd1d41efdb6a53e60c4785a1f5a4015ad3152fd"} Dec 05 18:19:33 crc kubenswrapper[4756]: I1205 18:19:33.050451 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:19:33 crc kubenswrapper[4756]: E1205 18:19:33.051762 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:19:34 crc kubenswrapper[4756]: I1205 18:19:34.073175 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:34 crc kubenswrapper[4756]: I1205 18:19:34.073250 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:34 crc kubenswrapper[4756]: I1205 18:19:34.155747 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:34 crc kubenswrapper[4756]: I1205 18:19:34.190775 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gtdzc" podStartSLOduration=7.964981321 podStartE2EDuration="11.190750323s" podCreationTimestamp="2025-12-05 18:19:23 +0000 UTC" firstStartedPulling="2025-12-05 18:19:25.327822106 +0000 UTC m=+7425.566495608" lastFinishedPulling="2025-12-05 18:19:28.553591068 +0000 UTC m=+7428.792264610" observedRunningTime="2025-12-05 18:19:29.401161397 +0000 UTC m=+7429.639834909" watchObservedRunningTime="2025-12-05 18:19:34.190750323 +0000 UTC m=+7434.429423865" Dec 05 18:19:34 crc kubenswrapper[4756]: I1205 18:19:34.491980 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:34 crc kubenswrapper[4756]: I1205 18:19:34.549969 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gtdzc"] Dec 05 18:19:36 crc kubenswrapper[4756]: I1205 18:19:36.464454 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gtdzc" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerName="registry-server" containerID="cri-o://c5bc24b0a6aa4d36027e0ac1ccd1d41efdb6a53e60c4785a1f5a4015ad3152fd" gracePeriod=2 Dec 05 18:19:37 crc kubenswrapper[4756]: E1205 18:19:37.069622 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5077268_05c3_4ad4_a230_1cc82b27fefa.slice/crio-conmon-c5bc24b0a6aa4d36027e0ac1ccd1d41efdb6a53e60c4785a1f5a4015ad3152fd.scope\": RecentStats: unable to find data in memory cache]" Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.482367 4756 generic.go:334] "Generic (PLEG): container finished" podID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerID="c5bc24b0a6aa4d36027e0ac1ccd1d41efdb6a53e60c4785a1f5a4015ad3152fd" exitCode=0 Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.482441 4756 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-gtdzc" event={"ID":"e5077268-05c3-4ad4-a230-1cc82b27fefa","Type":"ContainerDied","Data":"c5bc24b0a6aa4d36027e0ac1ccd1d41efdb6a53e60c4785a1f5a4015ad3152fd"} Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.743972 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.840710 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-catalog-content\") pod \"e5077268-05c3-4ad4-a230-1cc82b27fefa\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.840834 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-utilities\") pod \"e5077268-05c3-4ad4-a230-1cc82b27fefa\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.840964 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t22m5\" (UniqueName: \"kubernetes.io/projected/e5077268-05c3-4ad4-a230-1cc82b27fefa-kube-api-access-t22m5\") pod \"e5077268-05c3-4ad4-a230-1cc82b27fefa\" (UID: \"e5077268-05c3-4ad4-a230-1cc82b27fefa\") " Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.842191 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-utilities" (OuterVolumeSpecName: "utilities") pod "e5077268-05c3-4ad4-a230-1cc82b27fefa" (UID: "e5077268-05c3-4ad4-a230-1cc82b27fefa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.850742 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5077268-05c3-4ad4-a230-1cc82b27fefa-kube-api-access-t22m5" (OuterVolumeSpecName: "kube-api-access-t22m5") pod "e5077268-05c3-4ad4-a230-1cc82b27fefa" (UID: "e5077268-05c3-4ad4-a230-1cc82b27fefa"). InnerVolumeSpecName "kube-api-access-t22m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.906974 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5077268-05c3-4ad4-a230-1cc82b27fefa" (UID: "e5077268-05c3-4ad4-a230-1cc82b27fefa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.944058 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.944109 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5077268-05c3-4ad4-a230-1cc82b27fefa-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:19:37 crc kubenswrapper[4756]: I1205 18:19:37.944134 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t22m5\" (UniqueName: \"kubernetes.io/projected/e5077268-05c3-4ad4-a230-1cc82b27fefa-kube-api-access-t22m5\") on node \"crc\" DevicePath \"\"" Dec 05 18:19:38 crc kubenswrapper[4756]: I1205 18:19:38.502917 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdzc" event={"ID":"e5077268-05c3-4ad4-a230-1cc82b27fefa","Type":"ContainerDied","Data":"09e39c2bf89129517bce48aacfa6cd196e4dedafc07d3f09fceb0c9ea8ae1e06"} Dec 05 18:19:38 crc kubenswrapper[4756]: I1205 18:19:38.503010 4756 scope.go:117] "RemoveContainer" containerID="c5bc24b0a6aa4d36027e0ac1ccd1d41efdb6a53e60c4785a1f5a4015ad3152fd" Dec 05 18:19:38 crc kubenswrapper[4756]: I1205 18:19:38.503035 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtdzc" Dec 05 18:19:38 crc kubenswrapper[4756]: I1205 18:19:38.547149 4756 scope.go:117] "RemoveContainer" containerID="20c40e9b39e67924f836bf64bc55e0e4d3e801d0fb1d9daac6de9e5ec5969f45" Dec 05 18:19:38 crc kubenswrapper[4756]: I1205 18:19:38.551412 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gtdzc"] Dec 05 18:19:38 crc kubenswrapper[4756]: I1205 18:19:38.566564 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gtdzc"] Dec 05 18:19:38 crc kubenswrapper[4756]: I1205 18:19:38.593596 4756 scope.go:117] "RemoveContainer" containerID="13d5a7db838b7a069795a01daea5086cda5be2422825d0c369b85ad5b33056ca" Dec 05 18:19:40 crc kubenswrapper[4756]: I1205 18:19:40.067542 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" path="/var/lib/kubelet/pods/e5077268-05c3-4ad4-a230-1cc82b27fefa/volumes" Dec 05 18:19:48 crc kubenswrapper[4756]: I1205 18:19:48.050366 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:19:48 crc kubenswrapper[4756]: E1205 18:19:48.052079 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:19:53 crc kubenswrapper[4756]: I1205 18:19:53.679716 4756 generic.go:334] "Generic (PLEG): container finished" podID="9f633ab6-c240-4b57-8941-d8dc399f4a84" containerID="4dadf9dbaa2cacf865fe38d1047c092a6ace3ae8472f1a5c43c0c90fdc72f101" exitCode=0 Dec 05 18:19:53 crc kubenswrapper[4756]: I1205 18:19:53.679825 4756 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" event={"ID":"9f633ab6-c240-4b57-8941-d8dc399f4a84","Type":"ContainerDied","Data":"4dadf9dbaa2cacf865fe38d1047c092a6ace3ae8472f1a5c43c0c90fdc72f101"} Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.210622 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.374074 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h729t\" (UniqueName: \"kubernetes.io/projected/9f633ab6-c240-4b57-8941-d8dc399f4a84-kube-api-access-h729t\") pod \"9f633ab6-c240-4b57-8941-d8dc399f4a84\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.374301 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-tripleo-cleanup-combined-ca-bundle\") pod \"9f633ab6-c240-4b57-8941-d8dc399f4a84\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.374381 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-inventory\") pod \"9f633ab6-c240-4b57-8941-d8dc399f4a84\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.374567 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-ssh-key\") pod \"9f633ab6-c240-4b57-8941-d8dc399f4a84\" (UID: \"9f633ab6-c240-4b57-8941-d8dc399f4a84\") " Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.379742 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "9f633ab6-c240-4b57-8941-d8dc399f4a84" (UID: "9f633ab6-c240-4b57-8941-d8dc399f4a84"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.384114 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f633ab6-c240-4b57-8941-d8dc399f4a84-kube-api-access-h729t" (OuterVolumeSpecName: "kube-api-access-h729t") pod "9f633ab6-c240-4b57-8941-d8dc399f4a84" (UID: "9f633ab6-c240-4b57-8941-d8dc399f4a84"). InnerVolumeSpecName "kube-api-access-h729t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.405976 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f633ab6-c240-4b57-8941-d8dc399f4a84" (UID: "9f633ab6-c240-4b57-8941-d8dc399f4a84"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.413573 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-inventory" (OuterVolumeSpecName: "inventory") pod "9f633ab6-c240-4b57-8941-d8dc399f4a84" (UID: "9f633ab6-c240-4b57-8941-d8dc399f4a84"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.477827 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h729t\" (UniqueName: \"kubernetes.io/projected/9f633ab6-c240-4b57-8941-d8dc399f4a84-kube-api-access-h729t\") on node \"crc\" DevicePath \"\"" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.477879 4756 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.477901 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.477918 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f633ab6-c240-4b57-8941-d8dc399f4a84-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.709452 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" event={"ID":"9f633ab6-c240-4b57-8941-d8dc399f4a84","Type":"ContainerDied","Data":"4b7315e07424c91e2b8578870176e721784115a12e17d08f4c2634bd66841080"} Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.709937 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7315e07424c91e2b8578870176e721784115a12e17d08f4c2634bd66841080" Dec 05 18:19:55 crc kubenswrapper[4756]: I1205 18:19:55.709560 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v" Dec 05 18:19:59 crc kubenswrapper[4756]: I1205 18:19:59.050439 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:19:59 crc kubenswrapper[4756]: E1205 18:19:59.051578 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.038986 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-59lvc"] Dec 05 18:20:07 crc kubenswrapper[4756]: E1205 18:20:07.040024 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f633ab6-c240-4b57-8941-d8dc399f4a84" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.040043 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f633ab6-c240-4b57-8941-d8dc399f4a84" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 05 18:20:07 crc kubenswrapper[4756]: E1205 18:20:07.040082 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerName="extract-utilities" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.040092 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerName="extract-utilities" Dec 05 18:20:07 crc kubenswrapper[4756]: E1205 18:20:07.040131 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerName="registry-server" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.040140 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerName="registry-server" Dec 05 18:20:07 crc kubenswrapper[4756]: E1205 18:20:07.040156 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerName="extract-content" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.040164 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerName="extract-content" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.040512 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5077268-05c3-4ad4-a230-1cc82b27fefa" containerName="registry-server" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.040549 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f633ab6-c240-4b57-8941-d8dc399f4a84" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.041485 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.045529 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.045531 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.045940 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.048072 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.056379 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-59lvc"] Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.159653 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.159739 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.160020 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv98b\" (UniqueName: \"kubernetes.io/projected/c2204483-1509-4302-84cc-15270057dda4-kube-api-access-zv98b\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.160067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-inventory\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.262077 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.262188 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 
crc kubenswrapper[4756]: I1205 18:20:07.262265 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv98b\" (UniqueName: \"kubernetes.io/projected/c2204483-1509-4302-84cc-15270057dda4-kube-api-access-zv98b\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.262307 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-inventory\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.268594 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-inventory\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.269239 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.271280 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.278759 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv98b\" (UniqueName: \"kubernetes.io/projected/c2204483-1509-4302-84cc-15270057dda4-kube-api-access-zv98b\") pod \"bootstrap-openstack-openstack-cell1-59lvc\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.368201 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:20:07 crc kubenswrapper[4756]: I1205 18:20:07.937750 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-59lvc"] Dec 05 18:20:07 crc kubenswrapper[4756]: W1205 18:20:07.940925 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2204483_1509_4302_84cc_15270057dda4.slice/crio-8754cdf6fac28d09b86792b7ae445a1493843f4ffd220782c2a033c0a6334eec WatchSource:0}: Error finding container 8754cdf6fac28d09b86792b7ae445a1493843f4ffd220782c2a033c0a6334eec: Status 404 returned error can't find the container with id 8754cdf6fac28d09b86792b7ae445a1493843f4ffd220782c2a033c0a6334eec Dec 05 18:20:08 crc kubenswrapper[4756]: I1205 18:20:08.892700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" event={"ID":"c2204483-1509-4302-84cc-15270057dda4","Type":"ContainerStarted","Data":"8754cdf6fac28d09b86792b7ae445a1493843f4ffd220782c2a033c0a6334eec"} Dec 05 18:20:09 crc kubenswrapper[4756]: I1205 18:20:09.904480 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" event={"ID":"c2204483-1509-4302-84cc-15270057dda4","Type":"ContainerStarted","Data":"0214fb03651554bb59595eab798f1034d90c373ee20da7f29bdb4f29d28e2317"} Dec 05 18:20:09 crc kubenswrapper[4756]: I1205 18:20:09.935252 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" podStartSLOduration=2.102218828 podStartE2EDuration="2.935234616s" podCreationTimestamp="2025-12-05 18:20:07 +0000 UTC" firstStartedPulling="2025-12-05 18:20:07.945321006 +0000 UTC m=+7468.183994528" lastFinishedPulling="2025-12-05 18:20:08.778336804 +0000 UTC m=+7469.017010316" observedRunningTime="2025-12-05 18:20:09.926405939 +0000 UTC m=+7470.165079511" watchObservedRunningTime="2025-12-05 18:20:09.935234616 +0000 UTC m=+7470.173908128" Dec 05 18:20:11 crc kubenswrapper[4756]: I1205 18:20:11.049994 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:20:11 crc kubenswrapper[4756]: E1205 18:20:11.051757 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:20:24 crc kubenswrapper[4756]: I1205 18:20:24.051573 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:20:24 crc kubenswrapper[4756]: E1205 18:20:24.053226 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:20:35 crc kubenswrapper[4756]: I1205 18:20:35.050642 4756 scope.go:117] "RemoveContainer" 
containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:20:35 crc kubenswrapper[4756]: E1205 18:20:35.052357 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:20:46 crc kubenswrapper[4756]: I1205 18:20:46.050156 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:20:46 crc kubenswrapper[4756]: E1205 18:20:46.051019 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:21:00 crc kubenswrapper[4756]: I1205 18:21:00.069270 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:21:00 crc kubenswrapper[4756]: E1205 18:21:00.070467 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:21:13 crc kubenswrapper[4756]: I1205 18:21:13.051598 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:21:13 crc kubenswrapper[4756]: E1205 18:21:13.052863 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:21:26 crc kubenswrapper[4756]: I1205 18:21:26.049676 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:21:26 crc kubenswrapper[4756]: I1205 18:21:26.788036 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"298b99a53eaa71e556883bbd5801e9357223f1284c428a6c6e79def0181cd2ac"} Dec 05 18:22:41 crc kubenswrapper[4756]: I1205 18:22:41.920376 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hxpcx"] Dec 05 18:22:41 crc kubenswrapper[4756]: I1205 18:22:41.926802 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:41 crc kubenswrapper[4756]: I1205 18:22:41.945851 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hxpcx"] Dec 05 18:22:41 crc kubenswrapper[4756]: I1205 18:22:41.986644 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6mgk\" (UniqueName: \"kubernetes.io/projected/a758fd19-c366-4a42-bd99-f9cfe48fdce8-kube-api-access-t6mgk\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:41 crc kubenswrapper[4756]: I1205 18:22:41.986721 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-catalog-content\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:41 crc kubenswrapper[4756]: I1205 18:22:41.986795 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-utilities\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:42 crc kubenswrapper[4756]: I1205 18:22:42.088782 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6mgk\" (UniqueName: \"kubernetes.io/projected/a758fd19-c366-4a42-bd99-f9cfe48fdce8-kube-api-access-t6mgk\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:42 crc kubenswrapper[4756]: I1205 18:22:42.088891 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-catalog-content\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:42 crc kubenswrapper[4756]: I1205 18:22:42.089012 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-utilities\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:42 crc kubenswrapper[4756]: I1205 18:22:42.090828 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-catalog-content\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:42 crc kubenswrapper[4756]: I1205 18:22:42.091677 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-utilities\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:42 crc kubenswrapper[4756]: I1205 18:22:42.111877 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t6mgk\" (UniqueName: \"kubernetes.io/projected/a758fd19-c366-4a42-bd99-f9cfe48fdce8-kube-api-access-t6mgk\") pod \"redhat-operators-hxpcx\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:42 crc kubenswrapper[4756]: I1205 18:22:42.257754 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:42 crc kubenswrapper[4756]: I1205 18:22:42.783182 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hxpcx"] Dec 05 18:22:42 crc kubenswrapper[4756]: W1205 18:22:42.800012 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda758fd19_c366_4a42_bd99_f9cfe48fdce8.slice/crio-0d7e0877bd84026981d83cfc202a696d05fc40b4a8d83cc34d3031155d83d922 WatchSource:0}: Error finding container 0d7e0877bd84026981d83cfc202a696d05fc40b4a8d83cc34d3031155d83d922: Status 404 returned error can't find the container with id 0d7e0877bd84026981d83cfc202a696d05fc40b4a8d83cc34d3031155d83d922 Dec 05 18:22:43 crc kubenswrapper[4756]: I1205 18:22:43.737559 4756 generic.go:334] "Generic (PLEG): container finished" podID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerID="4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1" exitCode=0 Dec 05 18:22:43 crc kubenswrapper[4756]: I1205 18:22:43.737643 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxpcx" event={"ID":"a758fd19-c366-4a42-bd99-f9cfe48fdce8","Type":"ContainerDied","Data":"4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1"} Dec 05 18:22:43 crc kubenswrapper[4756]: I1205 18:22:43.737985 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxpcx" event={"ID":"a758fd19-c366-4a42-bd99-f9cfe48fdce8","Type":"ContainerStarted","Data":"0d7e0877bd84026981d83cfc202a696d05fc40b4a8d83cc34d3031155d83d922"} Dec 05 18:22:44 crc kubenswrapper[4756]: I1205 18:22:44.750456 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxpcx" event={"ID":"a758fd19-c366-4a42-bd99-f9cfe48fdce8","Type":"ContainerStarted","Data":"149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28"} Dec 05 18:22:49 crc kubenswrapper[4756]: I1205 18:22:49.806236 4756 generic.go:334] "Generic (PLEG): container finished" podID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerID="149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28" exitCode=0 Dec 05 18:22:49 crc kubenswrapper[4756]: I1205 18:22:49.806354 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxpcx" event={"ID":"a758fd19-c366-4a42-bd99-f9cfe48fdce8","Type":"ContainerDied","Data":"149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28"} Dec 05 18:22:50 crc kubenswrapper[4756]: I1205 18:22:50.822897 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxpcx" event={"ID":"a758fd19-c366-4a42-bd99-f9cfe48fdce8","Type":"ContainerStarted","Data":"09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26"} Dec 05 18:22:50 crc kubenswrapper[4756]: I1205 18:22:50.859161 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hxpcx" podStartSLOduration=3.390816975 podStartE2EDuration="9.859135678s" 
podCreationTimestamp="2025-12-05 18:22:41 +0000 UTC" firstStartedPulling="2025-12-05 18:22:43.742138683 +0000 UTC m=+7623.980812235" lastFinishedPulling="2025-12-05 18:22:50.210457396 +0000 UTC m=+7630.449130938" observedRunningTime="2025-12-05 18:22:50.852293224 +0000 UTC m=+7631.090966756" watchObservedRunningTime="2025-12-05 18:22:50.859135678 +0000 UTC m=+7631.097809230" Dec 05 18:22:52 crc kubenswrapper[4756]: I1205 18:22:52.257907 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:52 crc kubenswrapper[4756]: I1205 18:22:52.259656 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:22:53 crc kubenswrapper[4756]: I1205 18:22:53.324367 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hxpcx" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="registry-server" probeResult="failure" output=< Dec 05 18:22:53 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 18:22:53 crc kubenswrapper[4756]: > Dec 05 18:23:02 crc kubenswrapper[4756]: I1205 18:23:02.332283 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:23:02 crc kubenswrapper[4756]: I1205 18:23:02.406723 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:23:02 crc kubenswrapper[4756]: I1205 18:23:02.590109 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hxpcx"] Dec 05 18:23:03 crc kubenswrapper[4756]: I1205 18:23:03.991511 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hxpcx" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="registry-server" containerID="cri-o://09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26" gracePeriod=2 Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.562354 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.621569 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6mgk\" (UniqueName: \"kubernetes.io/projected/a758fd19-c366-4a42-bd99-f9cfe48fdce8-kube-api-access-t6mgk\") pod \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.621893 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-utilities\") pod \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.622173 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-catalog-content\") pod \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\" (UID: \"a758fd19-c366-4a42-bd99-f9cfe48fdce8\") " Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.623247 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-utilities" (OuterVolumeSpecName: "utilities") pod "a758fd19-c366-4a42-bd99-f9cfe48fdce8" (UID: "a758fd19-c366-4a42-bd99-f9cfe48fdce8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.636634 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a758fd19-c366-4a42-bd99-f9cfe48fdce8-kube-api-access-t6mgk" (OuterVolumeSpecName: "kube-api-access-t6mgk") pod "a758fd19-c366-4a42-bd99-f9cfe48fdce8" (UID: "a758fd19-c366-4a42-bd99-f9cfe48fdce8"). InnerVolumeSpecName "kube-api-access-t6mgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.726884 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.726945 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6mgk\" (UniqueName: \"kubernetes.io/projected/a758fd19-c366-4a42-bd99-f9cfe48fdce8-kube-api-access-t6mgk\") on node \"crc\" DevicePath \"\"" Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.755712 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a758fd19-c366-4a42-bd99-f9cfe48fdce8" (UID: "a758fd19-c366-4a42-bd99-f9cfe48fdce8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:23:04 crc kubenswrapper[4756]: I1205 18:23:04.828937 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758fd19-c366-4a42-bd99-f9cfe48fdce8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.010704 4756 generic.go:334] "Generic (PLEG): container finished" podID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerID="09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26" exitCode=0 Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.010798 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxpcx" event={"ID":"a758fd19-c366-4a42-bd99-f9cfe48fdce8","Type":"ContainerDied","Data":"09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26"} Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.010917 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxpcx" event={"ID":"a758fd19-c366-4a42-bd99-f9cfe48fdce8","Type":"ContainerDied","Data":"0d7e0877bd84026981d83cfc202a696d05fc40b4a8d83cc34d3031155d83d922"} Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.010827 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hxpcx" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.010968 4756 scope.go:117] "RemoveContainer" containerID="09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.072799 4756 scope.go:117] "RemoveContainer" containerID="149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.074223 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hxpcx"] Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.090775 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hxpcx"] Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.116818 4756 scope.go:117] "RemoveContainer" containerID="4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.158790 4756 scope.go:117] "RemoveContainer" containerID="09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26" Dec 05 18:23:05 crc kubenswrapper[4756]: E1205 18:23:05.159831 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26\": container with ID starting with 09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26 not found: ID does not exist" containerID="09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.159883 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26"} err="failed to get container status \"09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26\": rpc error: code = NotFound desc = could not find container \"09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26\": container with ID starting with 09dd69169cb27a840c9d683648f6511aca1e91b36797cf5ac7b09120d6a31e26 not found: ID does not exist" Dec 05 18:23:05 crc 
kubenswrapper[4756]: I1205 18:23:05.159935 4756 scope.go:117] "RemoveContainer" containerID="149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28" Dec 05 18:23:05 crc kubenswrapper[4756]: E1205 18:23:05.160712 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28\": container with ID starting with 149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28 not found: ID does not exist" containerID="149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.160748 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28"} err="failed to get container status \"149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28\": rpc error: code = NotFound desc = could not find container \"149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28\": container with ID starting with 149a98dc8f6f714e8450d0ecbacbf272070110863989d7e08ad87e8e9f320d28 not found: ID does not exist" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.161997 4756 scope.go:117] "RemoveContainer" containerID="4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1" Dec 05 18:23:05 crc kubenswrapper[4756]: E1205 18:23:05.162604 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1\": container with ID starting with 4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1 not found: ID does not exist" containerID="4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1" Dec 05 18:23:05 crc kubenswrapper[4756]: I1205 18:23:05.162634 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1"} err="failed to get container status \"4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1\": rpc error: code = NotFound desc = could not find container \"4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1\": container with ID starting with 4dddc7624c6d24d20c4c4fac7398a4484be624be894251aeb0f39dbaa13b8fe1 not found: ID does not exist" Dec 05 18:23:06 crc kubenswrapper[4756]: I1205 18:23:06.074096 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" path="/var/lib/kubelet/pods/a758fd19-c366-4a42-bd99-f9cfe48fdce8/volumes" Dec 05 18:23:24 crc kubenswrapper[4756]: I1205 18:23:24.250466 4756 generic.go:334] "Generic (PLEG): container finished" podID="c2204483-1509-4302-84cc-15270057dda4" containerID="0214fb03651554bb59595eab798f1034d90c373ee20da7f29bdb4f29d28e2317" exitCode=0 Dec 05 18:23:24 crc kubenswrapper[4756]: I1205 18:23:24.250601 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" event={"ID":"c2204483-1509-4302-84cc-15270057dda4","Type":"ContainerDied","Data":"0214fb03651554bb59595eab798f1034d90c373ee20da7f29bdb4f29d28e2317"} Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.769237 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.778067 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-bootstrap-combined-ca-bundle\") pod \"c2204483-1509-4302-84cc-15270057dda4\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.778442 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv98b\" (UniqueName: \"kubernetes.io/projected/c2204483-1509-4302-84cc-15270057dda4-kube-api-access-zv98b\") pod \"c2204483-1509-4302-84cc-15270057dda4\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.778517 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-inventory\") pod \"c2204483-1509-4302-84cc-15270057dda4\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.778584 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-ssh-key\") pod \"c2204483-1509-4302-84cc-15270057dda4\" (UID: \"c2204483-1509-4302-84cc-15270057dda4\") " Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.786145 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2204483-1509-4302-84cc-15270057dda4-kube-api-access-zv98b" (OuterVolumeSpecName: "kube-api-access-zv98b") pod "c2204483-1509-4302-84cc-15270057dda4" (UID: "c2204483-1509-4302-84cc-15270057dda4"). InnerVolumeSpecName "kube-api-access-zv98b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.787566 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c2204483-1509-4302-84cc-15270057dda4" (UID: "c2204483-1509-4302-84cc-15270057dda4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.846712 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c2204483-1509-4302-84cc-15270057dda4" (UID: "c2204483-1509-4302-84cc-15270057dda4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.873725 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-inventory" (OuterVolumeSpecName: "inventory") pod "c2204483-1509-4302-84cc-15270057dda4" (UID: "c2204483-1509-4302-84cc-15270057dda4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.885313 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.885353 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.885366 4756 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2204483-1509-4302-84cc-15270057dda4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:23:25 crc kubenswrapper[4756]: I1205 18:23:25.885383 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv98b\" (UniqueName: \"kubernetes.io/projected/c2204483-1509-4302-84cc-15270057dda4-kube-api-access-zv98b\") on node \"crc\" DevicePath \"\"" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.275545 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" event={"ID":"c2204483-1509-4302-84cc-15270057dda4","Type":"ContainerDied","Data":"8754cdf6fac28d09b86792b7ae445a1493843f4ffd220782c2a033c0a6334eec"} Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.276014 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8754cdf6fac28d09b86792b7ae445a1493843f4ffd220782c2a033c0a6334eec" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.275693 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-59lvc" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.398352 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-sz96x"] Dec 05 18:23:26 crc kubenswrapper[4756]: E1205 18:23:26.398954 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="extract-utilities" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.398978 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="extract-utilities" Dec 05 18:23:26 crc kubenswrapper[4756]: E1205 18:23:26.398994 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2204483-1509-4302-84cc-15270057dda4" containerName="bootstrap-openstack-openstack-cell1" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.399004 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2204483-1509-4302-84cc-15270057dda4" containerName="bootstrap-openstack-openstack-cell1" Dec 05 18:23:26 crc kubenswrapper[4756]: E1205 18:23:26.399022 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="extract-content" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.399030 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="extract-content" Dec 05 18:23:26 crc kubenswrapper[4756]: E1205 18:23:26.399058 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="registry-server" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.399066 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="registry-server" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.399312 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a758fd19-c366-4a42-bd99-f9cfe48fdce8" containerName="registry-server" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.399328 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2204483-1509-4302-84cc-15270057dda4" containerName="bootstrap-openstack-openstack-cell1" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.400364 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.406618 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.406977 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.407125 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.409592 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.438732 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-sz96x"] Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.509269 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-ssh-key\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.509498 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gpgr\" (UniqueName: \"kubernetes.io/projected/8232bee5-aac4-45a3-8321-8fb9161c6566-kube-api-access-4gpgr\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.509541 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-inventory\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.611793 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gpgr\" (UniqueName: \"kubernetes.io/projected/8232bee5-aac4-45a3-8321-8fb9161c6566-kube-api-access-4gpgr\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.611845 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-inventory\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.611989 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-ssh-key\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 
18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.617791 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-inventory\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.618254 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-ssh-key\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.644449 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gpgr\" (UniqueName: \"kubernetes.io/projected/8232bee5-aac4-45a3-8321-8fb9161c6566-kube-api-access-4gpgr\") pod \"download-cache-openstack-openstack-cell1-sz96x\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:26 crc kubenswrapper[4756]: I1205 18:23:26.730257 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:23:27 crc kubenswrapper[4756]: I1205 18:23:27.373151 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-sz96x"] Dec 05 18:23:27 crc kubenswrapper[4756]: W1205 18:23:27.376313 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8232bee5_aac4_45a3_8321_8fb9161c6566.slice/crio-629972f12e07ad06614595f2924a67003ca7aacfb13c4ca0558f6a41d1fdbad7 WatchSource:0}: Error finding container 629972f12e07ad06614595f2924a67003ca7aacfb13c4ca0558f6a41d1fdbad7: Status 404 returned error can't find the container with id 629972f12e07ad06614595f2924a67003ca7aacfb13c4ca0558f6a41d1fdbad7 Dec 05 18:23:28 crc kubenswrapper[4756]: I1205 18:23:28.304215 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" event={"ID":"8232bee5-aac4-45a3-8321-8fb9161c6566","Type":"ContainerStarted","Data":"44c08ca03663c3878fb0f612982536b70ec542927bd4e3b205fa54f6cc3e10eb"} Dec 05 18:23:28 crc kubenswrapper[4756]: I1205 18:23:28.304612 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" event={"ID":"8232bee5-aac4-45a3-8321-8fb9161c6566","Type":"ContainerStarted","Data":"629972f12e07ad06614595f2924a67003ca7aacfb13c4ca0558f6a41d1fdbad7"} Dec 05 18:23:28 crc kubenswrapper[4756]: I1205 18:23:28.337397 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" podStartSLOduration=1.845514036 podStartE2EDuration="2.337375278s" podCreationTimestamp="2025-12-05 18:23:26 +0000 UTC" firstStartedPulling="2025-12-05 18:23:27.3847863 +0000 UTC m=+7667.623459822" lastFinishedPulling="2025-12-05 18:23:27.876647512 +0000 UTC m=+7668.115321064" observedRunningTime="2025-12-05 18:23:28.334922562 +0000 UTC m=+7668.573596084" watchObservedRunningTime="2025-12-05 18:23:28.337375278 +0000 UTC m=+7668.576048830" Dec 05 18:23:52 crc kubenswrapper[4756]: I1205 18:23:52.296026 4756 patch_prober.go:28] 
interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:23:52 crc kubenswrapper[4756]: I1205 18:23:52.296625 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:24:22 crc kubenswrapper[4756]: I1205 18:24:22.295140 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:24:22 crc kubenswrapper[4756]: I1205 18:24:22.295649 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:24:52 crc kubenswrapper[4756]: I1205 18:24:52.295786 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:24:52 crc kubenswrapper[4756]: I1205 18:24:52.296596 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:24:52 crc kubenswrapper[4756]: I1205 18:24:52.296664 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 18:24:52 crc kubenswrapper[4756]: I1205 18:24:52.297935 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"298b99a53eaa71e556883bbd5801e9357223f1284c428a6c6e79def0181cd2ac"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 18:24:52 crc kubenswrapper[4756]: I1205 18:24:52.298203 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://298b99a53eaa71e556883bbd5801e9357223f1284c428a6c6e79def0181cd2ac" gracePeriod=600 Dec 05 18:24:52 crc kubenswrapper[4756]: I1205 18:24:52.477075 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="298b99a53eaa71e556883bbd5801e9357223f1284c428a6c6e79def0181cd2ac" exitCode=0 Dec 05 18:24:52 crc kubenswrapper[4756]: I1205 18:24:52.477552 4756 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"298b99a53eaa71e556883bbd5801e9357223f1284c428a6c6e79def0181cd2ac"} Dec 05 18:24:52 crc kubenswrapper[4756]: I1205 18:24:52.477599 4756 scope.go:117] "RemoveContainer" containerID="1794fc25d0c5059bb932b1c6f992b031e7b3558b28e7f7fefd6cb1e4ac256f19" Dec 05 18:24:53 crc kubenswrapper[4756]: I1205 18:24:53.493736 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"} Dec 05 18:24:59 crc kubenswrapper[4756]: I1205 18:24:59.603239 4756 generic.go:334] "Generic (PLEG): container finished" podID="8232bee5-aac4-45a3-8321-8fb9161c6566" containerID="44c08ca03663c3878fb0f612982536b70ec542927bd4e3b205fa54f6cc3e10eb" exitCode=0 Dec 05 18:24:59 crc kubenswrapper[4756]: I1205 18:24:59.603321 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" event={"ID":"8232bee5-aac4-45a3-8321-8fb9161c6566","Type":"ContainerDied","Data":"44c08ca03663c3878fb0f612982536b70ec542927bd4e3b205fa54f6cc3e10eb"} Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.169963 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.274964 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-ssh-key\") pod \"8232bee5-aac4-45a3-8321-8fb9161c6566\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.275317 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-inventory\") pod \"8232bee5-aac4-45a3-8321-8fb9161c6566\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.275499 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gpgr\" (UniqueName: \"kubernetes.io/projected/8232bee5-aac4-45a3-8321-8fb9161c6566-kube-api-access-4gpgr\") pod \"8232bee5-aac4-45a3-8321-8fb9161c6566\" (UID: \"8232bee5-aac4-45a3-8321-8fb9161c6566\") " Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.282347 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8232bee5-aac4-45a3-8321-8fb9161c6566-kube-api-access-4gpgr" (OuterVolumeSpecName: "kube-api-access-4gpgr") pod "8232bee5-aac4-45a3-8321-8fb9161c6566" (UID: "8232bee5-aac4-45a3-8321-8fb9161c6566"). InnerVolumeSpecName "kube-api-access-4gpgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.310764 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-inventory" (OuterVolumeSpecName: "inventory") pod "8232bee5-aac4-45a3-8321-8fb9161c6566" (UID: "8232bee5-aac4-45a3-8321-8fb9161c6566"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.312633 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8232bee5-aac4-45a3-8321-8fb9161c6566" (UID: "8232bee5-aac4-45a3-8321-8fb9161c6566"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.379663 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.379709 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8232bee5-aac4-45a3-8321-8fb9161c6566-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.379726 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gpgr\" (UniqueName: \"kubernetes.io/projected/8232bee5-aac4-45a3-8321-8fb9161c6566-kube-api-access-4gpgr\") on node \"crc\" DevicePath \"\"" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.630524 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" event={"ID":"8232bee5-aac4-45a3-8321-8fb9161c6566","Type":"ContainerDied","Data":"629972f12e07ad06614595f2924a67003ca7aacfb13c4ca0558f6a41d1fdbad7"} Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.630572 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="629972f12e07ad06614595f2924a67003ca7aacfb13c4ca0558f6a41d1fdbad7" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.630639 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-sz96x" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.753690 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-fgh87"] Dec 05 18:25:01 crc kubenswrapper[4756]: E1205 18:25:01.754270 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8232bee5-aac4-45a3-8321-8fb9161c6566" containerName="download-cache-openstack-openstack-cell1" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.754288 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8232bee5-aac4-45a3-8321-8fb9161c6566" containerName="download-cache-openstack-openstack-cell1" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.754689 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8232bee5-aac4-45a3-8321-8fb9161c6566" containerName="download-cache-openstack-openstack-cell1" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.755630 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.759042 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.759587 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.759636 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.759859 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.769125 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-fgh87"] Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.891635 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-inventory\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.891769 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-ssh-key\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.892009 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb987\" (UniqueName: \"kubernetes.io/projected/454b8b51-3072-47be-be52-4665332c05ee-kube-api-access-sb987\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.994805 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-ssh-key\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.995273 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb987\" (UniqueName: \"kubernetes.io/projected/454b8b51-3072-47be-be52-4665332c05ee-kube-api-access-sb987\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:01 crc kubenswrapper[4756]: I1205 18:25:01.996782 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-inventory\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " 
pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:02 crc kubenswrapper[4756]: I1205 18:25:02.001715 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-inventory\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:02 crc kubenswrapper[4756]: I1205 18:25:02.002598 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-ssh-key\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:02 crc kubenswrapper[4756]: I1205 18:25:02.023747 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb987\" (UniqueName: \"kubernetes.io/projected/454b8b51-3072-47be-be52-4665332c05ee-kube-api-access-sb987\") pod \"configure-network-openstack-openstack-cell1-fgh87\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") " pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:02 crc kubenswrapper[4756]: I1205 18:25:02.073644 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:25:03 crc kubenswrapper[4756]: I1205 18:25:03.336677 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-fgh87"] Dec 05 18:25:03 crc kubenswrapper[4756]: W1205 18:25:03.344388 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod454b8b51_3072_47be_be52_4665332c05ee.slice/crio-0d8f38f8c487c10a643b63a04a1eaed1be4a18145712c9a818322d42f21b16ed WatchSource:0}: Error finding container 0d8f38f8c487c10a643b63a04a1eaed1be4a18145712c9a818322d42f21b16ed: Status 404 returned error can't find the container with id 0d8f38f8c487c10a643b63a04a1eaed1be4a18145712c9a818322d42f21b16ed Dec 05 18:25:03 crc kubenswrapper[4756]: I1205 18:25:03.346818 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 18:25:03 crc kubenswrapper[4756]: I1205 18:25:03.655486 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-fgh87" event={"ID":"454b8b51-3072-47be-be52-4665332c05ee","Type":"ContainerStarted","Data":"0d8f38f8c487c10a643b63a04a1eaed1be4a18145712c9a818322d42f21b16ed"} Dec 05 18:25:04 crc kubenswrapper[4756]: I1205 18:25:04.672463 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-fgh87" event={"ID":"454b8b51-3072-47be-be52-4665332c05ee","Type":"ContainerStarted","Data":"f2cd565c3a51cdf9870e86c83075546d4f78fc55deb9fddcc03f2effadf1e9b5"} Dec 05 18:25:04 crc kubenswrapper[4756]: I1205 18:25:04.693440 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-fgh87" podStartSLOduration=3.139319037 podStartE2EDuration="3.693399691s" podCreationTimestamp="2025-12-05 18:25:01 +0000 UTC" firstStartedPulling="2025-12-05 18:25:03.346561236 +0000 UTC m=+7763.585234758" lastFinishedPulling="2025-12-05 18:25:03.90064186 +0000 UTC 
m=+7764.139315412" observedRunningTime="2025-12-05 18:25:04.690998837 +0000 UTC m=+7764.929672349" watchObservedRunningTime="2025-12-05 18:25:04.693399691 +0000 UTC m=+7764.932073203" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.074447 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qk5d9"] Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.077742 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.091676 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qk5d9"] Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.170890 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-catalog-content\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.171241 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-utilities\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.171345 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttwg\" (UniqueName: \"kubernetes.io/projected/b5d77c1b-a01d-4090-9409-b204de52b9e8-kube-api-access-2ttwg\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.273380 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-utilities\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.273559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ttwg\" (UniqueName: \"kubernetes.io/projected/b5d77c1b-a01d-4090-9409-b204de52b9e8-kube-api-access-2ttwg\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.273985 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-utilities\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.274125 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-catalog-content\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc 
kubenswrapper[4756]: I1205 18:25:18.274614 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-catalog-content\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.298568 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ttwg\" (UniqueName: \"kubernetes.io/projected/b5d77c1b-a01d-4090-9409-b204de52b9e8-kube-api-access-2ttwg\") pod \"redhat-marketplace-qk5d9\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.421884 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:18 crc kubenswrapper[4756]: I1205 18:25:18.958592 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qk5d9"] Dec 05 18:25:19 crc kubenswrapper[4756]: I1205 18:25:19.860368 4756 generic.go:334] "Generic (PLEG): container finished" podID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerID="ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016" exitCode=0 Dec 05 18:25:19 crc kubenswrapper[4756]: I1205 18:25:19.860491 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qk5d9" event={"ID":"b5d77c1b-a01d-4090-9409-b204de52b9e8","Type":"ContainerDied","Data":"ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016"} Dec 05 18:25:19 crc kubenswrapper[4756]: I1205 18:25:19.860740 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qk5d9" event={"ID":"b5d77c1b-a01d-4090-9409-b204de52b9e8","Type":"ContainerStarted","Data":"086bc3ba44087f99fb932ac558bf4cfb340d0268aceabd4a92d333c48a415b6a"} Dec 05 18:25:24 crc kubenswrapper[4756]: I1205 18:25:24.918047 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qk5d9" event={"ID":"b5d77c1b-a01d-4090-9409-b204de52b9e8","Type":"ContainerStarted","Data":"c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee"} Dec 05 18:25:25 crc kubenswrapper[4756]: I1205 18:25:25.932616 4756 generic.go:334] "Generic (PLEG): container finished" podID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerID="c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee" exitCode=0 Dec 05 18:25:25 crc kubenswrapper[4756]: I1205 18:25:25.932748 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qk5d9" event={"ID":"b5d77c1b-a01d-4090-9409-b204de52b9e8","Type":"ContainerDied","Data":"c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee"} Dec 05 18:25:26 crc kubenswrapper[4756]: I1205 18:25:26.955000 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qk5d9" event={"ID":"b5d77c1b-a01d-4090-9409-b204de52b9e8","Type":"ContainerStarted","Data":"014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138"} Dec 05 18:25:26 crc kubenswrapper[4756]: I1205 18:25:26.978093 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qk5d9" podStartSLOduration=2.205222078 podStartE2EDuration="8.978075566s" podCreationTimestamp="2025-12-05 18:25:18 +0000 
UTC" firstStartedPulling="2025-12-05 18:25:19.865048777 +0000 UTC m=+7780.103722329" lastFinishedPulling="2025-12-05 18:25:26.637902305 +0000 UTC m=+7786.876575817" observedRunningTime="2025-12-05 18:25:26.975884787 +0000 UTC m=+7787.214558299" watchObservedRunningTime="2025-12-05 18:25:26.978075566 +0000 UTC m=+7787.216749078" Dec 05 18:25:28 crc kubenswrapper[4756]: I1205 18:25:28.421946 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:28 crc kubenswrapper[4756]: I1205 18:25:28.422266 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:29 crc kubenswrapper[4756]: I1205 18:25:29.491373 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-qk5d9" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="registry-server" probeResult="failure" output=< Dec 05 18:25:29 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 18:25:29 crc kubenswrapper[4756]: > Dec 05 18:25:38 crc kubenswrapper[4756]: I1205 18:25:38.502332 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:38 crc kubenswrapper[4756]: I1205 18:25:38.582104 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:38 crc kubenswrapper[4756]: I1205 18:25:38.753411 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qk5d9"] Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.104326 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qk5d9" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="registry-server" containerID="cri-o://014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138" gracePeriod=2 Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.653610 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.728176 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-catalog-content\") pod \"b5d77c1b-a01d-4090-9409-b204de52b9e8\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.728288 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ttwg\" (UniqueName: \"kubernetes.io/projected/b5d77c1b-a01d-4090-9409-b204de52b9e8-kube-api-access-2ttwg\") pod \"b5d77c1b-a01d-4090-9409-b204de52b9e8\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.728507 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-utilities\") pod \"b5d77c1b-a01d-4090-9409-b204de52b9e8\" (UID: \"b5d77c1b-a01d-4090-9409-b204de52b9e8\") " Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.729907 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-utilities" (OuterVolumeSpecName: "utilities") pod "b5d77c1b-a01d-4090-9409-b204de52b9e8" (UID: "b5d77c1b-a01d-4090-9409-b204de52b9e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.733549 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d77c1b-a01d-4090-9409-b204de52b9e8-kube-api-access-2ttwg" (OuterVolumeSpecName: "kube-api-access-2ttwg") pod "b5d77c1b-a01d-4090-9409-b204de52b9e8" (UID: "b5d77c1b-a01d-4090-9409-b204de52b9e8"). InnerVolumeSpecName "kube-api-access-2ttwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.763501 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5d77c1b-a01d-4090-9409-b204de52b9e8" (UID: "b5d77c1b-a01d-4090-9409-b204de52b9e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.831064 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.831099 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ttwg\" (UniqueName: \"kubernetes.io/projected/b5d77c1b-a01d-4090-9409-b204de52b9e8-kube-api-access-2ttwg\") on node \"crc\" DevicePath \"\"" Dec 05 18:25:40 crc kubenswrapper[4756]: I1205 18:25:40.831111 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5d77c1b-a01d-4090-9409-b204de52b9e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.122680 4756 generic.go:334] "Generic (PLEG): container finished" podID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerID="014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138" exitCode=0 Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.122685 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qk5d9" Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.122727 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qk5d9" event={"ID":"b5d77c1b-a01d-4090-9409-b204de52b9e8","Type":"ContainerDied","Data":"014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138"} Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.123194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qk5d9" event={"ID":"b5d77c1b-a01d-4090-9409-b204de52b9e8","Type":"ContainerDied","Data":"086bc3ba44087f99fb932ac558bf4cfb340d0268aceabd4a92d333c48a415b6a"} Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.123227 4756 scope.go:117] "RemoveContainer" containerID="014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138" Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.173344 4756 scope.go:117] "RemoveContainer" containerID="c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee" Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.207624 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qk5d9"] Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.225279 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qk5d9"] Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.232800 4756 scope.go:117] "RemoveContainer" containerID="ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016" Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.279920 4756 scope.go:117] "RemoveContainer" containerID="014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138" Dec 05 18:25:41 crc kubenswrapper[4756]: E1205 18:25:41.281959 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138\": container with ID starting with 014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138 not found: ID does not exist" containerID="014b44ea9460cff5e12caed977dd6fa01702e8523cf6b1007d6880044f98a138" Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.281992 4756 
Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.282016 4756 scope.go:117] "RemoveContainer" containerID="c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee"
Dec 05 18:25:41 crc kubenswrapper[4756]: E1205 18:25:41.282462 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee\": container with ID starting with c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee not found: ID does not exist" containerID="c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee"
Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.282491 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee"} err="failed to get container status \"c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee\": rpc error: code = NotFound desc = could not find container \"c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee\": container with ID starting with c5f494f04ff6635465231cb0a85c33167015aebe61cba35c7acb6672e66615ee not found: ID does not exist"
Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.282507 4756 scope.go:117] "RemoveContainer" containerID="ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016"
Dec 05 18:25:41 crc kubenswrapper[4756]: E1205 18:25:41.282793 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016\": container with ID starting with ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016 not found: ID does not exist" containerID="ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016"
Dec 05 18:25:41 crc kubenswrapper[4756]: I1205 18:25:41.282822 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016"} err="failed to get container status \"ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016\": rpc error: code = NotFound desc = could not find container \"ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016\": container with ID starting with ffde72c14cdd0000eebc571fbfd7373473ae54e3aa2f97af54219629d458c016 not found: ID does not exist"
Dec 05 18:25:42 crc kubenswrapper[4756]: I1205 18:25:42.060489 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" path="/var/lib/kubelet/pods/b5d77c1b-a01d-4090-9409-b204de52b9e8/volumes"
Dec 05 18:26:22 crc kubenswrapper[4756]: I1205 18:26:22.646659 4756 generic.go:334] "Generic (PLEG): container finished" podID="454b8b51-3072-47be-be52-4665332c05ee" containerID="f2cd565c3a51cdf9870e86c83075546d4f78fc55deb9fddcc03f2effadf1e9b5" exitCode=0
Dec 05 18:26:22 crc kubenswrapper[4756]: I1205 18:26:22.646767 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-fgh87" event={"ID":"454b8b51-3072-47be-be52-4665332c05ee","Type":"ContainerDied","Data":"f2cd565c3a51cdf9870e86c83075546d4f78fc55deb9fddcc03f2effadf1e9b5"}
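The alternating E/I pairs above (log.go:32 followed by pod_container_deletor.go:53) are a benign teardown race: the kubelet asks the runtime for the status of containers CRI-O has already removed, and gets gRPC NotFound back. A small Go sketch of the usual idiom for tolerating that answer; containerStatus is a stand-in for the CRI call, not the kubelet's actual code path:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// Stand-in for a CRI call that races with a container the runtime has already
// deleted; it fails the way CRI-O does in the log above.
func containerStatus(id string) error {
	return status.Errorf(codes.NotFound,
		"could not find container %q: container with ID starting with %s not found: ID does not exist", id, id)
}

func main() {
	err := containerStatus("014b44ea9460cf")
	// During cleanup, NotFound means "already gone" and is safe to swallow;
	// anything else would be a real runtime error.
	if status.Code(err) == codes.NotFound {
		fmt.Println("container already removed; treating as success")
		return
	}
	if err != nil {
		fmt.Println("runtime error:", err)
	}
}
```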
Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.172552 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-fgh87"
Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.344751 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-ssh-key\") pod \"454b8b51-3072-47be-be52-4665332c05ee\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") "
Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.345329 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-inventory\") pod \"454b8b51-3072-47be-be52-4665332c05ee\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") "
Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.345473 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb987\" (UniqueName: \"kubernetes.io/projected/454b8b51-3072-47be-be52-4665332c05ee-kube-api-access-sb987\") pod \"454b8b51-3072-47be-be52-4665332c05ee\" (UID: \"454b8b51-3072-47be-be52-4665332c05ee\") "
Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.357595 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/454b8b51-3072-47be-be52-4665332c05ee-kube-api-access-sb987" (OuterVolumeSpecName: "kube-api-access-sb987") pod "454b8b51-3072-47be-be52-4665332c05ee" (UID: "454b8b51-3072-47be-be52-4665332c05ee"). InnerVolumeSpecName "kube-api-access-sb987". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.391601 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-inventory" (OuterVolumeSpecName: "inventory") pod "454b8b51-3072-47be-be52-4665332c05ee" (UID: "454b8b51-3072-47be-be52-4665332c05ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.406493 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "454b8b51-3072-47be-be52-4665332c05ee" (UID: "454b8b51-3072-47be-be52-4665332c05ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.447761 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.447820 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/454b8b51-3072-47be-be52-4665332c05ee-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.447832 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb987\" (UniqueName: \"kubernetes.io/projected/454b8b51-3072-47be-be52-4665332c05ee-kube-api-access-sb987\") on node \"crc\" DevicePath \"\"" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.671902 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-fgh87" event={"ID":"454b8b51-3072-47be-be52-4665332c05ee","Type":"ContainerDied","Data":"0d8f38f8c487c10a643b63a04a1eaed1be4a18145712c9a818322d42f21b16ed"} Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.671941 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d8f38f8c487c10a643b63a04a1eaed1be4a18145712c9a818322d42f21b16ed" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.671993 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-fgh87" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.789457 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-6cshr"] Dec 05 18:26:24 crc kubenswrapper[4756]: E1205 18:26:24.790003 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="extract-utilities" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.790024 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="extract-utilities" Dec 05 18:26:24 crc kubenswrapper[4756]: E1205 18:26:24.790034 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="registry-server" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.790040 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="registry-server" Dec 05 18:26:24 crc kubenswrapper[4756]: E1205 18:26:24.790053 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="extract-content" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.790060 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="extract-content" Dec 05 18:26:24 crc kubenswrapper[4756]: E1205 18:26:24.790094 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454b8b51-3072-47be-be52-4665332c05ee" containerName="configure-network-openstack-openstack-cell1" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.790102 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="454b8b51-3072-47be-be52-4665332c05ee" containerName="configure-network-openstack-openstack-cell1" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.790300 4756 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="b5d77c1b-a01d-4090-9409-b204de52b9e8" containerName="registry-server" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.790334 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="454b8b51-3072-47be-be52-4665332c05ee" containerName="configure-network-openstack-openstack-cell1" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.791190 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.793570 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.793961 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.793969 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.796716 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.800032 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-6cshr"] Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.958901 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9d9\" (UniqueName: \"kubernetes.io/projected/70125fdc-4134-4348-9b99-63b2893674e1-kube-api-access-6c9d9\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.959063 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-inventory\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:24 crc kubenswrapper[4756]: I1205 18:26:24.959129 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-ssh-key\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:25 crc kubenswrapper[4756]: I1205 18:26:25.060171 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-inventory\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:25 crc kubenswrapper[4756]: I1205 18:26:25.060539 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-ssh-key\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " 
pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:25 crc kubenswrapper[4756]: I1205 18:26:25.060654 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c9d9\" (UniqueName: \"kubernetes.io/projected/70125fdc-4134-4348-9b99-63b2893674e1-kube-api-access-6c9d9\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:25 crc kubenswrapper[4756]: I1205 18:26:25.064376 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-ssh-key\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:25 crc kubenswrapper[4756]: I1205 18:26:25.070017 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-inventory\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:25 crc kubenswrapper[4756]: I1205 18:26:25.085480 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c9d9\" (UniqueName: \"kubernetes.io/projected/70125fdc-4134-4348-9b99-63b2893674e1-kube-api-access-6c9d9\") pod \"validate-network-openstack-openstack-cell1-6cshr\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:25 crc kubenswrapper[4756]: I1205 18:26:25.119668 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:25 crc kubenswrapper[4756]: I1205 18:26:25.760341 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-6cshr"] Dec 05 18:26:25 crc kubenswrapper[4756]: W1205 18:26:25.764506 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70125fdc_4134_4348_9b99_63b2893674e1.slice/crio-dc77a6dc9a15bb9ce298ee9845695900e76aa8ac6be35a13349c287d06c2e611 WatchSource:0}: Error finding container dc77a6dc9a15bb9ce298ee9845695900e76aa8ac6be35a13349c287d06c2e611: Status 404 returned error can't find the container with id dc77a6dc9a15bb9ce298ee9845695900e76aa8ac6be35a13349c287d06c2e611 Dec 05 18:26:26 crc kubenswrapper[4756]: I1205 18:26:26.699021 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" event={"ID":"70125fdc-4134-4348-9b99-63b2893674e1","Type":"ContainerStarted","Data":"dc77a6dc9a15bb9ce298ee9845695900e76aa8ac6be35a13349c287d06c2e611"} Dec 05 18:26:27 crc kubenswrapper[4756]: I1205 18:26:27.713610 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" event={"ID":"70125fdc-4134-4348-9b99-63b2893674e1","Type":"ContainerStarted","Data":"dd08cd2ed0444e77792ef2e3c2cd75371ab95eab477d16e1732f550016bf5b37"} Dec 05 18:26:27 crc kubenswrapper[4756]: I1205 18:26:27.746814 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" podStartSLOduration=3.072116227 podStartE2EDuration="3.746785489s" podCreationTimestamp="2025-12-05 18:26:24 +0000 UTC" firstStartedPulling="2025-12-05 18:26:25.767624749 +0000 UTC m=+7846.006298271" lastFinishedPulling="2025-12-05 18:26:26.442293981 +0000 UTC m=+7846.680967533" observedRunningTime="2025-12-05 18:26:27.732409593 +0000 UTC m=+7847.971083135" watchObservedRunningTime="2025-12-05 18:26:27.746785489 +0000 UTC m=+7847.985459041" Dec 05 18:26:31 crc kubenswrapper[4756]: I1205 18:26:31.779052 4756 generic.go:334] "Generic (PLEG): container finished" podID="70125fdc-4134-4348-9b99-63b2893674e1" containerID="dd08cd2ed0444e77792ef2e3c2cd75371ab95eab477d16e1732f550016bf5b37" exitCode=0 Dec 05 18:26:31 crc kubenswrapper[4756]: I1205 18:26:31.779198 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" event={"ID":"70125fdc-4134-4348-9b99-63b2893674e1","Type":"ContainerDied","Data":"dd08cd2ed0444e77792ef2e3c2cd75371ab95eab477d16e1732f550016bf5b37"} Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.425732 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.496942 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-ssh-key\") pod \"70125fdc-4134-4348-9b99-63b2893674e1\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.497580 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-inventory\") pod \"70125fdc-4134-4348-9b99-63b2893674e1\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.497865 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c9d9\" (UniqueName: \"kubernetes.io/projected/70125fdc-4134-4348-9b99-63b2893674e1-kube-api-access-6c9d9\") pod \"70125fdc-4134-4348-9b99-63b2893674e1\" (UID: \"70125fdc-4134-4348-9b99-63b2893674e1\") " Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.509970 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70125fdc-4134-4348-9b99-63b2893674e1-kube-api-access-6c9d9" (OuterVolumeSpecName: "kube-api-access-6c9d9") pod "70125fdc-4134-4348-9b99-63b2893674e1" (UID: "70125fdc-4134-4348-9b99-63b2893674e1"). InnerVolumeSpecName "kube-api-access-6c9d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.544964 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70125fdc-4134-4348-9b99-63b2893674e1" (UID: "70125fdc-4134-4348-9b99-63b2893674e1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.552809 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-inventory" (OuterVolumeSpecName: "inventory") pod "70125fdc-4134-4348-9b99-63b2893674e1" (UID: "70125fdc-4134-4348-9b99-63b2893674e1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.605855 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.605890 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c9d9\" (UniqueName: \"kubernetes.io/projected/70125fdc-4134-4348-9b99-63b2893674e1-kube-api-access-6c9d9\") on node \"crc\" DevicePath \"\"" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.605903 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70125fdc-4134-4348-9b99-63b2893674e1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.804372 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" event={"ID":"70125fdc-4134-4348-9b99-63b2893674e1","Type":"ContainerDied","Data":"dc77a6dc9a15bb9ce298ee9845695900e76aa8ac6be35a13349c287d06c2e611"} Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.804433 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc77a6dc9a15bb9ce298ee9845695900e76aa8ac6be35a13349c287d06c2e611" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.804524 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-6cshr" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.901677 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-pqvr5"] Dec 05 18:26:33 crc kubenswrapper[4756]: E1205 18:26:33.909891 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70125fdc-4134-4348-9b99-63b2893674e1" containerName="validate-network-openstack-openstack-cell1" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.910136 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="70125fdc-4134-4348-9b99-63b2893674e1" containerName="validate-network-openstack-openstack-cell1" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.910404 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="70125fdc-4134-4348-9b99-63b2893674e1" containerName="validate-network-openstack-openstack-cell1" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.911218 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.914401 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-pqvr5"] Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.914449 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.914521 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.914577 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:26:33 crc kubenswrapper[4756]: I1205 18:26:33.915165 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.013970 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6bxt\" (UniqueName: \"kubernetes.io/projected/ad9215e6-b603-4771-b2a8-962d2160e2f6-kube-api-access-s6bxt\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.014112 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-ssh-key\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.014508 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-inventory\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.116791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6bxt\" (UniqueName: \"kubernetes.io/projected/ad9215e6-b603-4771-b2a8-962d2160e2f6-kube-api-access-s6bxt\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.116877 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-ssh-key\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.117201 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-inventory\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.128465 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-inventory\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.128710 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-ssh-key\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.142328 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6bxt\" (UniqueName: \"kubernetes.io/projected/ad9215e6-b603-4771-b2a8-962d2160e2f6-kube-api-access-s6bxt\") pod \"install-os-openstack-openstack-cell1-pqvr5\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.241934 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:26:34 crc kubenswrapper[4756]: I1205 18:26:34.807407 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-pqvr5"] Dec 05 18:26:35 crc kubenswrapper[4756]: I1205 18:26:35.831590 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" event={"ID":"ad9215e6-b603-4771-b2a8-962d2160e2f6","Type":"ContainerStarted","Data":"09562660c80d8092ede94a1dec6fb036892d8dba2706c7a1a0e4882b7ce9074f"} Dec 05 18:26:35 crc kubenswrapper[4756]: I1205 18:26:35.832591 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" event={"ID":"ad9215e6-b603-4771-b2a8-962d2160e2f6","Type":"ContainerStarted","Data":"6bc0a892d73280a0b82cefa130bf5a2462ccf6ed0843ffcdeee606a08d8ca335"} Dec 05 18:26:35 crc kubenswrapper[4756]: I1205 18:26:35.868357 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" podStartSLOduration=2.457705977 podStartE2EDuration="2.868331169s" podCreationTimestamp="2025-12-05 18:26:33 +0000 UTC" firstStartedPulling="2025-12-05 18:26:34.824828667 +0000 UTC m=+7855.063502179" lastFinishedPulling="2025-12-05 18:26:35.235453859 +0000 UTC m=+7855.474127371" observedRunningTime="2025-12-05 18:26:35.862134972 +0000 UTC m=+7856.100808504" watchObservedRunningTime="2025-12-05 18:26:35.868331169 +0000 UTC m=+7856.107004721" Dec 05 18:26:52 crc kubenswrapper[4756]: I1205 18:26:52.295176 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:26:52 crc kubenswrapper[4756]: I1205 18:26:52.295726 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 05 18:27:22 crc kubenswrapper[4756]: I1205 18:27:22.297189 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:27:22 crc kubenswrapper[4756]: I1205 18:27:22.297742 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:27:24 crc kubenswrapper[4756]: I1205 18:27:24.465935 4756 generic.go:334] "Generic (PLEG): container finished" podID="ad9215e6-b603-4771-b2a8-962d2160e2f6" containerID="09562660c80d8092ede94a1dec6fb036892d8dba2706c7a1a0e4882b7ce9074f" exitCode=0 Dec 05 18:27:24 crc kubenswrapper[4756]: I1205 18:27:24.466068 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" event={"ID":"ad9215e6-b603-4771-b2a8-962d2160e2f6","Type":"ContainerDied","Data":"09562660c80d8092ede94a1dec6fb036892d8dba2706c7a1a0e4882b7ce9074f"} Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.057279 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.145557 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-inventory\") pod \"ad9215e6-b603-4771-b2a8-962d2160e2f6\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.145671 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6bxt\" (UniqueName: \"kubernetes.io/projected/ad9215e6-b603-4771-b2a8-962d2160e2f6-kube-api-access-s6bxt\") pod \"ad9215e6-b603-4771-b2a8-962d2160e2f6\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.145731 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-ssh-key\") pod \"ad9215e6-b603-4771-b2a8-962d2160e2f6\" (UID: \"ad9215e6-b603-4771-b2a8-962d2160e2f6\") " Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.152742 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad9215e6-b603-4771-b2a8-962d2160e2f6-kube-api-access-s6bxt" (OuterVolumeSpecName: "kube-api-access-s6bxt") pod "ad9215e6-b603-4771-b2a8-962d2160e2f6" (UID: "ad9215e6-b603-4771-b2a8-962d2160e2f6"). InnerVolumeSpecName "kube-api-access-s6bxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.178243 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad9215e6-b603-4771-b2a8-962d2160e2f6" (UID: "ad9215e6-b603-4771-b2a8-962d2160e2f6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.201633 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-inventory" (OuterVolumeSpecName: "inventory") pod "ad9215e6-b603-4771-b2a8-962d2160e2f6" (UID: "ad9215e6-b603-4771-b2a8-962d2160e2f6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.247816 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.247848 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6bxt\" (UniqueName: \"kubernetes.io/projected/ad9215e6-b603-4771-b2a8-962d2160e2f6-kube-api-access-s6bxt\") on node \"crc\" DevicePath \"\"" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.247859 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad9215e6-b603-4771-b2a8-962d2160e2f6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.491726 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" event={"ID":"ad9215e6-b603-4771-b2a8-962d2160e2f6","Type":"ContainerDied","Data":"6bc0a892d73280a0b82cefa130bf5a2462ccf6ed0843ffcdeee606a08d8ca335"} Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.491791 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bc0a892d73280a0b82cefa130bf5a2462ccf6ed0843ffcdeee606a08d8ca335" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.491889 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-pqvr5" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.624638 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wkhtn"] Dec 05 18:27:26 crc kubenswrapper[4756]: E1205 18:27:26.625092 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad9215e6-b603-4771-b2a8-962d2160e2f6" containerName="install-os-openstack-openstack-cell1" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.625116 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad9215e6-b603-4771-b2a8-962d2160e2f6" containerName="install-os-openstack-openstack-cell1" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.625441 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad9215e6-b603-4771-b2a8-962d2160e2f6" containerName="install-os-openstack-openstack-cell1" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.626558 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.635040 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.635907 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.636360 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.636983 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.658696 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wkhtn"] Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.757339 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88v6\" (UniqueName: \"kubernetes.io/projected/6261e620-19f8-4624-87eb-97a8fddd6283-kube-api-access-q88v6\") pod \"configure-os-openstack-openstack-cell1-wkhtn\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.757571 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-inventory\") pod \"configure-os-openstack-openstack-cell1-wkhtn\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.757835 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-ssh-key\") pod \"configure-os-openstack-openstack-cell1-wkhtn\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.859722 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q88v6\" (UniqueName: \"kubernetes.io/projected/6261e620-19f8-4624-87eb-97a8fddd6283-kube-api-access-q88v6\") pod \"configure-os-openstack-openstack-cell1-wkhtn\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.859845 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-inventory\") pod \"configure-os-openstack-openstack-cell1-wkhtn\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.859995 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-ssh-key\") pod \"configure-os-openstack-openstack-cell1-wkhtn\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:27:26 crc kubenswrapper[4756]: 
Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.865468 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-inventory\") pod \"configure-os-openstack-openstack-cell1-wkhtn\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " pod="openstack/configure-os-openstack-openstack-cell1-wkhtn"
Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.888270 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88v6\" (UniqueName: \"kubernetes.io/projected/6261e620-19f8-4624-87eb-97a8fddd6283-kube-api-access-q88v6\") pod \"configure-os-openstack-openstack-cell1-wkhtn\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " pod="openstack/configure-os-openstack-openstack-cell1-wkhtn"
Dec 05 18:27:26 crc kubenswrapper[4756]: I1205 18:27:26.958795 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn"
Dec 05 18:27:27 crc kubenswrapper[4756]: I1205 18:27:27.621634 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wkhtn"]
Dec 05 18:27:27 crc kubenswrapper[4756]: W1205 18:27:27.621850 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6261e620_19f8_4624_87eb_97a8fddd6283.slice/crio-144d29db8593f5aca2d2a2174f1a79b902d3ab68d89bc66d8333044d32744442 WatchSource:0}: Error finding container 144d29db8593f5aca2d2a2174f1a79b902d3ab68d89bc66d8333044d32744442: Status 404 returned error can't find the container with id 144d29db8593f5aca2d2a2174f1a79b902d3ab68d89bc66d8333044d32744442
Dec 05 18:27:28 crc kubenswrapper[4756]: I1205 18:27:28.529333 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" event={"ID":"6261e620-19f8-4624-87eb-97a8fddd6283","Type":"ContainerStarted","Data":"057a5320d02c060ec8e525d89786140268d9065c19824ba14580a058faca303e"}
Dec 05 18:27:28 crc kubenswrapper[4756]: I1205 18:27:28.529746 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" event={"ID":"6261e620-19f8-4624-87eb-97a8fddd6283","Type":"ContainerStarted","Data":"144d29db8593f5aca2d2a2174f1a79b902d3ab68d89bc66d8333044d32744442"}
Dec 05 18:27:28 crc kubenswrapper[4756]: I1205 18:27:28.562606 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" podStartSLOduration=1.993589623 podStartE2EDuration="2.562577137s" podCreationTimestamp="2025-12-05 18:27:26 +0000 UTC" firstStartedPulling="2025-12-05 18:27:27.625668857 +0000 UTC m=+7907.864342379" lastFinishedPulling="2025-12-05 18:27:28.194656371 +0000 UTC m=+7908.433329893" observedRunningTime="2025-12-05 18:27:28.545361806 +0000 UTC m=+7908.784035328" watchObservedRunningTime="2025-12-05 18:27:28.562577137 +0000 UTC m=+7908.801250689"
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.295916 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.296865 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.296938 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.298198 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.298329 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" gracePeriod=600
Dec 05 18:27:52 crc kubenswrapper[4756]: E1205 18:27:52.443871 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.838656 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" exitCode=0
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.838735 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"}
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.839151 4756 scope.go:117] "RemoveContainer" containerID="298b99a53eaa71e556883bbd5801e9357223f1284c428a6c6e79def0181cd2ac"
Dec 05 18:27:52 crc kubenswrapper[4756]: I1205 18:27:52.839940 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"
Dec 05 18:27:52 crc kubenswrapper[4756]: E1205 18:27:52.840361 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
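"back-off 5m0s restarting failed container" above is the kubelet's crash-loop restart back-off at its ceiling: the delay between restart attempts grows roughly exponentially per failed restart and is capped at five minutes. A sketch of that schedule; the 10s starting point and doubling factor are assumptions for illustration, while the 5m cap is what the log itself reports:

```go
package main

import (
	"fmt"
	"time"
)

// Illustrative CrashLoopBackOff schedule: delays double per failed restart
// and are capped at 5 minutes, which is why the pod_workers errors above
// settle at "back-off 5m0s". The 10s starting point is an assumption.
func main() {
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: wait %s\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```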
podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:28:04 crc kubenswrapper[4756]: I1205 18:28:04.050026 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:28:04 crc kubenswrapper[4756]: E1205 18:28:04.051357 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:28:15 crc kubenswrapper[4756]: I1205 18:28:15.114168 4756 generic.go:334] "Generic (PLEG): container finished" podID="6261e620-19f8-4624-87eb-97a8fddd6283" containerID="057a5320d02c060ec8e525d89786140268d9065c19824ba14580a058faca303e" exitCode=0 Dec 05 18:28:15 crc kubenswrapper[4756]: I1205 18:28:15.114569 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" event={"ID":"6261e620-19f8-4624-87eb-97a8fddd6283","Type":"ContainerDied","Data":"057a5320d02c060ec8e525d89786140268d9065c19824ba14580a058faca303e"} Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.706048 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.833148 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q88v6\" (UniqueName: \"kubernetes.io/projected/6261e620-19f8-4624-87eb-97a8fddd6283-kube-api-access-q88v6\") pod \"6261e620-19f8-4624-87eb-97a8fddd6283\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.833725 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-ssh-key\") pod \"6261e620-19f8-4624-87eb-97a8fddd6283\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.833942 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-inventory\") pod \"6261e620-19f8-4624-87eb-97a8fddd6283\" (UID: \"6261e620-19f8-4624-87eb-97a8fddd6283\") " Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.846103 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6261e620-19f8-4624-87eb-97a8fddd6283-kube-api-access-q88v6" (OuterVolumeSpecName: "kube-api-access-q88v6") pod "6261e620-19f8-4624-87eb-97a8fddd6283" (UID: "6261e620-19f8-4624-87eb-97a8fddd6283"). InnerVolumeSpecName "kube-api-access-q88v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.875007 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-inventory" (OuterVolumeSpecName: "inventory") pod "6261e620-19f8-4624-87eb-97a8fddd6283" (UID: "6261e620-19f8-4624-87eb-97a8fddd6283"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.879054 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6261e620-19f8-4624-87eb-97a8fddd6283" (UID: "6261e620-19f8-4624-87eb-97a8fddd6283"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.945536 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q88v6\" (UniqueName: \"kubernetes.io/projected/6261e620-19f8-4624-87eb-97a8fddd6283-kube-api-access-q88v6\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.945612 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:16 crc kubenswrapper[4756]: I1205 18:28:16.945629 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6261e620-19f8-4624-87eb-97a8fddd6283-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.147771 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" event={"ID":"6261e620-19f8-4624-87eb-97a8fddd6283","Type":"ContainerDied","Data":"144d29db8593f5aca2d2a2174f1a79b902d3ab68d89bc66d8333044d32744442"} Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.148269 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="144d29db8593f5aca2d2a2174f1a79b902d3ab68d89bc66d8333044d32744442" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.148542 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wkhtn" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.248235 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-hdbm2"] Dec 05 18:28:17 crc kubenswrapper[4756]: E1205 18:28:17.249096 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6261e620-19f8-4624-87eb-97a8fddd6283" containerName="configure-os-openstack-openstack-cell1" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.249184 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6261e620-19f8-4624-87eb-97a8fddd6283" containerName="configure-os-openstack-openstack-cell1" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.249520 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6261e620-19f8-4624-87eb-97a8fddd6283" containerName="configure-os-openstack-openstack-cell1" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.251103 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.256296 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.256526 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.256776 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.256943 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.286332 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-hdbm2"] Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.355558 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-inventory-0\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.356250 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrqb6\" (UniqueName: \"kubernetes.io/projected/06709cc8-4a67-48a2-906a-9388a9f0d56e-kube-api-access-vrqb6\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.356345 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.459131 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqb6\" (UniqueName: \"kubernetes.io/projected/06709cc8-4a67-48a2-906a-9388a9f0d56e-kube-api-access-vrqb6\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.459220 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.459363 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-inventory-0\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.468256 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" 
(UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-inventory-0\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.472843 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.473694 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqb6\" (UniqueName: \"kubernetes.io/projected/06709cc8-4a67-48a2-906a-9388a9f0d56e-kube-api-access-vrqb6\") pod \"ssh-known-hosts-openstack-hdbm2\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:17 crc kubenswrapper[4756]: I1205 18:28:17.592138 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:18 crc kubenswrapper[4756]: I1205 18:28:18.186640 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-hdbm2"] Dec 05 18:28:19 crc kubenswrapper[4756]: I1205 18:28:19.049755 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:28:19 crc kubenswrapper[4756]: E1205 18:28:19.050528 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:28:19 crc kubenswrapper[4756]: I1205 18:28:19.171374 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-hdbm2" event={"ID":"06709cc8-4a67-48a2-906a-9388a9f0d56e","Type":"ContainerStarted","Data":"6c4d8fc6b3af8f5c6d45a55ec659034364c17bd98b1f7d1bd9d65d89a1341a84"} Dec 05 18:28:19 crc kubenswrapper[4756]: I1205 18:28:19.171475 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-hdbm2" event={"ID":"06709cc8-4a67-48a2-906a-9388a9f0d56e","Type":"ContainerStarted","Data":"845a1eef87cefeb470425939c71525f1586911691145c32a6fc389fb93215a6c"} Dec 05 18:28:19 crc kubenswrapper[4756]: I1205 18:28:19.213655 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-hdbm2" podStartSLOduration=1.782161654 podStartE2EDuration="2.213621767s" podCreationTimestamp="2025-12-05 18:28:17 +0000 UTC" firstStartedPulling="2025-12-05 18:28:18.18408119 +0000 UTC m=+7958.422754712" lastFinishedPulling="2025-12-05 18:28:18.615541273 +0000 UTC m=+7958.854214825" observedRunningTime="2025-12-05 18:28:19.191129223 +0000 UTC m=+7959.429802785" watchObservedRunningTime="2025-12-05 18:28:19.213621767 +0000 UTC m=+7959.452295319" Dec 05 18:28:28 crc kubenswrapper[4756]: I1205 18:28:28.293779 4756 generic.go:334] "Generic (PLEG): container finished" podID="06709cc8-4a67-48a2-906a-9388a9f0d56e" 
containerID="6c4d8fc6b3af8f5c6d45a55ec659034364c17bd98b1f7d1bd9d65d89a1341a84" exitCode=0 Dec 05 18:28:28 crc kubenswrapper[4756]: I1205 18:28:28.293917 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-hdbm2" event={"ID":"06709cc8-4a67-48a2-906a-9388a9f0d56e","Type":"ContainerDied","Data":"6c4d8fc6b3af8f5c6d45a55ec659034364c17bd98b1f7d1bd9d65d89a1341a84"} Dec 05 18:28:29 crc kubenswrapper[4756]: I1205 18:28:29.910802 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:29 crc kubenswrapper[4756]: I1205 18:28:29.993600 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-ssh-key-openstack-cell1\") pod \"06709cc8-4a67-48a2-906a-9388a9f0d56e\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " Dec 05 18:28:29 crc kubenswrapper[4756]: I1205 18:28:29.993801 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-inventory-0\") pod \"06709cc8-4a67-48a2-906a-9388a9f0d56e\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " Dec 05 18:28:29 crc kubenswrapper[4756]: I1205 18:28:29.994152 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrqb6\" (UniqueName: \"kubernetes.io/projected/06709cc8-4a67-48a2-906a-9388a9f0d56e-kube-api-access-vrqb6\") pod \"06709cc8-4a67-48a2-906a-9388a9f0d56e\" (UID: \"06709cc8-4a67-48a2-906a-9388a9f0d56e\") " Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.000682 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06709cc8-4a67-48a2-906a-9388a9f0d56e-kube-api-access-vrqb6" (OuterVolumeSpecName: "kube-api-access-vrqb6") pod "06709cc8-4a67-48a2-906a-9388a9f0d56e" (UID: "06709cc8-4a67-48a2-906a-9388a9f0d56e"). InnerVolumeSpecName "kube-api-access-vrqb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.031661 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "06709cc8-4a67-48a2-906a-9388a9f0d56e" (UID: "06709cc8-4a67-48a2-906a-9388a9f0d56e"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.049163 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "06709cc8-4a67-48a2-906a-9388a9f0d56e" (UID: "06709cc8-4a67-48a2-906a-9388a9f0d56e"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.096142 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrqb6\" (UniqueName: \"kubernetes.io/projected/06709cc8-4a67-48a2-906a-9388a9f0d56e-kube-api-access-vrqb6\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.096192 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.096204 4756 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/06709cc8-4a67-48a2-906a-9388a9f0d56e-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.326234 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-hdbm2" event={"ID":"06709cc8-4a67-48a2-906a-9388a9f0d56e","Type":"ContainerDied","Data":"845a1eef87cefeb470425939c71525f1586911691145c32a6fc389fb93215a6c"} Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.326300 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="845a1eef87cefeb470425939c71525f1586911691145c32a6fc389fb93215a6c" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.326372 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-hdbm2" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.492393 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-4bxkw"] Dec 05 18:28:30 crc kubenswrapper[4756]: E1205 18:28:30.493056 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06709cc8-4a67-48a2-906a-9388a9f0d56e" containerName="ssh-known-hosts-openstack" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.493076 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="06709cc8-4a67-48a2-906a-9388a9f0d56e" containerName="ssh-known-hosts-openstack" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.493390 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="06709cc8-4a67-48a2-906a-9388a9f0d56e" containerName="ssh-known-hosts-openstack" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.494961 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.498783 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.498824 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.499065 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.500846 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.515766 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-4bxkw"] Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.637340 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-ssh-key\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.637638 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6msv\" (UniqueName: \"kubernetes.io/projected/5b789592-0645-42b6-ae30-a9eb32ed067e-kube-api-access-z6msv\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.637983 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-inventory\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.739571 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-ssh-key\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.739745 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6msv\" (UniqueName: \"kubernetes.io/projected/5b789592-0645-42b6-ae30-a9eb32ed067e-kube-api-access-z6msv\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.739998 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-inventory\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.746707 4756 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-inventory\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.756625 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-ssh-key\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.758203 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6msv\" (UniqueName: \"kubernetes.io/projected/5b789592-0645-42b6-ae30-a9eb32ed067e-kube-api-access-z6msv\") pod \"run-os-openstack-openstack-cell1-4bxkw\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:30 crc kubenswrapper[4756]: I1205 18:28:30.834799 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:31 crc kubenswrapper[4756]: I1205 18:28:31.444776 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-4bxkw"] Dec 05 18:28:32 crc kubenswrapper[4756]: I1205 18:28:32.354532 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" event={"ID":"5b789592-0645-42b6-ae30-a9eb32ed067e","Type":"ContainerStarted","Data":"abbc391d298a2fdaeaf18f746452b7aef67cad4b24596354c35de0837eeabd80"} Dec 05 18:28:33 crc kubenswrapper[4756]: I1205 18:28:33.368188 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" event={"ID":"5b789592-0645-42b6-ae30-a9eb32ed067e","Type":"ContainerStarted","Data":"472fc01b8744465b9c2ed1ad76d1b2e3c209f557b0ac638ecc5e826952d0861b"} Dec 05 18:28:33 crc kubenswrapper[4756]: I1205 18:28:33.433053 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" podStartSLOduration=2.851515651 podStartE2EDuration="3.433020632s" podCreationTimestamp="2025-12-05 18:28:30 +0000 UTC" firstStartedPulling="2025-12-05 18:28:31.448830748 +0000 UTC m=+7971.687504260" lastFinishedPulling="2025-12-05 18:28:32.030335719 +0000 UTC m=+7972.269009241" observedRunningTime="2025-12-05 18:28:33.419231242 +0000 UTC m=+7973.657904794" watchObservedRunningTime="2025-12-05 18:28:33.433020632 +0000 UTC m=+7973.671694204" Dec 05 18:28:34 crc kubenswrapper[4756]: I1205 18:28:34.050247 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:28:34 crc kubenswrapper[4756]: E1205 18:28:34.051119 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:28:40 crc kubenswrapper[4756]: I1205 18:28:40.503989 4756 generic.go:334] "Generic (PLEG): container finished" podID="5b789592-0645-42b6-ae30-a9eb32ed067e" 
containerID="472fc01b8744465b9c2ed1ad76d1b2e3c209f557b0ac638ecc5e826952d0861b" exitCode=0 Dec 05 18:28:40 crc kubenswrapper[4756]: I1205 18:28:40.504074 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" event={"ID":"5b789592-0645-42b6-ae30-a9eb32ed067e","Type":"ContainerDied","Data":"472fc01b8744465b9c2ed1ad76d1b2e3c209f557b0ac638ecc5e826952d0861b"} Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.051333 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.248813 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6msv\" (UniqueName: \"kubernetes.io/projected/5b789592-0645-42b6-ae30-a9eb32ed067e-kube-api-access-z6msv\") pod \"5b789592-0645-42b6-ae30-a9eb32ed067e\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.248983 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-ssh-key\") pod \"5b789592-0645-42b6-ae30-a9eb32ed067e\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.249311 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-inventory\") pod \"5b789592-0645-42b6-ae30-a9eb32ed067e\" (UID: \"5b789592-0645-42b6-ae30-a9eb32ed067e\") " Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.260871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b789592-0645-42b6-ae30-a9eb32ed067e-kube-api-access-z6msv" (OuterVolumeSpecName: "kube-api-access-z6msv") pod "5b789592-0645-42b6-ae30-a9eb32ed067e" (UID: "5b789592-0645-42b6-ae30-a9eb32ed067e"). InnerVolumeSpecName "kube-api-access-z6msv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.308760 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5b789592-0645-42b6-ae30-a9eb32ed067e" (UID: "5b789592-0645-42b6-ae30-a9eb32ed067e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.313532 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-inventory" (OuterVolumeSpecName: "inventory") pod "5b789592-0645-42b6-ae30-a9eb32ed067e" (UID: "5b789592-0645-42b6-ae30-a9eb32ed067e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.353371 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6msv\" (UniqueName: \"kubernetes.io/projected/5b789592-0645-42b6-ae30-a9eb32ed067e-kube-api-access-z6msv\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.353408 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.353437 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b789592-0645-42b6-ae30-a9eb32ed067e-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.530238 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" event={"ID":"5b789592-0645-42b6-ae30-a9eb32ed067e","Type":"ContainerDied","Data":"abbc391d298a2fdaeaf18f746452b7aef67cad4b24596354c35de0837eeabd80"} Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.530335 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abbc391d298a2fdaeaf18f746452b7aef67cad4b24596354c35de0837eeabd80" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.530787 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-4bxkw" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.678019 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-hw8vv"] Dec 05 18:28:42 crc kubenswrapper[4756]: E1205 18:28:42.678925 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b789592-0645-42b6-ae30-a9eb32ed067e" containerName="run-os-openstack-openstack-cell1" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.678975 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b789592-0645-42b6-ae30-a9eb32ed067e" containerName="run-os-openstack-openstack-cell1" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.679588 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b789592-0645-42b6-ae30-a9eb32ed067e" containerName="run-os-openstack-openstack-cell1" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.681995 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.686695 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.686853 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.687093 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.687132 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.690388 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-hw8vv"] Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.872043 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-inventory\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.872138 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.872348 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ngzk\" (UniqueName: \"kubernetes.io/projected/4212fd48-503e-40ea-9f8b-efb0cc7d999f-kube-api-access-7ngzk\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.974873 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ngzk\" (UniqueName: \"kubernetes.io/projected/4212fd48-503e-40ea-9f8b-efb0cc7d999f-kube-api-access-7ngzk\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.975031 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-inventory\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.975104 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:42 crc kubenswrapper[4756]: I1205 18:28:42.982492 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:43 crc kubenswrapper[4756]: I1205 18:28:43.000088 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-inventory\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:43 crc kubenswrapper[4756]: I1205 18:28:43.003887 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ngzk\" (UniqueName: \"kubernetes.io/projected/4212fd48-503e-40ea-9f8b-efb0cc7d999f-kube-api-access-7ngzk\") pod \"reboot-os-openstack-openstack-cell1-hw8vv\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:43 crc kubenswrapper[4756]: I1205 18:28:43.033281 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:28:43 crc kubenswrapper[4756]: I1205 18:28:43.675233 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-hw8vv"] Dec 05 18:28:44 crc kubenswrapper[4756]: I1205 18:28:44.557284 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" event={"ID":"4212fd48-503e-40ea-9f8b-efb0cc7d999f","Type":"ContainerStarted","Data":"9fc4ec14d660b79f5c6d05c0d9391bb9d83d50ed005a5019221ec7897fa213ce"} Dec 05 18:28:44 crc kubenswrapper[4756]: I1205 18:28:44.557881 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" event={"ID":"4212fd48-503e-40ea-9f8b-efb0cc7d999f","Type":"ContainerStarted","Data":"e3d8e3310de0affd49a405099abf5e413ac5ab9f61eedf9a1d5f6163536a73de"} Dec 05 18:28:46 crc kubenswrapper[4756]: I1205 18:28:46.050478 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:28:46 crc kubenswrapper[4756]: E1205 18:28:46.051737 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:28:58 crc kubenswrapper[4756]: I1205 18:28:58.055361 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:28:58 crc kubenswrapper[4756]: E1205 18:28:58.056746 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:29:01 crc kubenswrapper[4756]: 
I1205 18:29:01.802336 4756 generic.go:334] "Generic (PLEG): container finished" podID="4212fd48-503e-40ea-9f8b-efb0cc7d999f" containerID="9fc4ec14d660b79f5c6d05c0d9391bb9d83d50ed005a5019221ec7897fa213ce" exitCode=0 Dec 05 18:29:01 crc kubenswrapper[4756]: I1205 18:29:01.802487 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" event={"ID":"4212fd48-503e-40ea-9f8b-efb0cc7d999f","Type":"ContainerDied","Data":"9fc4ec14d660b79f5c6d05c0d9391bb9d83d50ed005a5019221ec7897fa213ce"} Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.405294 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.459673 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-ssh-key\") pod \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.460228 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ngzk\" (UniqueName: \"kubernetes.io/projected/4212fd48-503e-40ea-9f8b-efb0cc7d999f-kube-api-access-7ngzk\") pod \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.460543 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-inventory\") pod \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\" (UID: \"4212fd48-503e-40ea-9f8b-efb0cc7d999f\") " Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.476028 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4212fd48-503e-40ea-9f8b-efb0cc7d999f-kube-api-access-7ngzk" (OuterVolumeSpecName: "kube-api-access-7ngzk") pod "4212fd48-503e-40ea-9f8b-efb0cc7d999f" (UID: "4212fd48-503e-40ea-9f8b-efb0cc7d999f"). InnerVolumeSpecName "kube-api-access-7ngzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.492746 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4212fd48-503e-40ea-9f8b-efb0cc7d999f" (UID: "4212fd48-503e-40ea-9f8b-efb0cc7d999f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.513121 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-inventory" (OuterVolumeSpecName: "inventory") pod "4212fd48-503e-40ea-9f8b-efb0cc7d999f" (UID: "4212fd48-503e-40ea-9f8b-efb0cc7d999f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.564678 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.564720 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4212fd48-503e-40ea-9f8b-efb0cc7d999f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.564735 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ngzk\" (UniqueName: \"kubernetes.io/projected/4212fd48-503e-40ea-9f8b-efb0cc7d999f-kube-api-access-7ngzk\") on node \"crc\" DevicePath \"\"" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.831979 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" event={"ID":"4212fd48-503e-40ea-9f8b-efb0cc7d999f","Type":"ContainerDied","Data":"e3d8e3310de0affd49a405099abf5e413ac5ab9f61eedf9a1d5f6163536a73de"} Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.832047 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3d8e3310de0affd49a405099abf5e413ac5ab9f61eedf9a1d5f6163536a73de" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.832393 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-hw8vv" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.966108 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-k4w9p"] Dec 05 18:29:03 crc kubenswrapper[4756]: E1205 18:29:03.966945 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4212fd48-503e-40ea-9f8b-efb0cc7d999f" containerName="reboot-os-openstack-openstack-cell1" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.966977 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4212fd48-503e-40ea-9f8b-efb0cc7d999f" containerName="reboot-os-openstack-openstack-cell1" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.967598 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4212fd48-503e-40ea-9f8b-efb0cc7d999f" containerName="reboot-os-openstack-openstack-cell1" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.969369 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.972296 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.973282 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.973554 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.973763 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.974166 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.974485 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.977221 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.983143 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Dec 05 18:29:03 crc kubenswrapper[4756]: I1205 18:29:03.984913 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-k4w9p"] Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.077532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.077777 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.077876 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ssh-key\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.078150 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8zc4\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-kube-api-access-s8zc4\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc 
kubenswrapper[4756]: I1205 18:29:04.078227 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.078460 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.079616 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.080633 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.080729 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.080839 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-inventory\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.080917 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.080995 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.081057 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.081086 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.182867 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.182915 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.182942 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.182983 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-inventory\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183012 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183047 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183074 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183092 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183139 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183163 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183182 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ssh-key\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183207 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8zc4\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-kube-api-access-s8zc4\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.183226 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" Dec 05 18:29:04 crc kubenswrapper[4756]: 
I1205 18:29:04.183271 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.187877 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.188486 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.190206 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.190546 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.190644 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.191064 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.191546 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.191947 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.192988 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ssh-key\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.193246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-inventory\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.193747 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.194539 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.195288 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.220093 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8zc4\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-kube-api-access-s8zc4\") pod \"install-certs-openstack-openstack-cell1-k4w9p\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") " pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.309595 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:04 crc kubenswrapper[4756]: I1205 18:29:04.875518 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-k4w9p"]
Dec 05 18:29:05 crc kubenswrapper[4756]: I1205 18:29:05.854386 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" event={"ID":"80f46aab-07ca-4d94-be4b-87d3598beb35","Type":"ContainerStarted","Data":"8b7f0b0499ee12e7823f30208ce5e1af3d06d223ce1156649de6ab7c24d058cc"}
Dec 05 18:29:05 crc kubenswrapper[4756]: I1205 18:29:05.854822 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" event={"ID":"80f46aab-07ca-4d94-be4b-87d3598beb35","Type":"ContainerStarted","Data":"0061f2d757d769cd696c9b5423ff329a1d7b7ba0afb41266e0e7001116711837"}
Dec 05 18:29:05 crc kubenswrapper[4756]: I1205 18:29:05.889600 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" podStartSLOduration=2.440253725 podStartE2EDuration="2.889572676s" podCreationTimestamp="2025-12-05 18:29:03 +0000 UTC" firstStartedPulling="2025-12-05 18:29:04.882939434 +0000 UTC m=+8005.121612986" lastFinishedPulling="2025-12-05 18:29:05.332258385 +0000 UTC m=+8005.570931937" observedRunningTime="2025-12-05 18:29:05.872743384 +0000 UTC m=+8006.111416906" watchObservedRunningTime="2025-12-05 18:29:05.889572676 +0000 UTC m=+8006.128246218"
Dec 05 18:29:11 crc kubenswrapper[4756]: I1205 18:29:11.050884 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"
Dec 05 18:29:11 crc kubenswrapper[4756]: E1205 18:29:11.052227 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:29:23 crc kubenswrapper[4756]: I1205 18:29:23.049876 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"
Dec 05 18:29:23 crc kubenswrapper[4756]: E1205 18:29:23.050614 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.556221 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vd6r7"]
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.559602 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.570945 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vd6r7"]
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.588961 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-utilities\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.589153 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gbvn\" (UniqueName: \"kubernetes.io/projected/8eb7fc5e-57b6-4b19-8772-9768b7537a52-kube-api-access-9gbvn\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.589357 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-catalog-content\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.691845 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-catalog-content\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.692123 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-utilities\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.692212 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gbvn\" (UniqueName: \"kubernetes.io/projected/8eb7fc5e-57b6-4b19-8772-9768b7537a52-kube-api-access-9gbvn\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.692503 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-catalog-content\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.692838 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-utilities\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.719296 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gbvn\" (UniqueName: \"kubernetes.io/projected/8eb7fc5e-57b6-4b19-8772-9768b7537a52-kube-api-access-9gbvn\") pod \"certified-operators-vd6r7\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") " pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:34 crc kubenswrapper[4756]: I1205 18:29:34.893602 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:35 crc kubenswrapper[4756]: I1205 18:29:35.421521 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vd6r7"]
Dec 05 18:29:36 crc kubenswrapper[4756]: I1205 18:29:36.253153 4756 generic.go:334] "Generic (PLEG): container finished" podID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerID="275292ef9a25f65b5ec79dedfff6c35dac73b56f47524a89ab90a74c920f69b9" exitCode=0
Dec 05 18:29:36 crc kubenswrapper[4756]: I1205 18:29:36.253521 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd6r7" event={"ID":"8eb7fc5e-57b6-4b19-8772-9768b7537a52","Type":"ContainerDied","Data":"275292ef9a25f65b5ec79dedfff6c35dac73b56f47524a89ab90a74c920f69b9"}
Dec 05 18:29:36 crc kubenswrapper[4756]: I1205 18:29:36.253558 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd6r7" event={"ID":"8eb7fc5e-57b6-4b19-8772-9768b7537a52","Type":"ContainerStarted","Data":"9b96bc0c0efa555281218d7c208adee49a100e543c13faded28af555e080812b"}
Dec 05 18:29:38 crc kubenswrapper[4756]: I1205 18:29:38.050455 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"
Dec 05 18:29:38 crc kubenswrapper[4756]: E1205 18:29:38.052297 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:29:38 crc kubenswrapper[4756]: I1205 18:29:38.282792 4756 generic.go:334] "Generic (PLEG): container finished" podID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerID="dbb60ec54daf5512ce12c99353c3f1f001041b159705a2ae8cdd70e26e820d00" exitCode=0
Dec 05 18:29:38 crc kubenswrapper[4756]: I1205 18:29:38.282841 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd6r7" event={"ID":"8eb7fc5e-57b6-4b19-8772-9768b7537a52","Type":"ContainerDied","Data":"dbb60ec54daf5512ce12c99353c3f1f001041b159705a2ae8cdd70e26e820d00"}
Dec 05 18:29:39 crc kubenswrapper[4756]: I1205 18:29:39.303780 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd6r7" event={"ID":"8eb7fc5e-57b6-4b19-8772-9768b7537a52","Type":"ContainerStarted","Data":"825ec1e3b6d31f48cda9c0492ab5b4607db028eb8b2c347bbe8416a61783a57d"}
Dec 05 18:29:43 crc kubenswrapper[4756]: I1205 18:29:43.366817 4756 generic.go:334] "Generic (PLEG): container finished" podID="80f46aab-07ca-4d94-be4b-87d3598beb35" containerID="8b7f0b0499ee12e7823f30208ce5e1af3d06d223ce1156649de6ab7c24d058cc" exitCode=0
Dec 05 18:29:43 crc kubenswrapper[4756]: I1205 18:29:43.366917 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" event={"ID":"80f46aab-07ca-4d94-be4b-87d3598beb35","Type":"ContainerDied","Data":"8b7f0b0499ee12e7823f30208ce5e1af3d06d223ce1156649de6ab7c24d058cc"}
Dec 05 18:29:43 crc kubenswrapper[4756]: I1205 18:29:43.416033 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vd6r7" podStartSLOduration=6.959453344 podStartE2EDuration="9.416003518s" podCreationTimestamp="2025-12-05 18:29:34 +0000 UTC" firstStartedPulling="2025-12-05 18:29:36.256654039 +0000 UTC m=+8036.495327561" lastFinishedPulling="2025-12-05 18:29:38.713204193 +0000 UTC m=+8038.951877735" observedRunningTime="2025-12-05 18:29:39.338041197 +0000 UTC m=+8039.576714709" watchObservedRunningTime="2025-12-05 18:29:43.416003518 +0000 UTC m=+8043.654677070"
Dec 05 18:29:44 crc kubenswrapper[4756]: I1205 18:29:44.894602 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:44 crc kubenswrapper[4756]: I1205 18:29:44.894903 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:44 crc kubenswrapper[4756]: I1205 18:29:44.908862 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:44 crc kubenswrapper[4756]: I1205 18:29:44.977071 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008255 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-metadata-combined-ca-bundle\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008358 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-inventory\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008419 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-ovn-default-certs-0\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008484 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-sriov-combined-ca-bundle\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008503 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-libvirt-combined-ca-bundle\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008553 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ovn-combined-ca-bundle\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008570 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-nova-combined-ca-bundle\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008587 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ssh-key\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008622 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-telemetry-default-certs-0\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008703 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8zc4\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-kube-api-access-s8zc4\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008734 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-telemetry-combined-ca-bundle\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008763 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-libvirt-default-certs-0\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008787 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-bootstrap-combined-ca-bundle\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.008803 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-neutron-metadata-default-certs-0\") pod \"80f46aab-07ca-4d94-be4b-87d3598beb35\" (UID: \"80f46aab-07ca-4d94-be4b-87d3598beb35\") "
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.014515 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.014893 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-kube-api-access-s8zc4" (OuterVolumeSpecName: "kube-api-access-s8zc4") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "kube-api-access-s8zc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.016746 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.018067 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.020365 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.020890 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.021050 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.027686 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.028197 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.029625 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.033520 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.037908 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.061934 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-inventory" (OuterVolumeSpecName: "inventory") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.082687 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "80f46aab-07ca-4d94-be4b-87d3598beb35" (UID: "80f46aab-07ca-4d94-be4b-87d3598beb35"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.122540 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.122873 4756 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.122887 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.122901 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.122915 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8zc4\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-kube-api-access-s8zc4\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.122930 4756 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.123007 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.123021 4756 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.123038 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.123052 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.123066 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-inventory\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.123081 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80f46aab-07ca-4d94-be4b-87d3598beb35-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.123095 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.123109 4756 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80f46aab-07ca-4d94-be4b-87d3598beb35-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.393463 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p" event={"ID":"80f46aab-07ca-4d94-be4b-87d3598beb35","Type":"ContainerDied","Data":"0061f2d757d769cd696c9b5423ff329a1d7b7ba0afb41266e0e7001116711837"}
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.393517 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-k4w9p"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.393534 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0061f2d757d769cd696c9b5423ff329a1d7b7ba0afb41266e0e7001116711837"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.496748 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.549266 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gtpb9"]
Dec 05 18:29:45 crc kubenswrapper[4756]: E1205 18:29:45.550914 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f46aab-07ca-4d94-be4b-87d3598beb35" containerName="install-certs-openstack-openstack-cell1"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.550956 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f46aab-07ca-4d94-be4b-87d3598beb35" containerName="install-certs-openstack-openstack-cell1"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.551400 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f46aab-07ca-4d94-be4b-87d3598beb35" containerName="install-certs-openstack-openstack-cell1"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.552512 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.556555 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.556548 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.556692 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.556993 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.557696 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.569778 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gtpb9"]
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.587887 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vd6r7"]
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.746091 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ssh-key\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.746141 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-inventory\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.746204 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnzdj\" (UniqueName: \"kubernetes.io/projected/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-kube-api-access-dnzdj\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.746241 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.746277 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.850746 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ssh-key\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.850802 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-inventory\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.850848 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnzdj\" (UniqueName: \"kubernetes.io/projected/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-kube-api-access-dnzdj\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.850876 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.850903 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.851820 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.865884 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ssh-key\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.866566 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-inventory\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.881322 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:45 crc kubenswrapper[4756]: I1205 18:29:45.894984 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnzdj\" (UniqueName: \"kubernetes.io/projected/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-kube-api-access-dnzdj\") pod \"ovn-openstack-openstack-cell1-gtpb9\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:46 crc kubenswrapper[4756]: I1205 18:29:46.171243 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gtpb9"
Dec 05 18:29:47 crc kubenswrapper[4756]: I1205 18:29:47.415087 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vd6r7" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerName="registry-server" containerID="cri-o://825ec1e3b6d31f48cda9c0492ab5b4607db028eb8b2c347bbe8416a61783a57d" gracePeriod=2
Dec 05 18:29:47 crc kubenswrapper[4756]: I1205 18:29:47.539792 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gtpb9"]
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.430433 4756 generic.go:334] "Generic (PLEG): container finished" podID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerID="825ec1e3b6d31f48cda9c0492ab5b4607db028eb8b2c347bbe8416a61783a57d" exitCode=0
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.430547 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd6r7" event={"ID":"8eb7fc5e-57b6-4b19-8772-9768b7537a52","Type":"ContainerDied","Data":"825ec1e3b6d31f48cda9c0492ab5b4607db028eb8b2c347bbe8416a61783a57d"}
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.432780 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gtpb9" event={"ID":"47d509cd-7752-45f8-b9f0-7dd6cf3e0311","Type":"ContainerStarted","Data":"4045e0fc556ab317e93f4376e46b0ee12cbfb7e33bb28a7104f44aa8f90d9070"}
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.692297 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.734610 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-catalog-content\") pod \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") "
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.735619 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gbvn\" (UniqueName: \"kubernetes.io/projected/8eb7fc5e-57b6-4b19-8772-9768b7537a52-kube-api-access-9gbvn\") pod \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") "
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.735775 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-utilities\") pod \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\" (UID: \"8eb7fc5e-57b6-4b19-8772-9768b7537a52\") "
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.738152 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-utilities" (OuterVolumeSpecName: "utilities") pod "8eb7fc5e-57b6-4b19-8772-9768b7537a52" (UID: "8eb7fc5e-57b6-4b19-8772-9768b7537a52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.754761 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb7fc5e-57b6-4b19-8772-9768b7537a52-kube-api-access-9gbvn" (OuterVolumeSpecName: "kube-api-access-9gbvn") pod "8eb7fc5e-57b6-4b19-8772-9768b7537a52" (UID: "8eb7fc5e-57b6-4b19-8772-9768b7537a52"). InnerVolumeSpecName "kube-api-access-9gbvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.835253 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8eb7fc5e-57b6-4b19-8772-9768b7537a52" (UID: "8eb7fc5e-57b6-4b19-8772-9768b7537a52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.838109 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gbvn\" (UniqueName: \"kubernetes.io/projected/8eb7fc5e-57b6-4b19-8772-9768b7537a52-kube-api-access-9gbvn\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.838144 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:48 crc kubenswrapper[4756]: I1205 18:29:48.838156 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb7fc5e-57b6-4b19-8772-9768b7537a52-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.452447 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vd6r7" event={"ID":"8eb7fc5e-57b6-4b19-8772-9768b7537a52","Type":"ContainerDied","Data":"9b96bc0c0efa555281218d7c208adee49a100e543c13faded28af555e080812b"}
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.452510 4756 scope.go:117] "RemoveContainer" containerID="825ec1e3b6d31f48cda9c0492ab5b4607db028eb8b2c347bbe8416a61783a57d"
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.452628 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vd6r7"
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.467126 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gtpb9" event={"ID":"47d509cd-7752-45f8-b9f0-7dd6cf3e0311","Type":"ContainerStarted","Data":"bf9a9291f08cc538751c35798ea04ff3332d717d76ccbdcf5e4c62cc9106f2a3"}
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.494353 4756 scope.go:117] "RemoveContainer" containerID="dbb60ec54daf5512ce12c99353c3f1f001041b159705a2ae8cdd70e26e820d00"
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.499498 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-gtpb9" podStartSLOduration=3.832679248 podStartE2EDuration="4.499473607s" podCreationTimestamp="2025-12-05 18:29:45 +0000 UTC" firstStartedPulling="2025-12-05 18:29:47.562993203 +0000 UTC m=+8047.801666725" lastFinishedPulling="2025-12-05 18:29:48.229787572 +0000 UTC m=+8048.468461084" observedRunningTime="2025-12-05 18:29:49.488098792 +0000 UTC m=+8049.726772344" watchObservedRunningTime="2025-12-05 18:29:49.499473607 +0000 UTC m=+8049.738147129"
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.528072 4756 scope.go:117] "RemoveContainer" containerID="275292ef9a25f65b5ec79dedfff6c35dac73b56f47524a89ab90a74c920f69b9"
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.544618 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vd6r7"]
Dec 05 18:29:49 crc kubenswrapper[4756]: I1205 18:29:49.562115 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vd6r7"]
Dec 05 18:29:50 crc kubenswrapper[4756]: I1205 18:29:50.076938 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" path="/var/lib/kubelet/pods/8eb7fc5e-57b6-4b19-8772-9768b7537a52/volumes"
Dec 05 18:29:53 crc kubenswrapper[4756]: I1205 18:29:53.050022 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"
Dec 05 18:29:53 crc kubenswrapper[4756]: E1205 18:29:53.051775 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.184176 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"]
Dec 05 18:30:00 crc kubenswrapper[4756]: E1205 18:30:00.185233 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerName="registry-server"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.185263 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerName="registry-server"
Dec 05 18:30:00 crc kubenswrapper[4756]: E1205 18:30:00.185297 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerName="extract-utilities"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.185309 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerName="extract-utilities"
Dec 05 18:30:00 crc kubenswrapper[4756]: E1205 18:30:00.185349 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerName="extract-content"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.185361 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerName="extract-content"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.185797 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb7fc5e-57b6-4b19-8772-9768b7537a52" containerName="registry-server"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.187105 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.190254 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.190710 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.216271 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"]
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.318089 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjw48\" (UniqueName: \"kubernetes.io/projected/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-kube-api-access-vjw48\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.318169 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-config-volume\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.318503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-secret-volume\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.421765 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjw48\" (UniqueName: \"kubernetes.io/projected/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-kube-api-access-vjw48\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.421841 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-config-volume\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.421894 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-secret-volume\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.423798 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-config-volume\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.437710 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-secret-volume\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.441248 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjw48\" (UniqueName: \"kubernetes.io/projected/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-kube-api-access-vjw48\") pod \"collect-profiles-29415990-jpzpq\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:00 crc kubenswrapper[4756]: I1205 18:30:00.517704 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:01 crc kubenswrapper[4756]: I1205 18:30:01.158768 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"]
Dec 05 18:30:01 crc kubenswrapper[4756]: I1205 18:30:01.626976 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq" event={"ID":"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6","Type":"ContainerStarted","Data":"5a3911457dfbebec3b32ce1a290c71fdfe7f9d45b34d2a3c99eba29454c23f19"}
Dec 05 18:30:01 crc kubenswrapper[4756]: I1205 18:30:01.627275 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq" event={"ID":"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6","Type":"ContainerStarted","Data":"af68abfde3039b70272a381e1290f9cb166c25cb774e826cdcd8bd60f6a1f28d"}
Dec 05 18:30:01 crc kubenswrapper[4756]: I1205 18:30:01.659088 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq" podStartSLOduration=1.659071467 podStartE2EDuration="1.659071467s" podCreationTimestamp="2025-12-05 18:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:30:01.653053635 +0000 UTC m=+8061.891727177" watchObservedRunningTime="2025-12-05 18:30:01.659071467 +0000 UTC m=+8061.897744989"
Dec 05 18:30:02 crc kubenswrapper[4756]: I1205 18:30:02.640656 4756 generic.go:334] "Generic (PLEG): container finished" podID="7d8deec6-9788-4d17-967a-cfbd1cb7f0b6" containerID="5a3911457dfbebec3b32ce1a290c71fdfe7f9d45b34d2a3c99eba29454c23f19" exitCode=0
Dec 05 18:30:02 crc kubenswrapper[4756]: I1205 18:30:02.640712 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq" event={"ID":"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6","Type":"ContainerDied","Data":"5a3911457dfbebec3b32ce1a290c71fdfe7f9d45b34d2a3c99eba29454c23f19"}
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.021742 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.217330 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjw48\" (UniqueName: \"kubernetes.io/projected/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-kube-api-access-vjw48\") pod \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") "
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.217392 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-secret-volume\") pod \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") "
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.217540 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-config-volume\") pod \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\" (UID: \"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6\") "
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.218226 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d8deec6-9788-4d17-967a-cfbd1cb7f0b6" (UID: "7d8deec6-9788-4d17-967a-cfbd1cb7f0b6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.239644 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d8deec6-9788-4d17-967a-cfbd1cb7f0b6" (UID: "7d8deec6-9788-4d17-967a-cfbd1cb7f0b6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.239791 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-kube-api-access-vjw48" (OuterVolumeSpecName: "kube-api-access-vjw48") pod "7d8deec6-9788-4d17-967a-cfbd1cb7f0b6" (UID: "7d8deec6-9788-4d17-967a-cfbd1cb7f0b6"). InnerVolumeSpecName "kube-api-access-vjw48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.321659 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.321702 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjw48\" (UniqueName: \"kubernetes.io/projected/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-kube-api-access-vjw48\") on node \"crc\" DevicePath \"\""
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.321719 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.673168 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq" event={"ID":"7d8deec6-9788-4d17-967a-cfbd1cb7f0b6","Type":"ContainerDied","Data":"af68abfde3039b70272a381e1290f9cb166c25cb774e826cdcd8bd60f6a1f28d"}
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.673452 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af68abfde3039b70272a381e1290f9cb166c25cb774e826cdcd8bd60f6a1f28d"
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.673367 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.746511 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75"]
Dec 05 18:30:04 crc kubenswrapper[4756]: I1205 18:30:04.755891 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415945-9cs75"]
Dec 05 18:30:06 crc kubenswrapper[4756]: I1205 18:30:06.062341 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cde916-c76d-4c7b-8fa6-2e7632818f2c" path="/var/lib/kubelet/pods/d9cde916-c76d-4c7b-8fa6-2e7632818f2c/volumes"
Dec 05 18:30:08 crc kubenswrapper[4756]: I1205 18:30:08.053221 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"
Dec 05 18:30:08 crc kubenswrapper[4756]: E1205 18:30:08.053864 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:30:19 crc kubenswrapper[4756]: I1205 18:30:19.048588 4756 scope.go:117] "RemoveContainer" containerID="7553c094e1595a33ff5d20362be6cc3e66fd159e6f376684479e298a7bde4070"
Dec 05 18:30:21 crc kubenswrapper[4756]: I1205 18:30:21.050567 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8"
Dec 05 18:30:21 crc kubenswrapper[4756]: E1205 18:30:21.051837 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed
container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:30:33 crc kubenswrapper[4756]: I1205 18:30:33.049531 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:30:33 crc kubenswrapper[4756]: E1205 18:30:33.050617 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:30:45 crc kubenswrapper[4756]: I1205 18:30:45.052300 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:30:45 crc kubenswrapper[4756]: E1205 18:30:45.053357 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:30:57 crc kubenswrapper[4756]: I1205 18:30:57.050853 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:30:57 crc kubenswrapper[4756]: E1205 18:30:57.051983 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:31:01 crc kubenswrapper[4756]: I1205 18:31:01.400539 4756 generic.go:334] "Generic (PLEG): container finished" podID="47d509cd-7752-45f8-b9f0-7dd6cf3e0311" containerID="bf9a9291f08cc538751c35798ea04ff3332d717d76ccbdcf5e4c62cc9106f2a3" exitCode=0 Dec 05 18:31:01 crc kubenswrapper[4756]: I1205 18:31:01.400611 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gtpb9" event={"ID":"47d509cd-7752-45f8-b9f0-7dd6cf3e0311","Type":"ContainerDied","Data":"bf9a9291f08cc538751c35798ea04ff3332d717d76ccbdcf5e4c62cc9106f2a3"} Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.007812 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gtpb9" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.138311 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovncontroller-config-0\") pod \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.139012 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnzdj\" (UniqueName: \"kubernetes.io/projected/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-kube-api-access-dnzdj\") pod \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.139114 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovn-combined-ca-bundle\") pod \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.139467 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ssh-key\") pod \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.139533 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-inventory\") pod \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\" (UID: \"47d509cd-7752-45f8-b9f0-7dd6cf3e0311\") " Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.145340 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "47d509cd-7752-45f8-b9f0-7dd6cf3e0311" (UID: "47d509cd-7752-45f8-b9f0-7dd6cf3e0311"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.146100 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-kube-api-access-dnzdj" (OuterVolumeSpecName: "kube-api-access-dnzdj") pod "47d509cd-7752-45f8-b9f0-7dd6cf3e0311" (UID: "47d509cd-7752-45f8-b9f0-7dd6cf3e0311"). InnerVolumeSpecName "kube-api-access-dnzdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.177964 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-inventory" (OuterVolumeSpecName: "inventory") pod "47d509cd-7752-45f8-b9f0-7dd6cf3e0311" (UID: "47d509cd-7752-45f8-b9f0-7dd6cf3e0311"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.188480 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "47d509cd-7752-45f8-b9f0-7dd6cf3e0311" (UID: "47d509cd-7752-45f8-b9f0-7dd6cf3e0311"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.206666 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "47d509cd-7752-45f8-b9f0-7dd6cf3e0311" (UID: "47d509cd-7752-45f8-b9f0-7dd6cf3e0311"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.243583 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnzdj\" (UniqueName: \"kubernetes.io/projected/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-kube-api-access-dnzdj\") on node \"crc\" DevicePath \"\"" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.243631 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.243652 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.243673 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.243693 4756 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47d509cd-7752-45f8-b9f0-7dd6cf3e0311-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.431088 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gtpb9" event={"ID":"47d509cd-7752-45f8-b9f0-7dd6cf3e0311","Type":"ContainerDied","Data":"4045e0fc556ab317e93f4376e46b0ee12cbfb7e33bb28a7104f44aa8f90d9070"} Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.431127 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4045e0fc556ab317e93f4376e46b0ee12cbfb7e33bb28a7104f44aa8f90d9070" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.431162 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gtpb9" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.592046 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-djz95"] Dec 05 18:31:03 crc kubenswrapper[4756]: E1205 18:31:03.592901 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d509cd-7752-45f8-b9f0-7dd6cf3e0311" containerName="ovn-openstack-openstack-cell1" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.592926 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d509cd-7752-45f8-b9f0-7dd6cf3e0311" containerName="ovn-openstack-openstack-cell1" Dec 05 18:31:03 crc kubenswrapper[4756]: E1205 18:31:03.592969 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8deec6-9788-4d17-967a-cfbd1cb7f0b6" containerName="collect-profiles" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.592978 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8deec6-9788-4d17-967a-cfbd1cb7f0b6" containerName="collect-profiles" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.593224 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8deec6-9788-4d17-967a-cfbd1cb7f0b6" containerName="collect-profiles" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.593262 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d509cd-7752-45f8-b9f0-7dd6cf3e0311" containerName="ovn-openstack-openstack-cell1" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.594115 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.596790 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.596945 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.597287 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.597435 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.598039 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.599842 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.615450 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-djz95"] Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.652860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.652927 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.652982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6w84\" (UniqueName: \"kubernetes.io/projected/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-kube-api-access-v6w84\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.653027 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.653118 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.653190 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.755060 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.755138 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.755274 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 
18:31:03.755309 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.755353 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6w84\" (UniqueName: \"kubernetes.io/projected/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-kube-api-access-v6w84\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.755396 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.762478 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.762980 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.764622 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.764675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.771248 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 
crc kubenswrapper[4756]: I1205 18:31:03.779488 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6w84\" (UniqueName: \"kubernetes.io/projected/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-kube-api-access-v6w84\") pod \"neutron-metadata-openstack-openstack-cell1-djz95\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:03 crc kubenswrapper[4756]: I1205 18:31:03.913778 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:31:04 crc kubenswrapper[4756]: I1205 18:31:04.521361 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 18:31:04 crc kubenswrapper[4756]: I1205 18:31:04.522779 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-djz95"] Dec 05 18:31:05 crc kubenswrapper[4756]: I1205 18:31:05.458096 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" event={"ID":"268ef7e0-0883-4119-a1cd-e1e9f9d061fe","Type":"ContainerStarted","Data":"504eec7669d6cbf14c4b98dca3a8fa88e277c2ed66b08ee1bc2a508e05ed8d49"} Dec 05 18:31:05 crc kubenswrapper[4756]: I1205 18:31:05.458802 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" event={"ID":"268ef7e0-0883-4119-a1cd-e1e9f9d061fe","Type":"ContainerStarted","Data":"fac1af144b507591546422dc5aedc6a87c54c1757bda8060bbdc97a8f99698a2"} Dec 05 18:31:05 crc kubenswrapper[4756]: I1205 18:31:05.490514 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" podStartSLOduration=2.014457829 podStartE2EDuration="2.490483538s" podCreationTimestamp="2025-12-05 18:31:03 +0000 UTC" firstStartedPulling="2025-12-05 18:31:04.521177767 +0000 UTC m=+8124.759851279" lastFinishedPulling="2025-12-05 18:31:04.997203466 +0000 UTC m=+8125.235876988" observedRunningTime="2025-12-05 18:31:05.483237914 +0000 UTC m=+8125.721911466" watchObservedRunningTime="2025-12-05 18:31:05.490483538 +0000 UTC m=+8125.729157100" Dec 05 18:31:08 crc kubenswrapper[4756]: I1205 18:31:08.050878 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:31:08 crc kubenswrapper[4756]: E1205 18:31:08.051985 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:31:23 crc kubenswrapper[4756]: I1205 18:31:23.050030 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:31:23 crc kubenswrapper[4756]: E1205 18:31:23.051245 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:31:35 crc kubenswrapper[4756]: I1205 18:31:35.053897 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:31:35 crc kubenswrapper[4756]: E1205 18:31:35.062854 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:31:48 crc kubenswrapper[4756]: I1205 18:31:48.050116 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:31:48 crc kubenswrapper[4756]: E1205 18:31:48.051643 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:32:01 crc kubenswrapper[4756]: I1205 18:32:01.049909 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:32:01 crc kubenswrapper[4756]: E1205 18:32:01.050606 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:32:03 crc kubenswrapper[4756]: I1205 18:32:03.200126 4756 generic.go:334] "Generic (PLEG): container finished" podID="268ef7e0-0883-4119-a1cd-e1e9f9d061fe" containerID="504eec7669d6cbf14c4b98dca3a8fa88e277c2ed66b08ee1bc2a508e05ed8d49" exitCode=0 Dec 05 18:32:03 crc kubenswrapper[4756]: I1205 18:32:03.200232 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" event={"ID":"268ef7e0-0883-4119-a1cd-e1e9f9d061fe","Type":"ContainerDied","Data":"504eec7669d6cbf14c4b98dca3a8fa88e277c2ed66b08ee1bc2a508e05ed8d49"} Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.729048 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.801952 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-nova-metadata-neutron-config-0\") pod \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.802041 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6w84\" (UniqueName: \"kubernetes.io/projected/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-kube-api-access-v6w84\") pod \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.802174 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-inventory\") pod \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.802218 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.802341 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-ssh-key\") pod \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.802447 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-metadata-combined-ca-bundle\") pod \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\" (UID: \"268ef7e0-0883-4119-a1cd-e1e9f9d061fe\") " Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.810841 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-kube-api-access-v6w84" (OuterVolumeSpecName: "kube-api-access-v6w84") pod "268ef7e0-0883-4119-a1cd-e1e9f9d061fe" (UID: "268ef7e0-0883-4119-a1cd-e1e9f9d061fe"). InnerVolumeSpecName "kube-api-access-v6w84". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.814562 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "268ef7e0-0883-4119-a1cd-e1e9f9d061fe" (UID: "268ef7e0-0883-4119-a1cd-e1e9f9d061fe"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.843752 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-inventory" (OuterVolumeSpecName: "inventory") pod "268ef7e0-0883-4119-a1cd-e1e9f9d061fe" (UID: "268ef7e0-0883-4119-a1cd-e1e9f9d061fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.845697 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "268ef7e0-0883-4119-a1cd-e1e9f9d061fe" (UID: "268ef7e0-0883-4119-a1cd-e1e9f9d061fe"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.852522 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "268ef7e0-0883-4119-a1cd-e1e9f9d061fe" (UID: "268ef7e0-0883-4119-a1cd-e1e9f9d061fe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.858797 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "268ef7e0-0883-4119-a1cd-e1e9f9d061fe" (UID: "268ef7e0-0883-4119-a1cd-e1e9f9d061fe"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.905646 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.905694 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.905716 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6w84\" (UniqueName: \"kubernetes.io/projected/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-kube-api-access-v6w84\") on node \"crc\" DevicePath \"\"" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.905739 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.905759 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:32:04 crc kubenswrapper[4756]: I1205 18:32:04.905779 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/268ef7e0-0883-4119-a1cd-e1e9f9d061fe-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.228321 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" event={"ID":"268ef7e0-0883-4119-a1cd-e1e9f9d061fe","Type":"ContainerDied","Data":"fac1af144b507591546422dc5aedc6a87c54c1757bda8060bbdc97a8f99698a2"} Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.228371 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fac1af144b507591546422dc5aedc6a87c54c1757bda8060bbdc97a8f99698a2" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.228491 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-djz95" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.375855 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-wxn8n"] Dec 05 18:32:05 crc kubenswrapper[4756]: E1205 18:32:05.376355 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268ef7e0-0883-4119-a1cd-e1e9f9d061fe" containerName="neutron-metadata-openstack-openstack-cell1" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.376377 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="268ef7e0-0883-4119-a1cd-e1e9f9d061fe" containerName="neutron-metadata-openstack-openstack-cell1" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.376626 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="268ef7e0-0883-4119-a1cd-e1e9f9d061fe" containerName="neutron-metadata-openstack-openstack-cell1" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.377447 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.379604 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.379757 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.380777 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.381139 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.382918 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.386989 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-wxn8n"] Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.416620 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8knvd\" (UniqueName: \"kubernetes.io/projected/d7987f1c-bd96-4436-9fc1-505da115c1ae-kube-api-access-8knvd\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.416750 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-ssh-key\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.416801 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.416906 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-inventory\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.416951 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.518752 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-combined-ca-bundle\") pod 
\"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.519565 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-inventory\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.519629 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.519738 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8knvd\" (UniqueName: \"kubernetes.io/projected/d7987f1c-bd96-4436-9fc1-505da115c1ae-kube-api-access-8knvd\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.519884 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-ssh-key\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.524555 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.524716 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.525163 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-ssh-key\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.529942 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-inventory\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.543113 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8knvd\" 
(UniqueName: \"kubernetes.io/projected/d7987f1c-bd96-4436-9fc1-505da115c1ae-kube-api-access-8knvd\") pod \"libvirt-openstack-openstack-cell1-wxn8n\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:05 crc kubenswrapper[4756]: I1205 18:32:05.728531 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:32:06 crc kubenswrapper[4756]: I1205 18:32:06.220676 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-wxn8n"] Dec 05 18:32:06 crc kubenswrapper[4756]: I1205 18:32:06.237032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" event={"ID":"d7987f1c-bd96-4436-9fc1-505da115c1ae","Type":"ContainerStarted","Data":"9b1f8e5ea724629468250bdb59f4c8bb7e295c77f4465d1e3241216ca8c7cdfd"} Dec 05 18:32:07 crc kubenswrapper[4756]: I1205 18:32:07.249152 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" event={"ID":"d7987f1c-bd96-4436-9fc1-505da115c1ae","Type":"ContainerStarted","Data":"fb106dfd9cca3c95b2365c9a10207534f9a0790598f2d54e20fe1aebdd48e268"} Dec 05 18:32:13 crc kubenswrapper[4756]: I1205 18:32:13.049323 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:32:13 crc kubenswrapper[4756]: E1205 18:32:13.050263 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:32:26 crc kubenswrapper[4756]: I1205 18:32:26.050666 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:32:26 crc kubenswrapper[4756]: E1205 18:32:26.052131 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:32:40 crc kubenswrapper[4756]: I1205 18:32:40.059890 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:32:40 crc kubenswrapper[4756]: E1205 18:32:40.060903 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:32:54 crc kubenswrapper[4756]: I1205 18:32:54.050775 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:32:54 crc kubenswrapper[4756]: I1205 18:32:54.864692 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"a5d23aadcd4d72fd37d141ecd6190c59b2d01538fcde7179f516cb258c78f879"} Dec 05 18:32:54 crc kubenswrapper[4756]: I1205 18:32:54.900647 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" podStartSLOduration=49.440804939 podStartE2EDuration="49.900619383s" podCreationTimestamp="2025-12-05 18:32:05 +0000 UTC" firstStartedPulling="2025-12-05 18:32:06.22301661 +0000 UTC m=+8186.461690122" lastFinishedPulling="2025-12-05 18:32:06.682831054 +0000 UTC m=+8186.921504566" observedRunningTime="2025-12-05 18:32:07.267597452 +0000 UTC m=+8187.506270964" watchObservedRunningTime="2025-12-05 18:32:54.900619383 +0000 UTC m=+8235.139292955" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.626934 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9jsxw"] Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.631644 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.660668 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jsxw"] Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.720627 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-catalog-content\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.720723 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4k8f\" (UniqueName: \"kubernetes.io/projected/66b81054-d86a-4e01-98e4-0464a51b9dff-kube-api-access-d4k8f\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.720910 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-utilities\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.822892 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-catalog-content\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.823005 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4k8f\" (UniqueName: \"kubernetes.io/projected/66b81054-d86a-4e01-98e4-0464a51b9dff-kube-api-access-d4k8f\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.823162 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-utilities\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.823471 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-catalog-content\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.823705 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-utilities\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.845047 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4k8f\" (UniqueName: \"kubernetes.io/projected/66b81054-d86a-4e01-98e4-0464a51b9dff-kube-api-access-d4k8f\") pod \"redhat-operators-9jsxw\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:37 crc kubenswrapper[4756]: I1205 18:33:37.961882 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:38 crc kubenswrapper[4756]: I1205 18:33:38.502874 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jsxw"] Dec 05 18:33:39 crc kubenswrapper[4756]: I1205 18:33:39.476639 4756 generic.go:334] "Generic (PLEG): container finished" podID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerID="e83c0424069f2422958cc12aa9b0de5b81701a634129c903a82c7d2dd23342ea" exitCode=0 Dec 05 18:33:39 crc kubenswrapper[4756]: I1205 18:33:39.476993 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jsxw" event={"ID":"66b81054-d86a-4e01-98e4-0464a51b9dff","Type":"ContainerDied","Data":"e83c0424069f2422958cc12aa9b0de5b81701a634129c903a82c7d2dd23342ea"} Dec 05 18:33:39 crc kubenswrapper[4756]: I1205 18:33:39.477035 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jsxw" event={"ID":"66b81054-d86a-4e01-98e4-0464a51b9dff","Type":"ContainerStarted","Data":"71ea18d111bea872eea55e6d3d42b336902161e33237d22be76f285c62072535"} Dec 05 18:33:40 crc kubenswrapper[4756]: I1205 18:33:40.489113 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jsxw" event={"ID":"66b81054-d86a-4e01-98e4-0464a51b9dff","Type":"ContainerStarted","Data":"08124edaae84b3c24c44c7e4bd1a0f94ae9db57d8b9671337ff133d987b8c6f8"} Dec 05 18:33:41 crc kubenswrapper[4756]: I1205 18:33:41.511207 4756 generic.go:334] "Generic (PLEG): container finished" podID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerID="08124edaae84b3c24c44c7e4bd1a0f94ae9db57d8b9671337ff133d987b8c6f8" exitCode=0 Dec 05 18:33:41 crc kubenswrapper[4756]: I1205 18:33:41.511395 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jsxw" 
event={"ID":"66b81054-d86a-4e01-98e4-0464a51b9dff","Type":"ContainerDied","Data":"08124edaae84b3c24c44c7e4bd1a0f94ae9db57d8b9671337ff133d987b8c6f8"} Dec 05 18:33:43 crc kubenswrapper[4756]: I1205 18:33:43.547681 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jsxw" event={"ID":"66b81054-d86a-4e01-98e4-0464a51b9dff","Type":"ContainerStarted","Data":"c16847e9b1c1faf8cf9d2d2972b64584891766f9f52b654dad794b2400e66800"} Dec 05 18:33:43 crc kubenswrapper[4756]: I1205 18:33:43.570685 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9jsxw" podStartSLOduration=3.91304997 podStartE2EDuration="6.570653442s" podCreationTimestamp="2025-12-05 18:33:37 +0000 UTC" firstStartedPulling="2025-12-05 18:33:39.479953399 +0000 UTC m=+8279.718626941" lastFinishedPulling="2025-12-05 18:33:42.137556871 +0000 UTC m=+8282.376230413" observedRunningTime="2025-12-05 18:33:43.56944277 +0000 UTC m=+8283.808116292" watchObservedRunningTime="2025-12-05 18:33:43.570653442 +0000 UTC m=+8283.809326954" Dec 05 18:33:48 crc kubenswrapper[4756]: I1205 18:33:48.135019 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:48 crc kubenswrapper[4756]: I1205 18:33:48.145271 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:49 crc kubenswrapper[4756]: I1205 18:33:49.229520 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9jsxw" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="registry-server" probeResult="failure" output=< Dec 05 18:33:49 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 18:33:49 crc kubenswrapper[4756]: > Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.202752 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f4fqj"] Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.208984 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.219862 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4fqj"] Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.279949 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzmp\" (UniqueName: \"kubernetes.io/projected/7abe2354-3116-4515-82b1-fd87eb9368fd-kube-api-access-llzmp\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.280140 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abe2354-3116-4515-82b1-fd87eb9368fd-catalog-content\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.280184 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abe2354-3116-4515-82b1-fd87eb9368fd-utilities\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.383606 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llzmp\" (UniqueName: \"kubernetes.io/projected/7abe2354-3116-4515-82b1-fd87eb9368fd-kube-api-access-llzmp\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.384012 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abe2354-3116-4515-82b1-fd87eb9368fd-catalog-content\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.384040 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abe2354-3116-4515-82b1-fd87eb9368fd-utilities\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.384656 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abe2354-3116-4515-82b1-fd87eb9368fd-utilities\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.387455 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abe2354-3116-4515-82b1-fd87eb9368fd-catalog-content\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.411881 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-llzmp\" (UniqueName: \"kubernetes.io/projected/7abe2354-3116-4515-82b1-fd87eb9368fd-kube-api-access-llzmp\") pod \"community-operators-f4fqj\" (UID: \"7abe2354-3116-4515-82b1-fd87eb9368fd\") " pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:57 crc kubenswrapper[4756]: I1205 18:33:57.558317 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:33:58 crc kubenswrapper[4756]: I1205 18:33:58.022890 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:58 crc kubenswrapper[4756]: I1205 18:33:58.099020 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:33:58 crc kubenswrapper[4756]: I1205 18:33:58.131429 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4fqj"] Dec 05 18:33:58 crc kubenswrapper[4756]: I1205 18:33:58.762020 4756 generic.go:334] "Generic (PLEG): container finished" podID="7abe2354-3116-4515-82b1-fd87eb9368fd" containerID="6334017e84ac88122948b070a21dbaa2d548fbcf9027ae1fd31237d07f0653be" exitCode=0 Dec 05 18:33:58 crc kubenswrapper[4756]: I1205 18:33:58.762136 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4fqj" event={"ID":"7abe2354-3116-4515-82b1-fd87eb9368fd","Type":"ContainerDied","Data":"6334017e84ac88122948b070a21dbaa2d548fbcf9027ae1fd31237d07f0653be"} Dec 05 18:33:58 crc kubenswrapper[4756]: I1205 18:33:58.762518 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4fqj" event={"ID":"7abe2354-3116-4515-82b1-fd87eb9368fd","Type":"ContainerStarted","Data":"6c5954d44149c686f35638873287f22518db59ef4f49e4dc74ba39607396a7cf"} Dec 05 18:34:00 crc kubenswrapper[4756]: I1205 18:34:00.366731 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jsxw"] Dec 05 18:34:00 crc kubenswrapper[4756]: I1205 18:34:00.367781 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9jsxw" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="registry-server" containerID="cri-o://c16847e9b1c1faf8cf9d2d2972b64584891766f9f52b654dad794b2400e66800" gracePeriod=2 Dec 05 18:34:00 crc kubenswrapper[4756]: I1205 18:34:00.805176 4756 generic.go:334] "Generic (PLEG): container finished" podID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerID="c16847e9b1c1faf8cf9d2d2972b64584891766f9f52b654dad794b2400e66800" exitCode=0 Dec 05 18:34:00 crc kubenswrapper[4756]: I1205 18:34:00.805234 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jsxw" event={"ID":"66b81054-d86a-4e01-98e4-0464a51b9dff","Type":"ContainerDied","Data":"c16847e9b1c1faf8cf9d2d2972b64584891766f9f52b654dad794b2400e66800"} Dec 05 18:34:00 crc kubenswrapper[4756]: I1205 18:34:00.805709 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jsxw" event={"ID":"66b81054-d86a-4e01-98e4-0464a51b9dff","Type":"ContainerDied","Data":"71ea18d111bea872eea55e6d3d42b336902161e33237d22be76f285c62072535"} Dec 05 18:34:00 crc kubenswrapper[4756]: I1205 18:34:00.805731 4756 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="71ea18d111bea872eea55e6d3d42b336902161e33237d22be76f285c62072535" Dec 05 18:34:00 crc kubenswrapper[4756]: I1205 18:34:00.884733 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:34:00 crc kubenswrapper[4756]: I1205 18:34:00.991705 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-catalog-content\") pod \"66b81054-d86a-4e01-98e4-0464a51b9dff\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.007962 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4k8f\" (UniqueName: \"kubernetes.io/projected/66b81054-d86a-4e01-98e4-0464a51b9dff-kube-api-access-d4k8f\") pod \"66b81054-d86a-4e01-98e4-0464a51b9dff\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.008117 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-utilities\") pod \"66b81054-d86a-4e01-98e4-0464a51b9dff\" (UID: \"66b81054-d86a-4e01-98e4-0464a51b9dff\") " Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.009031 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-utilities" (OuterVolumeSpecName: "utilities") pod "66b81054-d86a-4e01-98e4-0464a51b9dff" (UID: "66b81054-d86a-4e01-98e4-0464a51b9dff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.010255 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.015503 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b81054-d86a-4e01-98e4-0464a51b9dff-kube-api-access-d4k8f" (OuterVolumeSpecName: "kube-api-access-d4k8f") pod "66b81054-d86a-4e01-98e4-0464a51b9dff" (UID: "66b81054-d86a-4e01-98e4-0464a51b9dff"). InnerVolumeSpecName "kube-api-access-d4k8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.088106 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66b81054-d86a-4e01-98e4-0464a51b9dff" (UID: "66b81054-d86a-4e01-98e4-0464a51b9dff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.113139 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b81054-d86a-4e01-98e4-0464a51b9dff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.113245 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4k8f\" (UniqueName: \"kubernetes.io/projected/66b81054-d86a-4e01-98e4-0464a51b9dff-kube-api-access-d4k8f\") on node \"crc\" DevicePath \"\"" Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.822504 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jsxw" Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.861358 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jsxw"] Dec 05 18:34:01 crc kubenswrapper[4756]: I1205 18:34:01.870614 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9jsxw"] Dec 05 18:34:02 crc kubenswrapper[4756]: I1205 18:34:02.064612 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" path="/var/lib/kubelet/pods/66b81054-d86a-4e01-98e4-0464a51b9dff/volumes" Dec 05 18:34:03 crc kubenswrapper[4756]: I1205 18:34:03.851325 4756 generic.go:334] "Generic (PLEG): container finished" podID="7abe2354-3116-4515-82b1-fd87eb9368fd" containerID="1f1dd78a453ed8b686d8adda6f25981d2f699c5f56f4e2a9a59ccc7c8e0abb2b" exitCode=0 Dec 05 18:34:03 crc kubenswrapper[4756]: I1205 18:34:03.851813 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4fqj" event={"ID":"7abe2354-3116-4515-82b1-fd87eb9368fd","Type":"ContainerDied","Data":"1f1dd78a453ed8b686d8adda6f25981d2f699c5f56f4e2a9a59ccc7c8e0abb2b"} Dec 05 18:34:04 crc kubenswrapper[4756]: I1205 18:34:04.864079 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4fqj" event={"ID":"7abe2354-3116-4515-82b1-fd87eb9368fd","Type":"ContainerStarted","Data":"04a80590c7b382dd13657013c6c57010aa8b25e61df5a8587501085ccaf905a8"} Dec 05 18:34:04 crc kubenswrapper[4756]: I1205 18:34:04.886534 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f4fqj" podStartSLOduration=2.130197992 podStartE2EDuration="7.886505758s" podCreationTimestamp="2025-12-05 18:33:57 +0000 UTC" firstStartedPulling="2025-12-05 18:33:58.76396453 +0000 UTC m=+8299.002638062" lastFinishedPulling="2025-12-05 18:34:04.520272276 +0000 UTC m=+8304.758945828" observedRunningTime="2025-12-05 18:34:04.882933482 +0000 UTC m=+8305.121607024" watchObservedRunningTime="2025-12-05 18:34:04.886505758 +0000 UTC m=+8305.125179280" Dec 05 18:34:07 crc kubenswrapper[4756]: I1205 18:34:07.558493 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:34:07 crc kubenswrapper[4756]: I1205 18:34:07.559150 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:34:07 crc kubenswrapper[4756]: I1205 18:34:07.624223 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:34:17 crc kubenswrapper[4756]: I1205 
18:34:17.628815 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f4fqj" Dec 05 18:34:17 crc kubenswrapper[4756]: I1205 18:34:17.717929 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4fqj"] Dec 05 18:34:17 crc kubenswrapper[4756]: I1205 18:34:17.777892 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7xvln"] Dec 05 18:34:17 crc kubenswrapper[4756]: I1205 18:34:17.778130 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7xvln" podUID="549ad202-d7ad-4107-bfdd-af2497550691" containerName="registry-server" containerID="cri-o://7a881dd99491d30d026324bc5b0aae45eea3d7100727e0fb31c3392bc7c678a8" gracePeriod=2 Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.044771 4756 generic.go:334] "Generic (PLEG): container finished" podID="549ad202-d7ad-4107-bfdd-af2497550691" containerID="7a881dd99491d30d026324bc5b0aae45eea3d7100727e0fb31c3392bc7c678a8" exitCode=0 Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.045962 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xvln" event={"ID":"549ad202-d7ad-4107-bfdd-af2497550691","Type":"ContainerDied","Data":"7a881dd99491d30d026324bc5b0aae45eea3d7100727e0fb31c3392bc7c678a8"} Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.300485 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xvln" Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.392869 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78stt\" (UniqueName: \"kubernetes.io/projected/549ad202-d7ad-4107-bfdd-af2497550691-kube-api-access-78stt\") pod \"549ad202-d7ad-4107-bfdd-af2497550691\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.393172 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-utilities\") pod \"549ad202-d7ad-4107-bfdd-af2497550691\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.393489 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-catalog-content\") pod \"549ad202-d7ad-4107-bfdd-af2497550691\" (UID: \"549ad202-d7ad-4107-bfdd-af2497550691\") " Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.395057 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-utilities" (OuterVolumeSpecName: "utilities") pod "549ad202-d7ad-4107-bfdd-af2497550691" (UID: "549ad202-d7ad-4107-bfdd-af2497550691"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.411985 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549ad202-d7ad-4107-bfdd-af2497550691-kube-api-access-78stt" (OuterVolumeSpecName: "kube-api-access-78stt") pod "549ad202-d7ad-4107-bfdd-af2497550691" (UID: "549ad202-d7ad-4107-bfdd-af2497550691"). InnerVolumeSpecName "kube-api-access-78stt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.453823 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "549ad202-d7ad-4107-bfdd-af2497550691" (UID: "549ad202-d7ad-4107-bfdd-af2497550691"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.495964 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.495990 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549ad202-d7ad-4107-bfdd-af2497550691-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:34:18 crc kubenswrapper[4756]: I1205 18:34:18.496003 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78stt\" (UniqueName: \"kubernetes.io/projected/549ad202-d7ad-4107-bfdd-af2497550691-kube-api-access-78stt\") on node \"crc\" DevicePath \"\"" Dec 05 18:34:19 crc kubenswrapper[4756]: I1205 18:34:19.061631 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xvln" event={"ID":"549ad202-d7ad-4107-bfdd-af2497550691","Type":"ContainerDied","Data":"cab59fe67186e58fa604e19efd6aec8829bc37c51a805190106d42b0f2dfd3b0"} Dec 05 18:34:19 crc kubenswrapper[4756]: I1205 18:34:19.062082 4756 scope.go:117] "RemoveContainer" containerID="7a881dd99491d30d026324bc5b0aae45eea3d7100727e0fb31c3392bc7c678a8" Dec 05 18:34:19 crc kubenswrapper[4756]: I1205 18:34:19.061822 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7xvln" Dec 05 18:34:19 crc kubenswrapper[4756]: I1205 18:34:19.097873 4756 scope.go:117] "RemoveContainer" containerID="b5272825c9f1bfbdd67a7c3a1f7e24d45f298013f9d4789707594800ab6d7a1b" Dec 05 18:34:19 crc kubenswrapper[4756]: I1205 18:34:19.126755 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7xvln"] Dec 05 18:34:19 crc kubenswrapper[4756]: I1205 18:34:19.136598 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7xvln"] Dec 05 18:34:19 crc kubenswrapper[4756]: I1205 18:34:19.144619 4756 scope.go:117] "RemoveContainer" containerID="6dc4ea09ebbcc108c1d8ab55b287b53b762ac0115be10ae7fd3144eb069beecf" Dec 05 18:34:20 crc kubenswrapper[4756]: I1205 18:34:20.118304 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549ad202-d7ad-4107-bfdd-af2497550691" path="/var/lib/kubelet/pods/549ad202-d7ad-4107-bfdd-af2497550691/volumes" Dec 05 18:35:22 crc kubenswrapper[4756]: I1205 18:35:22.295253 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:35:22 crc kubenswrapper[4756]: I1205 18:35:22.297073 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:35:52 crc kubenswrapper[4756]: I1205 18:35:52.295803 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:35:52 crc kubenswrapper[4756]: I1205 18:35:52.296656 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.295984 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.296996 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.297102 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.298577 4756 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5d23aadcd4d72fd37d141ecd6190c59b2d01538fcde7179f516cb258c78f879"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.298692 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://a5d23aadcd4d72fd37d141ecd6190c59b2d01538fcde7179f516cb258c78f879" gracePeriod=600 Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.781151 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="a5d23aadcd4d72fd37d141ecd6190c59b2d01538fcde7179f516cb258c78f879" exitCode=0 Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.781306 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"a5d23aadcd4d72fd37d141ecd6190c59b2d01538fcde7179f516cb258c78f879"} Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.781522 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a"} Dec 05 18:36:22 crc kubenswrapper[4756]: I1205 18:36:22.781544 4756 scope.go:117] "RemoveContainer" containerID="82a63a8180647fe397c85e330207b3509035d03c3d6bad2fdf4a5955635b3ce8" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.467739 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9zg6g"] Dec 05 18:36:37 crc kubenswrapper[4756]: E1205 18:36:37.468936 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549ad202-d7ad-4107-bfdd-af2497550691" containerName="registry-server" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.468953 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="549ad202-d7ad-4107-bfdd-af2497550691" containerName="registry-server" Dec 05 18:36:37 crc kubenswrapper[4756]: E1205 18:36:37.468976 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549ad202-d7ad-4107-bfdd-af2497550691" containerName="extract-content" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.468985 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="549ad202-d7ad-4107-bfdd-af2497550691" containerName="extract-content" Dec 05 18:36:37 crc kubenswrapper[4756]: E1205 18:36:37.469008 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="extract-utilities" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.469017 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="extract-utilities" Dec 05 18:36:37 crc kubenswrapper[4756]: E1205 18:36:37.469040 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549ad202-d7ad-4107-bfdd-af2497550691" containerName="extract-utilities" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.469049 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="549ad202-d7ad-4107-bfdd-af2497550691" containerName="extract-utilities" Dec 05 18:36:37 crc kubenswrapper[4756]: E1205 18:36:37.469067 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="extract-content" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.469075 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="extract-content" Dec 05 18:36:37 crc kubenswrapper[4756]: E1205 18:36:37.469089 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="registry-server" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.469097 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="registry-server" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.469390 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b81054-d86a-4e01-98e4-0464a51b9dff" containerName="registry-server" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.469402 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="549ad202-d7ad-4107-bfdd-af2497550691" containerName="registry-server" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.471453 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.498123 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9zg6g"] Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.575725 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-catalog-content\") pod \"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.575834 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-utilities\") pod \"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.575910 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhs25\" (UniqueName: \"kubernetes.io/projected/56c78778-ea99-42a7-885b-15ebf317d3b1-kube-api-access-xhs25\") pod \"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.677973 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-utilities\") pod \"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.678296 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhs25\" (UniqueName: \"kubernetes.io/projected/56c78778-ea99-42a7-885b-15ebf317d3b1-kube-api-access-xhs25\") pod 
\"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.678464 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-utilities\") pod \"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.678718 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-catalog-content\") pod \"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.678979 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-catalog-content\") pod \"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.703662 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhs25\" (UniqueName: \"kubernetes.io/projected/56c78778-ea99-42a7-885b-15ebf317d3b1-kube-api-access-xhs25\") pod \"redhat-marketplace-9zg6g\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:37 crc kubenswrapper[4756]: I1205 18:36:37.807457 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:38 crc kubenswrapper[4756]: I1205 18:36:38.332568 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9zg6g"] Dec 05 18:36:38 crc kubenswrapper[4756]: W1205 18:36:38.339448 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-cc7074d929137ee496091fc35181657eb9265b219fc01d18d85a91e31633a75a WatchSource:0}: Error finding container cc7074d929137ee496091fc35181657eb9265b219fc01d18d85a91e31633a75a: Status 404 returned error can't find the container with id cc7074d929137ee496091fc35181657eb9265b219fc01d18d85a91e31633a75a Dec 05 18:36:39 crc kubenswrapper[4756]: I1205 18:36:39.025409 4756 generic.go:334] "Generic (PLEG): container finished" podID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerID="30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a" exitCode=0 Dec 05 18:36:39 crc kubenswrapper[4756]: I1205 18:36:39.025725 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9zg6g" event={"ID":"56c78778-ea99-42a7-885b-15ebf317d3b1","Type":"ContainerDied","Data":"30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a"} Dec 05 18:36:39 crc kubenswrapper[4756]: I1205 18:36:39.026387 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9zg6g" event={"ID":"56c78778-ea99-42a7-885b-15ebf317d3b1","Type":"ContainerStarted","Data":"cc7074d929137ee496091fc35181657eb9265b219fc01d18d85a91e31633a75a"} Dec 05 18:36:39 crc kubenswrapper[4756]: I1205 18:36:39.032568 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 18:36:40 crc kubenswrapper[4756]: I1205 18:36:40.044224 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9zg6g" event={"ID":"56c78778-ea99-42a7-885b-15ebf317d3b1","Type":"ContainerStarted","Data":"dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c"} Dec 05 18:36:41 crc kubenswrapper[4756]: I1205 18:36:41.061548 4756 generic.go:334] "Generic (PLEG): container finished" podID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerID="dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c" exitCode=0 Dec 05 18:36:41 crc kubenswrapper[4756]: I1205 18:36:41.061620 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9zg6g" event={"ID":"56c78778-ea99-42a7-885b-15ebf317d3b1","Type":"ContainerDied","Data":"dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c"} Dec 05 18:36:42 crc kubenswrapper[4756]: I1205 18:36:42.079833 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9zg6g" event={"ID":"56c78778-ea99-42a7-885b-15ebf317d3b1","Type":"ContainerStarted","Data":"a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2"} Dec 05 18:36:42 crc kubenswrapper[4756]: I1205 18:36:42.115137 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9zg6g" podStartSLOduration=2.683518727 podStartE2EDuration="5.115115262s" podCreationTimestamp="2025-12-05 18:36:37 +0000 UTC" firstStartedPulling="2025-12-05 18:36:39.032217503 +0000 UTC m=+8459.270891035" lastFinishedPulling="2025-12-05 18:36:41.463814028 +0000 UTC m=+8461.702487570" 
observedRunningTime="2025-12-05 18:36:42.100176831 +0000 UTC m=+8462.338850373" watchObservedRunningTime="2025-12-05 18:36:42.115115262 +0000 UTC m=+8462.353788784" Dec 05 18:36:47 crc kubenswrapper[4756]: I1205 18:36:47.807777 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:47 crc kubenswrapper[4756]: I1205 18:36:47.808832 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:47 crc kubenswrapper[4756]: I1205 18:36:47.916457 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:48 crc kubenswrapper[4756]: I1205 18:36:48.218575 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:48 crc kubenswrapper[4756]: I1205 18:36:48.282442 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9zg6g"] Dec 05 18:36:49 crc kubenswrapper[4756]: E1205 18:36:49.034185 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-conmon-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache]" Dec 05 18:36:50 crc kubenswrapper[4756]: I1205 18:36:50.185244 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9zg6g" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerName="registry-server" containerID="cri-o://a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2" gracePeriod=2 Dec 05 18:36:50 crc kubenswrapper[4756]: I1205 18:36:50.790790 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:50 crc kubenswrapper[4756]: I1205 18:36:50.929907 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-catalog-content\") pod \"56c78778-ea99-42a7-885b-15ebf317d3b1\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " Dec 05 18:36:50 crc kubenswrapper[4756]: I1205 18:36:50.930582 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhs25\" (UniqueName: \"kubernetes.io/projected/56c78778-ea99-42a7-885b-15ebf317d3b1-kube-api-access-xhs25\") pod \"56c78778-ea99-42a7-885b-15ebf317d3b1\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " Dec 05 18:36:50 crc kubenswrapper[4756]: I1205 18:36:50.930662 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-utilities\") pod \"56c78778-ea99-42a7-885b-15ebf317d3b1\" (UID: \"56c78778-ea99-42a7-885b-15ebf317d3b1\") " Dec 05 18:36:50 crc kubenswrapper[4756]: I1205 18:36:50.932001 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-utilities" (OuterVolumeSpecName: "utilities") pod "56c78778-ea99-42a7-885b-15ebf317d3b1" (UID: "56c78778-ea99-42a7-885b-15ebf317d3b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:36:50 crc kubenswrapper[4756]: I1205 18:36:50.946039 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c78778-ea99-42a7-885b-15ebf317d3b1-kube-api-access-xhs25" (OuterVolumeSpecName: "kube-api-access-xhs25") pod "56c78778-ea99-42a7-885b-15ebf317d3b1" (UID: "56c78778-ea99-42a7-885b-15ebf317d3b1"). InnerVolumeSpecName "kube-api-access-xhs25". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:36:50 crc kubenswrapper[4756]: I1205 18:36:50.956933 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56c78778-ea99-42a7-885b-15ebf317d3b1" (UID: "56c78778-ea99-42a7-885b-15ebf317d3b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.034112 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhs25\" (UniqueName: \"kubernetes.io/projected/56c78778-ea99-42a7-885b-15ebf317d3b1-kube-api-access-xhs25\") on node \"crc\" DevicePath \"\"" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.034152 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.034165 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c78778-ea99-42a7-885b-15ebf317d3b1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.199782 4756 generic.go:334] "Generic (PLEG): container finished" podID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerID="a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2" exitCode=0 Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.199867 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9zg6g" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.199861 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9zg6g" event={"ID":"56c78778-ea99-42a7-885b-15ebf317d3b1","Type":"ContainerDied","Data":"a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2"} Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.199957 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9zg6g" event={"ID":"56c78778-ea99-42a7-885b-15ebf317d3b1","Type":"ContainerDied","Data":"cc7074d929137ee496091fc35181657eb9265b219fc01d18d85a91e31633a75a"} Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.199997 4756 scope.go:117] "RemoveContainer" containerID="a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.252814 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9zg6g"] Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.266865 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9zg6g"] Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.272090 4756 scope.go:117] "RemoveContainer" containerID="dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.314839 4756 scope.go:117] "RemoveContainer" containerID="30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.380261 4756 scope.go:117] "RemoveContainer" containerID="a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2" Dec 05 18:36:51 crc kubenswrapper[4756]: E1205 18:36:51.381049 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2\": container with ID starting with a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2 not found: ID does not exist" containerID="a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.381094 4756 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2"} err="failed to get container status \"a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2\": rpc error: code = NotFound desc = could not find container \"a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2\": container with ID starting with a4ac136c2bf044d777835855e5459dfbfad61ee1df40f0fd83da04d18fb587e2 not found: ID does not exist" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.381121 4756 scope.go:117] "RemoveContainer" containerID="dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c" Dec 05 18:36:51 crc kubenswrapper[4756]: E1205 18:36:51.381667 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c\": container with ID starting with dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c not found: ID does not exist" containerID="dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.381737 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c"} err="failed to get container status \"dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c\": rpc error: code = NotFound desc = could not find container \"dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c\": container with ID starting with dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c not found: ID does not exist" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.381782 4756 scope.go:117] "RemoveContainer" containerID="30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a" Dec 05 18:36:51 crc kubenswrapper[4756]: E1205 18:36:51.382368 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a\": container with ID starting with 30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a not found: ID does not exist" containerID="30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a" Dec 05 18:36:51 crc kubenswrapper[4756]: I1205 18:36:51.382466 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a"} err="failed to get container status \"30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a\": rpc error: code = NotFound desc = could not find container \"30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a\": container with ID starting with 30d65cbc6f23948cd1215d6a175f77b8c76a8d4bbac37a4202c627401c83219a not found: ID does not exist" Dec 05 18:36:52 crc kubenswrapper[4756]: I1205 18:36:52.065807 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" path="/var/lib/kubelet/pods/56c78778-ea99-42a7-885b-15ebf317d3b1/volumes" Dec 05 18:36:59 crc kubenswrapper[4756]: E1205 18:36:59.377743 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-conmon-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache]" Dec 05 18:37:07 crc kubenswrapper[4756]: I1205 18:37:07.417728 4756 generic.go:334] "Generic (PLEG): container finished" podID="d7987f1c-bd96-4436-9fc1-505da115c1ae" containerID="fb106dfd9cca3c95b2365c9a10207534f9a0790598f2d54e20fe1aebdd48e268" exitCode=0 Dec 05 18:37:07 crc kubenswrapper[4756]: I1205 18:37:07.417781 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" event={"ID":"d7987f1c-bd96-4436-9fc1-505da115c1ae","Type":"ContainerDied","Data":"fb106dfd9cca3c95b2365c9a10207534f9a0790598f2d54e20fe1aebdd48e268"} Dec 05 18:37:08 crc kubenswrapper[4756]: I1205 18:37:08.953015 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.119156 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8knvd\" (UniqueName: \"kubernetes.io/projected/d7987f1c-bd96-4436-9fc1-505da115c1ae-kube-api-access-8knvd\") pod \"d7987f1c-bd96-4436-9fc1-505da115c1ae\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.119215 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-combined-ca-bundle\") pod \"d7987f1c-bd96-4436-9fc1-505da115c1ae\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.119380 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-secret-0\") pod \"d7987f1c-bd96-4436-9fc1-505da115c1ae\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.119404 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-inventory\") pod \"d7987f1c-bd96-4436-9fc1-505da115c1ae\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.119582 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-ssh-key\") pod \"d7987f1c-bd96-4436-9fc1-505da115c1ae\" (UID: \"d7987f1c-bd96-4436-9fc1-505da115c1ae\") " Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.127014 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7987f1c-bd96-4436-9fc1-505da115c1ae-kube-api-access-8knvd" (OuterVolumeSpecName: "kube-api-access-8knvd") pod "d7987f1c-bd96-4436-9fc1-505da115c1ae" (UID: "d7987f1c-bd96-4436-9fc1-505da115c1ae"). InnerVolumeSpecName "kube-api-access-8knvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.128814 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d7987f1c-bd96-4436-9fc1-505da115c1ae" (UID: "d7987f1c-bd96-4436-9fc1-505da115c1ae"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.163660 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7987f1c-bd96-4436-9fc1-505da115c1ae" (UID: "d7987f1c-bd96-4436-9fc1-505da115c1ae"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.182000 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d7987f1c-bd96-4436-9fc1-505da115c1ae" (UID: "d7987f1c-bd96-4436-9fc1-505da115c1ae"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.191577 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-inventory" (OuterVolumeSpecName: "inventory") pod "d7987f1c-bd96-4436-9fc1-505da115c1ae" (UID: "d7987f1c-bd96-4436-9fc1-505da115c1ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.224400 4756 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.224617 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.224972 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.225037 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8knvd\" (UniqueName: \"kubernetes.io/projected/d7987f1c-bd96-4436-9fc1-505da115c1ae-kube-api-access-8knvd\") on node \"crc\" DevicePath \"\"" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.225065 4756 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7987f1c-bd96-4436-9fc1-505da115c1ae-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.445205 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" event={"ID":"d7987f1c-bd96-4436-9fc1-505da115c1ae","Type":"ContainerDied","Data":"9b1f8e5ea724629468250bdb59f4c8bb7e295c77f4465d1e3241216ca8c7cdfd"} Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.445262 4756 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b1f8e5ea724629468250bdb59f4c8bb7e295c77f4465d1e3241216ca8c7cdfd" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.445265 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-wxn8n" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.539495 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-hzf9c"] Dec 05 18:37:09 crc kubenswrapper[4756]: E1205 18:37:09.540212 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7987f1c-bd96-4436-9fc1-505da115c1ae" containerName="libvirt-openstack-openstack-cell1" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.540234 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7987f1c-bd96-4436-9fc1-505da115c1ae" containerName="libvirt-openstack-openstack-cell1" Dec 05 18:37:09 crc kubenswrapper[4756]: E1205 18:37:09.540275 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerName="extract-utilities" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.540290 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerName="extract-utilities" Dec 05 18:37:09 crc kubenswrapper[4756]: E1205 18:37:09.540333 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerName="registry-server" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.540345 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerName="registry-server" Dec 05 18:37:09 crc kubenswrapper[4756]: E1205 18:37:09.540364 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerName="extract-content" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.540376 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerName="extract-content" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.541000 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7987f1c-bd96-4436-9fc1-505da115c1ae" containerName="libvirt-openstack-openstack-cell1" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.541049 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c78778-ea99-42a7-885b-15ebf317d3b1" containerName="registry-server" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.542301 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.548173 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.548454 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.548577 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.548688 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.553309 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.553831 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.564447 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.571629 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-hzf9c"] Dec 05 18:37:09 crc kubenswrapper[4756]: E1205 18:37:09.728330 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-conmon-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7987f1c_bd96_4436_9fc1_505da115c1ae.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7987f1c_bd96_4436_9fc1_505da115c1ae.slice/crio-9b1f8e5ea724629468250bdb59f4c8bb7e295c77f4465d1e3241216ca8c7cdfd\": RecentStats: unable to find data in memory cache]" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.736946 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.736988 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.737018 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.737048 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.737272 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.737769 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.737860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cljtn\" (UniqueName: \"kubernetes.io/projected/98a7241c-4bda-4548-810c-7aadfc51ebf1-kube-api-access-cljtn\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.738025 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.738178 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.840529 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cljtn\" (UniqueName: \"kubernetes.io/projected/98a7241c-4bda-4548-810c-7aadfc51ebf1-kube-api-access-cljtn\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.840828 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.840866 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.840896 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.840917 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.840943 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.840969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.841012 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.841111 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.842687 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.845321 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.845811 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.845971 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.846178 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.846979 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.855752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.856111 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.857023 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cljtn\" (UniqueName: \"kubernetes.io/projected/98a7241c-4bda-4548-810c-7aadfc51ebf1-kube-api-access-cljtn\") pod \"nova-cell1-openstack-openstack-cell1-hzf9c\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") 
" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:09 crc kubenswrapper[4756]: I1205 18:37:09.879295 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:37:10 crc kubenswrapper[4756]: I1205 18:37:10.452016 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-hzf9c"] Dec 05 18:37:10 crc kubenswrapper[4756]: I1205 18:37:10.457487 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" event={"ID":"98a7241c-4bda-4548-810c-7aadfc51ebf1","Type":"ContainerStarted","Data":"bcfe0ec1132a4a1266a9caf56c938ab1925a6e69c20e83389c839aff1cd4c81e"} Dec 05 18:37:11 crc kubenswrapper[4756]: I1205 18:37:11.472110 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" event={"ID":"98a7241c-4bda-4548-810c-7aadfc51ebf1","Type":"ContainerStarted","Data":"6d055024a104fb4dfb93a5eca5eefea8bb833502b5df8461db8a5ef1174ba20d"} Dec 05 18:37:11 crc kubenswrapper[4756]: I1205 18:37:11.505229 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" podStartSLOduration=2.068835623 podStartE2EDuration="2.505206917s" podCreationTimestamp="2025-12-05 18:37:09 +0000 UTC" firstStartedPulling="2025-12-05 18:37:10.427172818 +0000 UTC m=+8490.665846350" lastFinishedPulling="2025-12-05 18:37:10.863544092 +0000 UTC m=+8491.102217644" observedRunningTime="2025-12-05 18:37:11.499527985 +0000 UTC m=+8491.738201547" watchObservedRunningTime="2025-12-05 18:37:11.505206917 +0000 UTC m=+8491.743880429" Dec 05 18:37:20 crc kubenswrapper[4756]: E1205 18:37:20.028224 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-conmon-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache]" Dec 05 18:37:30 crc kubenswrapper[4756]: E1205 18:37:30.338928 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c78778_ea99_42a7_885b_15ebf317d3b1.slice/crio-conmon-dd5530e699e672a224aa1b364be48bfeea7657e64dc68104a061d1a4d5c5428c.scope\": RecentStats: unable to find data in memory cache]" Dec 05 18:38:22 crc kubenswrapper[4756]: I1205 18:38:22.297532 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:38:22 crc kubenswrapper[4756]: I1205 18:38:22.299199 4756 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:38:52 crc kubenswrapper[4756]: I1205 18:38:52.296127 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:38:52 crc kubenswrapper[4756]: I1205 18:38:52.296832 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:39:12 crc kubenswrapper[4756]: I1205 18:39:12.729501 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="969a4b86-3398-4066-9515-34123347f863" containerName="galera" probeResult="failure" output="command timed out" Dec 05 18:39:12 crc kubenswrapper[4756]: I1205 18:39:12.741967 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="969a4b86-3398-4066-9515-34123347f863" containerName="galera" probeResult="failure" output="command timed out" Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.295742 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.296358 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.296446 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.297638 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.297723 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" gracePeriod=600 Dec 05 18:39:22 crc kubenswrapper[4756]: E1205 18:39:22.428481 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.598306 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" exitCode=0 Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.598367 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a"} Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.598413 4756 scope.go:117] "RemoveContainer" containerID="a5d23aadcd4d72fd37d141ecd6190c59b2d01538fcde7179f516cb258c78f879" Dec 05 18:39:22 crc kubenswrapper[4756]: I1205 18:39:22.599512 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:39:22 crc kubenswrapper[4756]: E1205 18:39:22.599959 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:39:37 crc kubenswrapper[4756]: I1205 18:39:37.050209 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:39:37 crc kubenswrapper[4756]: E1205 18:39:37.051002 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:39:48 crc kubenswrapper[4756]: I1205 18:39:48.050073 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:39:48 crc kubenswrapper[4756]: E1205 18:39:48.051571 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:40:00 crc kubenswrapper[4756]: I1205 18:40:00.065058 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:40:00 crc kubenswrapper[4756]: E1205 18:40:00.065900 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.050252 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:40:14 crc kubenswrapper[4756]: E1205 18:40:14.051493 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.717445 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5ghm8"] Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.723129 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.737189 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ghm8"] Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.820771 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-catalog-content\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.820880 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42vrm\" (UniqueName: \"kubernetes.io/projected/4be5e377-bd15-417b-b3eb-098029e7946a-kube-api-access-42vrm\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.820939 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-utilities\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.922975 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42vrm\" (UniqueName: \"kubernetes.io/projected/4be5e377-bd15-417b-b3eb-098029e7946a-kube-api-access-42vrm\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.923089 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-utilities\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.923270 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-catalog-content\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.923664 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-utilities\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.923807 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-catalog-content\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:14 crc kubenswrapper[4756]: I1205 18:40:14.950638 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42vrm\" (UniqueName: \"kubernetes.io/projected/4be5e377-bd15-417b-b3eb-098029e7946a-kube-api-access-42vrm\") pod \"certified-operators-5ghm8\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:15 crc kubenswrapper[4756]: I1205 18:40:15.045781 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:15 crc kubenswrapper[4756]: I1205 18:40:15.548118 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ghm8"] Dec 05 18:40:16 crc kubenswrapper[4756]: I1205 18:40:16.340631 4756 generic.go:334] "Generic (PLEG): container finished" podID="4be5e377-bd15-417b-b3eb-098029e7946a" containerID="8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8" exitCode=0 Dec 05 18:40:16 crc kubenswrapper[4756]: I1205 18:40:16.340684 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ghm8" event={"ID":"4be5e377-bd15-417b-b3eb-098029e7946a","Type":"ContainerDied","Data":"8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8"} Dec 05 18:40:16 crc kubenswrapper[4756]: I1205 18:40:16.341024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ghm8" event={"ID":"4be5e377-bd15-417b-b3eb-098029e7946a","Type":"ContainerStarted","Data":"6e39af9c96420963bcd92563974fcf4727bf56a3454b48372b1e45c27d3cdd6b"} Dec 05 18:40:17 crc kubenswrapper[4756]: I1205 18:40:17.357355 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ghm8" event={"ID":"4be5e377-bd15-417b-b3eb-098029e7946a","Type":"ContainerStarted","Data":"03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5"} Dec 05 18:40:18 crc kubenswrapper[4756]: I1205 18:40:18.371192 4756 generic.go:334] "Generic (PLEG): container finished" podID="4be5e377-bd15-417b-b3eb-098029e7946a" containerID="03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5" exitCode=0 Dec 05 18:40:18 crc kubenswrapper[4756]: I1205 18:40:18.371296 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ghm8" 
event={"ID":"4be5e377-bd15-417b-b3eb-098029e7946a","Type":"ContainerDied","Data":"03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5"} Dec 05 18:40:19 crc kubenswrapper[4756]: I1205 18:40:19.391165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ghm8" event={"ID":"4be5e377-bd15-417b-b3eb-098029e7946a","Type":"ContainerStarted","Data":"4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6"} Dec 05 18:40:19 crc kubenswrapper[4756]: I1205 18:40:19.418644 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5ghm8" podStartSLOduration=2.960617765 podStartE2EDuration="5.418625219s" podCreationTimestamp="2025-12-05 18:40:14 +0000 UTC" firstStartedPulling="2025-12-05 18:40:16.34503901 +0000 UTC m=+8676.583712532" lastFinishedPulling="2025-12-05 18:40:18.803046454 +0000 UTC m=+8679.041719986" observedRunningTime="2025-12-05 18:40:19.413111651 +0000 UTC m=+8679.651785193" watchObservedRunningTime="2025-12-05 18:40:19.418625219 +0000 UTC m=+8679.657298741" Dec 05 18:40:19 crc kubenswrapper[4756]: I1205 18:40:19.429318 4756 scope.go:117] "RemoveContainer" containerID="08124edaae84b3c24c44c7e4bd1a0f94ae9db57d8b9671337ff133d987b8c6f8" Dec 05 18:40:19 crc kubenswrapper[4756]: I1205 18:40:19.464426 4756 scope.go:117] "RemoveContainer" containerID="e83c0424069f2422958cc12aa9b0de5b81701a634129c903a82c7d2dd23342ea" Dec 05 18:40:19 crc kubenswrapper[4756]: I1205 18:40:19.524137 4756 scope.go:117] "RemoveContainer" containerID="c16847e9b1c1faf8cf9d2d2972b64584891766f9f52b654dad794b2400e66800" Dec 05 18:40:25 crc kubenswrapper[4756]: I1205 18:40:25.047369 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:25 crc kubenswrapper[4756]: I1205 18:40:25.048546 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:25 crc kubenswrapper[4756]: I1205 18:40:25.138473 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:25 crc kubenswrapper[4756]: I1205 18:40:25.529190 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:25 crc kubenswrapper[4756]: I1205 18:40:25.590763 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ghm8"] Dec 05 18:40:27 crc kubenswrapper[4756]: I1205 18:40:27.488119 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5ghm8" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" containerName="registry-server" containerID="cri-o://4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6" gracePeriod=2 Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.064009 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.190818 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-utilities\") pod \"4be5e377-bd15-417b-b3eb-098029e7946a\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.190913 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42vrm\" (UniqueName: \"kubernetes.io/projected/4be5e377-bd15-417b-b3eb-098029e7946a-kube-api-access-42vrm\") pod \"4be5e377-bd15-417b-b3eb-098029e7946a\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.191040 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-catalog-content\") pod \"4be5e377-bd15-417b-b3eb-098029e7946a\" (UID: \"4be5e377-bd15-417b-b3eb-098029e7946a\") " Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.193011 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-utilities" (OuterVolumeSpecName: "utilities") pod "4be5e377-bd15-417b-b3eb-098029e7946a" (UID: "4be5e377-bd15-417b-b3eb-098029e7946a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.200036 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be5e377-bd15-417b-b3eb-098029e7946a-kube-api-access-42vrm" (OuterVolumeSpecName: "kube-api-access-42vrm") pod "4be5e377-bd15-417b-b3eb-098029e7946a" (UID: "4be5e377-bd15-417b-b3eb-098029e7946a"). InnerVolumeSpecName "kube-api-access-42vrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.255019 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4be5e377-bd15-417b-b3eb-098029e7946a" (UID: "4be5e377-bd15-417b-b3eb-098029e7946a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.295561 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.295964 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42vrm\" (UniqueName: \"kubernetes.io/projected/4be5e377-bd15-417b-b3eb-098029e7946a-kube-api-access-42vrm\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.296093 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4be5e377-bd15-417b-b3eb-098029e7946a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.506400 4756 generic.go:334] "Generic (PLEG): container finished" podID="4be5e377-bd15-417b-b3eb-098029e7946a" containerID="4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6" exitCode=0 Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.506473 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ghm8" event={"ID":"4be5e377-bd15-417b-b3eb-098029e7946a","Type":"ContainerDied","Data":"4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6"} Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.506546 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ghm8" event={"ID":"4be5e377-bd15-417b-b3eb-098029e7946a","Type":"ContainerDied","Data":"6e39af9c96420963bcd92563974fcf4727bf56a3454b48372b1e45c27d3cdd6b"} Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.506569 4756 scope.go:117] "RemoveContainer" containerID="4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.506606 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ghm8" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.558984 4756 scope.go:117] "RemoveContainer" containerID="03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.571587 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ghm8"] Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.588640 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5ghm8"] Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.607378 4756 scope.go:117] "RemoveContainer" containerID="8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.665480 4756 scope.go:117] "RemoveContainer" containerID="4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6" Dec 05 18:40:28 crc kubenswrapper[4756]: E1205 18:40:28.666901 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6\": container with ID starting with 4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6 not found: ID does not exist" containerID="4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.666948 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6"} err="failed to get container status \"4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6\": rpc error: code = NotFound desc = could not find container \"4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6\": container with ID starting with 4ea583c5b473189514959a6db062b592b434ab8a6787055008261a7e9858c7b6 not found: ID does not exist" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.666973 4756 scope.go:117] "RemoveContainer" containerID="03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5" Dec 05 18:40:28 crc kubenswrapper[4756]: E1205 18:40:28.669876 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5\": container with ID starting with 03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5 not found: ID does not exist" containerID="03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.669917 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5"} err="failed to get container status \"03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5\": rpc error: code = NotFound desc = could not find container \"03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5\": container with ID starting with 03e7d2262623b87d3889c7878394900f5ad82562efb3e0b647acc48e7bd4d3f5 not found: ID does not exist" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.669939 4756 scope.go:117] "RemoveContainer" containerID="8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8" Dec 05 18:40:28 crc kubenswrapper[4756]: E1205 18:40:28.671324 4756 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8\": container with ID starting with 8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8 not found: ID does not exist" containerID="8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8" Dec 05 18:40:28 crc kubenswrapper[4756]: I1205 18:40:28.671348 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8"} err="failed to get container status \"8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8\": rpc error: code = NotFound desc = could not find container \"8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8\": container with ID starting with 8e64df1161751a02d137ade32c22ce2ca4bff9c55583f190472c9fa720ed9ed8 not found: ID does not exist" Dec 05 18:40:29 crc kubenswrapper[4756]: I1205 18:40:29.049985 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:40:29 crc kubenswrapper[4756]: E1205 18:40:29.050725 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:40:30 crc kubenswrapper[4756]: I1205 18:40:30.071300 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" path="/var/lib/kubelet/pods/4be5e377-bd15-417b-b3eb-098029e7946a/volumes" Dec 05 18:40:32 crc kubenswrapper[4756]: I1205 18:40:32.574271 4756 generic.go:334] "Generic (PLEG): container finished" podID="98a7241c-4bda-4548-810c-7aadfc51ebf1" containerID="6d055024a104fb4dfb93a5eca5eefea8bb833502b5df8461db8a5ef1174ba20d" exitCode=0 Dec 05 18:40:32 crc kubenswrapper[4756]: I1205 18:40:32.574343 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" event={"ID":"98a7241c-4bda-4548-810c-7aadfc51ebf1","Type":"ContainerDied","Data":"6d055024a104fb4dfb93a5eca5eefea8bb833502b5df8461db8a5ef1174ba20d"} Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.133465 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.290814 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-inventory\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.290898 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-ssh-key\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.291087 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-1\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.291121 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-1\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.291171 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-combined-ca-bundle\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.291229 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cells-global-config-0\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.291288 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-0\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.291395 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-0\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.291507 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cljtn\" (UniqueName: \"kubernetes.io/projected/98a7241c-4bda-4548-810c-7aadfc51ebf1-kube-api-access-cljtn\") pod \"98a7241c-4bda-4548-810c-7aadfc51ebf1\" (UID: \"98a7241c-4bda-4548-810c-7aadfc51ebf1\") " Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.303101 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/98a7241c-4bda-4548-810c-7aadfc51ebf1-kube-api-access-cljtn" (OuterVolumeSpecName: "kube-api-access-cljtn") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "kube-api-access-cljtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.314722 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.327187 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.335390 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-inventory" (OuterVolumeSpecName: "inventory") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.336511 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.350471 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.358357 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.361508 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.371720 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "98a7241c-4bda-4548-810c-7aadfc51ebf1" (UID: "98a7241c-4bda-4548-810c-7aadfc51ebf1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394283 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394325 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394339 4756 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394355 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394369 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394382 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394398 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394410 4756 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/98a7241c-4bda-4548-810c-7aadfc51ebf1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.394445 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cljtn\" (UniqueName: \"kubernetes.io/projected/98a7241c-4bda-4548-810c-7aadfc51ebf1-kube-api-access-cljtn\") on node \"crc\" DevicePath \"\"" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.602846 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" event={"ID":"98a7241c-4bda-4548-810c-7aadfc51ebf1","Type":"ContainerDied","Data":"bcfe0ec1132a4a1266a9caf56c938ab1925a6e69c20e83389c839aff1cd4c81e"} Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.602906 4756 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="bcfe0ec1132a4a1266a9caf56c938ab1925a6e69c20e83389c839aff1cd4c81e" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.602936 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-hzf9c" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.720742 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-p9gcj"] Dec 05 18:40:34 crc kubenswrapper[4756]: E1205 18:40:34.721526 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" containerName="extract-content" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.721545 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" containerName="extract-content" Dec 05 18:40:34 crc kubenswrapper[4756]: E1205 18:40:34.721569 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" containerName="extract-utilities" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.721579 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" containerName="extract-utilities" Dec 05 18:40:34 crc kubenswrapper[4756]: E1205 18:40:34.721603 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" containerName="registry-server" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.721612 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" containerName="registry-server" Dec 05 18:40:34 crc kubenswrapper[4756]: E1205 18:40:34.721630 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a7241c-4bda-4548-810c-7aadfc51ebf1" containerName="nova-cell1-openstack-openstack-cell1" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.721639 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a7241c-4bda-4548-810c-7aadfc51ebf1" containerName="nova-cell1-openstack-openstack-cell1" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.721910 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be5e377-bd15-417b-b3eb-098029e7946a" containerName="registry-server" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.721942 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a7241c-4bda-4548-810c-7aadfc51ebf1" containerName="nova-cell1-openstack-openstack-cell1" Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.722904 4756 util.go:30] "No sandbox for pod can be found. 
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.726546 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.726712 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.726775 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.727303 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.728948 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.741059 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-p9gcj"]
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.904110 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7774\" (UniqueName: \"kubernetes.io/projected/2ea99c30-bc6e-47b2-9ace-f7d82418e729-kube-api-access-d7774\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.904158 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.904177 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-inventory\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.904330 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ssh-key\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.904359 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.904379 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:34 crc kubenswrapper[4756]: I1205 18:40:34.904625 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.006704 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.006757 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.006884 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.006946 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7774\" (UniqueName: \"kubernetes.io/projected/2ea99c30-bc6e-47b2-9ace-f7d82418e729-kube-api-access-d7774\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.006983 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.007007 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-inventory\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.007131 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ssh-key\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj"
\"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ssh-key\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.013290 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-inventory\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.013872 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.014095 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.014808 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.021667 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.023563 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ssh-key\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.038311 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7774\" (UniqueName: \"kubernetes.io/projected/2ea99c30-bc6e-47b2-9ace-f7d82418e729-kube-api-access-d7774\") pod \"telemetry-openstack-openstack-cell1-p9gcj\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.041445 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:40:35 crc kubenswrapper[4756]: I1205 18:40:35.682597 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-p9gcj"] Dec 05 18:40:36 crc kubenswrapper[4756]: I1205 18:40:36.641722 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" event={"ID":"2ea99c30-bc6e-47b2-9ace-f7d82418e729","Type":"ContainerStarted","Data":"431399ff544569e7a29ae1ff9a69ed4a1d9de91d8f1d8bbce51c33c5fd23f123"} Dec 05 18:40:36 crc kubenswrapper[4756]: I1205 18:40:36.642203 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" event={"ID":"2ea99c30-bc6e-47b2-9ace-f7d82418e729","Type":"ContainerStarted","Data":"a8a248fd73cee02099a85377643f781c84b312df1546bef68f56233723d4756e"} Dec 05 18:40:40 crc kubenswrapper[4756]: I1205 18:40:40.063866 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:40:40 crc kubenswrapper[4756]: E1205 18:40:40.066598 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:40:56 crc kubenswrapper[4756]: I1205 18:40:56.049810 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:40:56 crc kubenswrapper[4756]: E1205 18:40:56.050736 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:41:07 crc kubenswrapper[4756]: I1205 18:41:07.052287 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:41:07 crc kubenswrapper[4756]: E1205 18:41:07.053123 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:41:18 crc kubenswrapper[4756]: I1205 18:41:18.051163 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:41:18 crc kubenswrapper[4756]: E1205 18:41:18.052734 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:41:33 crc kubenswrapper[4756]: I1205 18:41:33.050478 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:41:33 crc kubenswrapper[4756]: E1205 18:41:33.051465 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:41:44 crc kubenswrapper[4756]: I1205 18:41:44.049617 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:41:44 crc kubenswrapper[4756]: E1205 18:41:44.050801 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:41:58 crc kubenswrapper[4756]: I1205 18:41:58.050544 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:41:58 crc kubenswrapper[4756]: E1205 18:41:58.051669 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:42:09 crc kubenswrapper[4756]: I1205 18:42:09.050339 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:42:09 crc kubenswrapper[4756]: E1205 18:42:09.051403 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:42:20 crc kubenswrapper[4756]: I1205 18:42:20.063124 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:42:20 crc kubenswrapper[4756]: E1205 18:42:20.064142 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:42:33 crc kubenswrapper[4756]: I1205 18:42:33.051157 4756 
scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:42:33 crc kubenswrapper[4756]: E1205 18:42:33.052434 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:42:44 crc kubenswrapper[4756]: I1205 18:42:44.050882 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:42:44 crc kubenswrapper[4756]: E1205 18:42:44.052752 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:42:58 crc kubenswrapper[4756]: I1205 18:42:58.050102 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:42:58 crc kubenswrapper[4756]: E1205 18:42:58.050866 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:43:10 crc kubenswrapper[4756]: I1205 18:43:10.058126 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:43:10 crc kubenswrapper[4756]: E1205 18:43:10.059367 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:43:22 crc kubenswrapper[4756]: I1205 18:43:22.050734 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:43:22 crc kubenswrapper[4756]: E1205 18:43:22.051950 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:43:33 crc kubenswrapper[4756]: I1205 18:43:33.051548 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:43:33 crc kubenswrapper[4756]: E1205 18:43:33.054485 4756 
Dec 05 18:43:44 crc kubenswrapper[4756]: I1205 18:43:44.050562 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a"
Dec 05 18:43:44 crc kubenswrapper[4756]: E1205 18:43:44.052094 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:43:58 crc kubenswrapper[4756]: I1205 18:43:58.051152 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a"
Dec 05 18:43:58 crc kubenswrapper[4756]: E1205 18:43:58.052158 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:44:11 crc kubenswrapper[4756]: I1205 18:44:11.050569 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a"
Dec 05 18:44:11 crc kubenswrapper[4756]: E1205 18:44:11.051826 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56"
Dec 05 18:44:13 crc kubenswrapper[4756]: I1205 18:44:13.787784 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" podStartSLOduration=219.381074382 podStartE2EDuration="3m39.787758858s" podCreationTimestamp="2025-12-05 18:40:34 +0000 UTC" firstStartedPulling="2025-12-05 18:40:35.687123432 +0000 UTC m=+8695.925796944" lastFinishedPulling="2025-12-05 18:40:36.093807898 +0000 UTC m=+8696.332481420" observedRunningTime="2025-12-05 18:40:36.676754367 +0000 UTC m=+8696.915427919" watchObservedRunningTime="2025-12-05 18:44:13.787758858 +0000 UTC m=+8914.026432410"
Dec 05 18:44:13 crc kubenswrapper[4756]: I1205 18:44:13.813031 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9k28m"]
Dec 05 18:44:13 crc kubenswrapper[4756]: I1205 18:44:13.820565 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:13 crc kubenswrapper[4756]: I1205 18:44:13.863322 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9k28m"]
Dec 05 18:44:13 crc kubenswrapper[4756]: I1205 18:44:13.931923 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kckqq\" (UniqueName: \"kubernetes.io/projected/550f70ae-6778-4c15-97e1-6d961121837d-kube-api-access-kckqq\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:13 crc kubenswrapper[4756]: I1205 18:44:13.933362 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-utilities\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:13 crc kubenswrapper[4756]: I1205 18:44:13.933476 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-catalog-content\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:14 crc kubenswrapper[4756]: I1205 18:44:14.036184 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-utilities\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:14 crc kubenswrapper[4756]: I1205 18:44:14.036303 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-catalog-content\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:14 crc kubenswrapper[4756]: I1205 18:44:14.036339 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kckqq\" (UniqueName: \"kubernetes.io/projected/550f70ae-6778-4c15-97e1-6d961121837d-kube-api-access-kckqq\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:14 crc kubenswrapper[4756]: I1205 18:44:14.036740 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-catalog-content\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:14 crc kubenswrapper[4756]: I1205 18:44:14.036966 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-utilities\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
Dec 05 18:44:14 crc kubenswrapper[4756]: I1205 18:44:14.061810 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kckqq\" (UniqueName: \"kubernetes.io/projected/550f70ae-6778-4c15-97e1-6d961121837d-kube-api-access-kckqq\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m"
"MountVolume.SetUp succeeded for volume \"kube-api-access-kckqq\" (UniqueName: \"kubernetes.io/projected/550f70ae-6778-4c15-97e1-6d961121837d-kube-api-access-kckqq\") pod \"community-operators-9k28m\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " pod="openshift-marketplace/community-operators-9k28m" Dec 05 18:44:14 crc kubenswrapper[4756]: I1205 18:44:14.167086 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9k28m" Dec 05 18:44:14 crc kubenswrapper[4756]: I1205 18:44:14.724789 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9k28m"] Dec 05 18:44:15 crc kubenswrapper[4756]: I1205 18:44:15.493869 4756 generic.go:334] "Generic (PLEG): container finished" podID="550f70ae-6778-4c15-97e1-6d961121837d" containerID="89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9" exitCode=0 Dec 05 18:44:15 crc kubenswrapper[4756]: I1205 18:44:15.493928 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k28m" event={"ID":"550f70ae-6778-4c15-97e1-6d961121837d","Type":"ContainerDied","Data":"89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9"} Dec 05 18:44:15 crc kubenswrapper[4756]: I1205 18:44:15.494203 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k28m" event={"ID":"550f70ae-6778-4c15-97e1-6d961121837d","Type":"ContainerStarted","Data":"604b98ae4a15d5469f17857fd9a26d79ab8d7b2eebde504190b8ac26a4a5471e"} Dec 05 18:44:15 crc kubenswrapper[4756]: I1205 18:44:15.497290 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 18:44:18 crc kubenswrapper[4756]: I1205 18:44:18.536571 4756 generic.go:334] "Generic (PLEG): container finished" podID="550f70ae-6778-4c15-97e1-6d961121837d" containerID="16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d" exitCode=0 Dec 05 18:44:18 crc kubenswrapper[4756]: I1205 18:44:18.536656 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k28m" event={"ID":"550f70ae-6778-4c15-97e1-6d961121837d","Type":"ContainerDied","Data":"16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d"} Dec 05 18:44:19 crc kubenswrapper[4756]: I1205 18:44:19.555403 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k28m" event={"ID":"550f70ae-6778-4c15-97e1-6d961121837d","Type":"ContainerStarted","Data":"5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a"} Dec 05 18:44:19 crc kubenswrapper[4756]: I1205 18:44:19.593843 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9k28m" podStartSLOduration=3.113509805 podStartE2EDuration="6.593822744s" podCreationTimestamp="2025-12-05 18:44:13 +0000 UTC" firstStartedPulling="2025-12-05 18:44:15.49594192 +0000 UTC m=+8915.734615442" lastFinishedPulling="2025-12-05 18:44:18.976254829 +0000 UTC m=+8919.214928381" observedRunningTime="2025-12-05 18:44:19.588465351 +0000 UTC m=+8919.827138923" watchObservedRunningTime="2025-12-05 18:44:19.593822744 +0000 UTC m=+8919.832496266" Dec 05 18:44:24 crc kubenswrapper[4756]: I1205 18:44:24.168290 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9k28m" Dec 05 18:44:24 crc kubenswrapper[4756]: I1205 18:44:24.169020 4756 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9k28m" Dec 05 18:44:24 crc kubenswrapper[4756]: I1205 18:44:24.262691 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9k28m" Dec 05 18:44:24 crc kubenswrapper[4756]: I1205 18:44:24.692574 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9k28m" Dec 05 18:44:24 crc kubenswrapper[4756]: I1205 18:44:24.787883 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9k28m"] Dec 05 18:44:26 crc kubenswrapper[4756]: I1205 18:44:26.051500 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:44:26 crc kubenswrapper[4756]: I1205 18:44:26.658588 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"86da3e72fd2da091d3bf1ed9c7603b3e808b242b9c238035f7cb19bb181b9980"} Dec 05 18:44:26 crc kubenswrapper[4756]: I1205 18:44:26.658800 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9k28m" podUID="550f70ae-6778-4c15-97e1-6d961121837d" containerName="registry-server" containerID="cri-o://5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a" gracePeriod=2 Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.267526 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9k28m" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.417832 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-catalog-content\") pod \"550f70ae-6778-4c15-97e1-6d961121837d\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.418037 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-utilities\") pod \"550f70ae-6778-4c15-97e1-6d961121837d\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.418064 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kckqq\" (UniqueName: \"kubernetes.io/projected/550f70ae-6778-4c15-97e1-6d961121837d-kube-api-access-kckqq\") pod \"550f70ae-6778-4c15-97e1-6d961121837d\" (UID: \"550f70ae-6778-4c15-97e1-6d961121837d\") " Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.420146 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-utilities" (OuterVolumeSpecName: "utilities") pod "550f70ae-6778-4c15-97e1-6d961121837d" (UID: "550f70ae-6778-4c15-97e1-6d961121837d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.424931 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550f70ae-6778-4c15-97e1-6d961121837d-kube-api-access-kckqq" (OuterVolumeSpecName: "kube-api-access-kckqq") pod "550f70ae-6778-4c15-97e1-6d961121837d" (UID: "550f70ae-6778-4c15-97e1-6d961121837d"). InnerVolumeSpecName "kube-api-access-kckqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.482782 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "550f70ae-6778-4c15-97e1-6d961121837d" (UID: "550f70ae-6778-4c15-97e1-6d961121837d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.521962 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.522322 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f70ae-6778-4c15-97e1-6d961121837d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.522377 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kckqq\" (UniqueName: \"kubernetes.io/projected/550f70ae-6778-4c15-97e1-6d961121837d-kube-api-access-kckqq\") on node \"crc\" DevicePath \"\"" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.675585 4756 generic.go:334] "Generic (PLEG): container finished" podID="550f70ae-6778-4c15-97e1-6d961121837d" containerID="5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a" exitCode=0 Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.675647 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k28m" event={"ID":"550f70ae-6778-4c15-97e1-6d961121837d","Type":"ContainerDied","Data":"5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a"} Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.675680 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9k28m" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.675700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k28m" event={"ID":"550f70ae-6778-4c15-97e1-6d961121837d","Type":"ContainerDied","Data":"604b98ae4a15d5469f17857fd9a26d79ab8d7b2eebde504190b8ac26a4a5471e"} Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.675729 4756 scope.go:117] "RemoveContainer" containerID="5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.709831 4756 scope.go:117] "RemoveContainer" containerID="16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.737273 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9k28m"] Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.755657 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9k28m"] Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.765713 4756 scope.go:117] "RemoveContainer" containerID="89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.831054 4756 scope.go:117] "RemoveContainer" containerID="5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a" Dec 05 18:44:27 crc kubenswrapper[4756]: E1205 18:44:27.832209 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a\": container with ID starting with 5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a not found: ID does not exist" containerID="5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.832270 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a"} err="failed to get container status \"5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a\": rpc error: code = NotFound desc = could not find container \"5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a\": container with ID starting with 5d23b74eab65becdca5401233be84c2dccf7826ce6724c8571f33589b181617a not found: ID does not exist" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.832316 4756 scope.go:117] "RemoveContainer" containerID="16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d" Dec 05 18:44:27 crc kubenswrapper[4756]: E1205 18:44:27.833000 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d\": container with ID starting with 16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d not found: ID does not exist" containerID="16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.833069 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d"} err="failed to get container status \"16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d\": rpc error: code = NotFound desc = could not find 
container \"16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d\": container with ID starting with 16aee0a2c11dbdc746a16e24684aa847b5647f4dddc1be4c713e1e300050147d not found: ID does not exist" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.833115 4756 scope.go:117] "RemoveContainer" containerID="89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9" Dec 05 18:44:27 crc kubenswrapper[4756]: E1205 18:44:27.833760 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9\": container with ID starting with 89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9 not found: ID does not exist" containerID="89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9" Dec 05 18:44:27 crc kubenswrapper[4756]: I1205 18:44:27.833804 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9"} err="failed to get container status \"89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9\": rpc error: code = NotFound desc = could not find container \"89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9\": container with ID starting with 89e8cd7ae1b53b802372c8db1385af94cf8fdb4fbe357197acd202fe889161e9 not found: ID does not exist" Dec 05 18:44:28 crc kubenswrapper[4756]: I1205 18:44:28.072662 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550f70ae-6778-4c15-97e1-6d961121837d" path="/var/lib/kubelet/pods/550f70ae-6778-4c15-97e1-6d961121837d/volumes" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.177147 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw"] Dec 05 18:45:00 crc kubenswrapper[4756]: E1205 18:45:00.178491 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550f70ae-6778-4c15-97e1-6d961121837d" containerName="extract-utilities" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.178516 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="550f70ae-6778-4c15-97e1-6d961121837d" containerName="extract-utilities" Dec 05 18:45:00 crc kubenswrapper[4756]: E1205 18:45:00.178578 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550f70ae-6778-4c15-97e1-6d961121837d" containerName="extract-content" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.178591 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="550f70ae-6778-4c15-97e1-6d961121837d" containerName="extract-content" Dec 05 18:45:00 crc kubenswrapper[4756]: E1205 18:45:00.178621 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550f70ae-6778-4c15-97e1-6d961121837d" containerName="registry-server" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.178633 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="550f70ae-6778-4c15-97e1-6d961121837d" containerName="registry-server" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.179045 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="550f70ae-6778-4c15-97e1-6d961121837d" containerName="registry-server" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.180402 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.182826 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.183617 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.189268 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw"] Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.361474 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsf4m\" (UniqueName: \"kubernetes.io/projected/7bedc3fc-8528-41aa-914c-f47458c02606-kube-api-access-tsf4m\") pod \"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.361572 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bedc3fc-8528-41aa-914c-f47458c02606-config-volume\") pod \"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.362190 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7bedc3fc-8528-41aa-914c-f47458c02606-secret-volume\") pod \"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.464098 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsf4m\" (UniqueName: \"kubernetes.io/projected/7bedc3fc-8528-41aa-914c-f47458c02606-kube-api-access-tsf4m\") pod \"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.464173 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bedc3fc-8528-41aa-914c-f47458c02606-config-volume\") pod \"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.464389 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7bedc3fc-8528-41aa-914c-f47458c02606-secret-volume\") pod \"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.466065 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bedc3fc-8528-41aa-914c-f47458c02606-config-volume\") pod 
\"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.473951 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7bedc3fc-8528-41aa-914c-f47458c02606-secret-volume\") pod \"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.500541 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsf4m\" (UniqueName: \"kubernetes.io/projected/7bedc3fc-8528-41aa-914c-f47458c02606-kube-api-access-tsf4m\") pod \"collect-profiles-29416005-hxkmw\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:00 crc kubenswrapper[4756]: I1205 18:45:00.520706 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:01 crc kubenswrapper[4756]: I1205 18:45:01.041799 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw"] Dec 05 18:45:01 crc kubenswrapper[4756]: I1205 18:45:01.167998 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" event={"ID":"7bedc3fc-8528-41aa-914c-f47458c02606","Type":"ContainerStarted","Data":"4c284c777c984e0b2c5bba537940da7a3e671ca44c36b90e2f0554ca9b3ed545"} Dec 05 18:45:02 crc kubenswrapper[4756]: I1205 18:45:02.181591 4756 generic.go:334] "Generic (PLEG): container finished" podID="7bedc3fc-8528-41aa-914c-f47458c02606" containerID="66b20d10b1c8e71d9de80147a03faf317af6aab1122a562d65cb48d7736a337e" exitCode=0 Dec 05 18:45:02 crc kubenswrapper[4756]: I1205 18:45:02.181711 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" event={"ID":"7bedc3fc-8528-41aa-914c-f47458c02606","Type":"ContainerDied","Data":"66b20d10b1c8e71d9de80147a03faf317af6aab1122a562d65cb48d7736a337e"} Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.766065 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.862991 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bedc3fc-8528-41aa-914c-f47458c02606-config-volume\") pod \"7bedc3fc-8528-41aa-914c-f47458c02606\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.863192 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7bedc3fc-8528-41aa-914c-f47458c02606-secret-volume\") pod \"7bedc3fc-8528-41aa-914c-f47458c02606\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.863225 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsf4m\" (UniqueName: \"kubernetes.io/projected/7bedc3fc-8528-41aa-914c-f47458c02606-kube-api-access-tsf4m\") pod \"7bedc3fc-8528-41aa-914c-f47458c02606\" (UID: \"7bedc3fc-8528-41aa-914c-f47458c02606\") " Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.864223 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bedc3fc-8528-41aa-914c-f47458c02606-config-volume" (OuterVolumeSpecName: "config-volume") pod "7bedc3fc-8528-41aa-914c-f47458c02606" (UID: "7bedc3fc-8528-41aa-914c-f47458c02606"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.869363 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bedc3fc-8528-41aa-914c-f47458c02606-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7bedc3fc-8528-41aa-914c-f47458c02606" (UID: "7bedc3fc-8528-41aa-914c-f47458c02606"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.870248 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bedc3fc-8528-41aa-914c-f47458c02606-kube-api-access-tsf4m" (OuterVolumeSpecName: "kube-api-access-tsf4m") pod "7bedc3fc-8528-41aa-914c-f47458c02606" (UID: "7bedc3fc-8528-41aa-914c-f47458c02606"). InnerVolumeSpecName "kube-api-access-tsf4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.965619 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bedc3fc-8528-41aa-914c-f47458c02606-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.965862 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7bedc3fc-8528-41aa-914c-f47458c02606-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:03 crc kubenswrapper[4756]: I1205 18:45:03.965948 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsf4m\" (UniqueName: \"kubernetes.io/projected/7bedc3fc-8528-41aa-914c-f47458c02606-kube-api-access-tsf4m\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:04 crc kubenswrapper[4756]: I1205 18:45:04.213762 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" event={"ID":"7bedc3fc-8528-41aa-914c-f47458c02606","Type":"ContainerDied","Data":"4c284c777c984e0b2c5bba537940da7a3e671ca44c36b90e2f0554ca9b3ed545"} Dec 05 18:45:04 crc kubenswrapper[4756]: I1205 18:45:04.213815 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c284c777c984e0b2c5bba537940da7a3e671ca44c36b90e2f0554ca9b3ed545" Dec 05 18:45:04 crc kubenswrapper[4756]: I1205 18:45:04.213852 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416005-hxkmw" Dec 05 18:45:04 crc kubenswrapper[4756]: I1205 18:45:04.849492 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4"] Dec 05 18:45:04 crc kubenswrapper[4756]: I1205 18:45:04.866094 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415960-svbg4"] Dec 05 18:45:06 crc kubenswrapper[4756]: I1205 18:45:06.070488 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99dffd73-862d-44e3-ac15-ed5c4cb994c8" path="/var/lib/kubelet/pods/99dffd73-862d-44e3-ac15-ed5c4cb994c8/volumes" Dec 05 18:45:19 crc kubenswrapper[4756]: I1205 18:45:19.735019 4756 scope.go:117] "RemoveContainer" containerID="60e10a65feeaa43aa3a9aa7b101d7a2e3941380e134280e1be30c0775a99dd9c" Dec 05 18:45:28 crc kubenswrapper[4756]: I1205 18:45:28.534506 4756 generic.go:334] "Generic (PLEG): container finished" podID="2ea99c30-bc6e-47b2-9ace-f7d82418e729" containerID="431399ff544569e7a29ae1ff9a69ed4a1d9de91d8f1d8bbce51c33c5fd23f123" exitCode=0 Dec 05 18:45:28 crc kubenswrapper[4756]: I1205 18:45:28.534597 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" event={"ID":"2ea99c30-bc6e-47b2-9ace-f7d82418e729","Type":"ContainerDied","Data":"431399ff544569e7a29ae1ff9a69ed4a1d9de91d8f1d8bbce51c33c5fd23f123"} Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.092931 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.206083 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-1\") pod \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.206162 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-telemetry-combined-ca-bundle\") pod \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.206205 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-inventory\") pod \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.206331 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7774\" (UniqueName: \"kubernetes.io/projected/2ea99c30-bc6e-47b2-9ace-f7d82418e729-kube-api-access-d7774\") pod \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.206574 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-0\") pod \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.206609 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-2\") pod \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.206664 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ssh-key\") pod \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\" (UID: \"2ea99c30-bc6e-47b2-9ace-f7d82418e729\") " Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.214036 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea99c30-bc6e-47b2-9ace-f7d82418e729-kube-api-access-d7774" (OuterVolumeSpecName: "kube-api-access-d7774") pod "2ea99c30-bc6e-47b2-9ace-f7d82418e729" (UID: "2ea99c30-bc6e-47b2-9ace-f7d82418e729"). InnerVolumeSpecName "kube-api-access-d7774". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.214763 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2ea99c30-bc6e-47b2-9ace-f7d82418e729" (UID: "2ea99c30-bc6e-47b2-9ace-f7d82418e729"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.238759 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2ea99c30-bc6e-47b2-9ace-f7d82418e729" (UID: "2ea99c30-bc6e-47b2-9ace-f7d82418e729"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.256673 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-inventory" (OuterVolumeSpecName: "inventory") pod "2ea99c30-bc6e-47b2-9ace-f7d82418e729" (UID: "2ea99c30-bc6e-47b2-9ace-f7d82418e729"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.257141 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2ea99c30-bc6e-47b2-9ace-f7d82418e729" (UID: "2ea99c30-bc6e-47b2-9ace-f7d82418e729"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.257685 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2ea99c30-bc6e-47b2-9ace-f7d82418e729" (UID: "2ea99c30-bc6e-47b2-9ace-f7d82418e729"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.259626 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2ea99c30-bc6e-47b2-9ace-f7d82418e729" (UID: "2ea99c30-bc6e-47b2-9ace-f7d82418e729"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.308818 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.308854 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.308870 4756 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.308884 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.308897 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7774\" (UniqueName: \"kubernetes.io/projected/2ea99c30-bc6e-47b2-9ace-f7d82418e729-kube-api-access-d7774\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.308908 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.308935 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2ea99c30-bc6e-47b2-9ace-f7d82418e729-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.566782 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" event={"ID":"2ea99c30-bc6e-47b2-9ace-f7d82418e729","Type":"ContainerDied","Data":"a8a248fd73cee02099a85377643f781c84b312df1546bef68f56233723d4756e"} Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.566840 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8a248fd73cee02099a85377643f781c84b312df1546bef68f56233723d4756e" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.566921 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-p9gcj" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.689186 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-5d6wl"] Dec 05 18:45:30 crc kubenswrapper[4756]: E1205 18:45:30.689957 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bedc3fc-8528-41aa-914c-f47458c02606" containerName="collect-profiles" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.689980 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bedc3fc-8528-41aa-914c-f47458c02606" containerName="collect-profiles" Dec 05 18:45:30 crc kubenswrapper[4756]: E1205 18:45:30.689994 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea99c30-bc6e-47b2-9ace-f7d82418e729" containerName="telemetry-openstack-openstack-cell1" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.690005 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea99c30-bc6e-47b2-9ace-f7d82418e729" containerName="telemetry-openstack-openstack-cell1" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.690257 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea99c30-bc6e-47b2-9ace-f7d82418e729" containerName="telemetry-openstack-openstack-cell1" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.690331 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bedc3fc-8528-41aa-914c-f47458c02606" containerName="collect-profiles" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.691268 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.694389 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.694473 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.694473 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.694617 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.695362 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.744096 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-5d6wl"] Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.831479 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.831969 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-combined-ca-bundle\") pod 
\"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.832031 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.833250 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.833795 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xfnx\" (UniqueName: \"kubernetes.io/projected/03be6150-fc37-4e08-979b-2806de3bf91b-kube-api-access-9xfnx\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.937485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.937663 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xfnx\" (UniqueName: \"kubernetes.io/projected/03be6150-fc37-4e08-979b-2806de3bf91b-kube-api-access-9xfnx\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.937827 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.937889 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.937959 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-agent-neutron-config-0\") pod 
\"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.943080 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.944651 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.945651 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.949022 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:30 crc kubenswrapper[4756]: I1205 18:45:30.970706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xfnx\" (UniqueName: \"kubernetes.io/projected/03be6150-fc37-4e08-979b-2806de3bf91b-kube-api-access-9xfnx\") pod \"neutron-sriov-openstack-openstack-cell1-5d6wl\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:31 crc kubenswrapper[4756]: I1205 18:45:31.026447 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:45:31 crc kubenswrapper[4756]: I1205 18:45:31.646146 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-5d6wl"] Dec 05 18:45:32 crc kubenswrapper[4756]: I1205 18:45:32.593908 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" event={"ID":"03be6150-fc37-4e08-979b-2806de3bf91b","Type":"ContainerStarted","Data":"4134df92c35ac8697e3984bf2b1ccd23eed4c0438ccd5f3aaa5ddab3fe6cb723"} Dec 05 18:45:32 crc kubenswrapper[4756]: I1205 18:45:32.594519 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" event={"ID":"03be6150-fc37-4e08-979b-2806de3bf91b","Type":"ContainerStarted","Data":"ed6cb7329379558601b2c15bbc2431c175937e4e04dafe679ed54c93604f44ef"} Dec 05 18:45:32 crc kubenswrapper[4756]: I1205 18:45:32.630397 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" podStartSLOduration=1.9601627000000001 podStartE2EDuration="2.630371067s" podCreationTimestamp="2025-12-05 18:45:30 +0000 UTC" firstStartedPulling="2025-12-05 18:45:31.657878216 +0000 UTC m=+8991.896551738" lastFinishedPulling="2025-12-05 18:45:32.328086563 +0000 UTC m=+8992.566760105" observedRunningTime="2025-12-05 18:45:32.6135964 +0000 UTC m=+8992.852269922" watchObservedRunningTime="2025-12-05 18:45:32.630371067 +0000 UTC m=+8992.869044609" Dec 05 18:46:52 crc kubenswrapper[4756]: I1205 18:46:52.296285 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:46:52 crc kubenswrapper[4756]: I1205 18:46:52.299220 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.301004 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fvqgf"] Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.309023 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.317301 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fvqgf"] Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.462459 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-catalog-content\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.462527 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-utilities\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.462641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfl5t\" (UniqueName: \"kubernetes.io/projected/1518fa27-bfea-4de2-8107-edc12a9eda77-kube-api-access-jfl5t\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.564281 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-catalog-content\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.564356 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-utilities\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.564404 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfl5t\" (UniqueName: \"kubernetes.io/projected/1518fa27-bfea-4de2-8107-edc12a9eda77-kube-api-access-jfl5t\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.564912 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-utilities\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.565073 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-catalog-content\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.591574 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jfl5t\" (UniqueName: \"kubernetes.io/projected/1518fa27-bfea-4de2-8107-edc12a9eda77-kube-api-access-jfl5t\") pod \"redhat-operators-fvqgf\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:57 crc kubenswrapper[4756]: I1205 18:46:57.658777 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:46:58 crc kubenswrapper[4756]: I1205 18:46:58.191662 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fvqgf"] Dec 05 18:46:58 crc kubenswrapper[4756]: I1205 18:46:58.754915 4756 generic.go:334] "Generic (PLEG): container finished" podID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerID="25150eeeb22574d8777b47d82648cd486eebea96c425b3e76d4ff5b43ccf3d5c" exitCode=0 Dec 05 18:46:58 crc kubenswrapper[4756]: I1205 18:46:58.755032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvqgf" event={"ID":"1518fa27-bfea-4de2-8107-edc12a9eda77","Type":"ContainerDied","Data":"25150eeeb22574d8777b47d82648cd486eebea96c425b3e76d4ff5b43ccf3d5c"} Dec 05 18:46:58 crc kubenswrapper[4756]: I1205 18:46:58.755262 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvqgf" event={"ID":"1518fa27-bfea-4de2-8107-edc12a9eda77","Type":"ContainerStarted","Data":"a62b631e5c952bc0964b7f176081b149cf3c568ef8fca527c9643d94545802f5"} Dec 05 18:46:59 crc kubenswrapper[4756]: I1205 18:46:59.772442 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvqgf" event={"ID":"1518fa27-bfea-4de2-8107-edc12a9eda77","Type":"ContainerStarted","Data":"096105dbe5fab9912033dff8ce0238f86b134acf0c9ef4f0a45c40695fa4b06e"} Dec 05 18:47:00 crc kubenswrapper[4756]: I1205 18:47:00.792677 4756 generic.go:334] "Generic (PLEG): container finished" podID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerID="096105dbe5fab9912033dff8ce0238f86b134acf0c9ef4f0a45c40695fa4b06e" exitCode=0 Dec 05 18:47:00 crc kubenswrapper[4756]: I1205 18:47:00.793126 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvqgf" event={"ID":"1518fa27-bfea-4de2-8107-edc12a9eda77","Type":"ContainerDied","Data":"096105dbe5fab9912033dff8ce0238f86b134acf0c9ef4f0a45c40695fa4b06e"} Dec 05 18:47:04 crc kubenswrapper[4756]: I1205 18:47:04.870977 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvqgf" event={"ID":"1518fa27-bfea-4de2-8107-edc12a9eda77","Type":"ContainerStarted","Data":"348e064651a4dbdc718881464594f26fc395306d8862339ee85938638fca3697"} Dec 05 18:47:04 crc kubenswrapper[4756]: I1205 18:47:04.919472 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fvqgf" podStartSLOduration=3.0653218190000002 podStartE2EDuration="7.919441221s" podCreationTimestamp="2025-12-05 18:46:57 +0000 UTC" firstStartedPulling="2025-12-05 18:46:58.756968629 +0000 UTC m=+9078.995642141" lastFinishedPulling="2025-12-05 18:47:03.611087991 +0000 UTC m=+9083.849761543" observedRunningTime="2025-12-05 18:47:04.894896867 +0000 UTC m=+9085.133570409" watchObservedRunningTime="2025-12-05 18:47:04.919441221 +0000 UTC m=+9085.158114793" Dec 05 18:47:07 crc kubenswrapper[4756]: I1205 18:47:07.658989 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:47:07 crc kubenswrapper[4756]: I1205 18:47:07.661151 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:47:08 crc kubenswrapper[4756]: I1205 18:47:08.722153 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fvqgf" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="registry-server" probeResult="failure" output=< Dec 05 18:47:08 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 18:47:08 crc kubenswrapper[4756]: > Dec 05 18:47:17 crc kubenswrapper[4756]: I1205 18:47:17.741344 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:47:17 crc kubenswrapper[4756]: I1205 18:47:17.832810 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:47:18 crc kubenswrapper[4756]: I1205 18:47:18.011139 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fvqgf"] Dec 05 18:47:19 crc kubenswrapper[4756]: I1205 18:47:19.067620 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fvqgf" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="registry-server" containerID="cri-o://348e064651a4dbdc718881464594f26fc395306d8862339ee85938638fca3697" gracePeriod=2 Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.078513 4756 generic.go:334] "Generic (PLEG): container finished" podID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerID="348e064651a4dbdc718881464594f26fc395306d8862339ee85938638fca3697" exitCode=0 Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.078592 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvqgf" event={"ID":"1518fa27-bfea-4de2-8107-edc12a9eda77","Type":"ContainerDied","Data":"348e064651a4dbdc718881464594f26fc395306d8862339ee85938638fca3697"} Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.078798 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvqgf" event={"ID":"1518fa27-bfea-4de2-8107-edc12a9eda77","Type":"ContainerDied","Data":"a62b631e5c952bc0964b7f176081b149cf3c568ef8fca527c9643d94545802f5"} Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.078809 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a62b631e5c952bc0964b7f176081b149cf3c568ef8fca527c9643d94545802f5" Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.149053 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.259947 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-utilities\") pod \"1518fa27-bfea-4de2-8107-edc12a9eda77\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.260265 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfl5t\" (UniqueName: \"kubernetes.io/projected/1518fa27-bfea-4de2-8107-edc12a9eda77-kube-api-access-jfl5t\") pod \"1518fa27-bfea-4de2-8107-edc12a9eda77\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.260394 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-catalog-content\") pod \"1518fa27-bfea-4de2-8107-edc12a9eda77\" (UID: \"1518fa27-bfea-4de2-8107-edc12a9eda77\") " Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.267223 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-utilities" (OuterVolumeSpecName: "utilities") pod "1518fa27-bfea-4de2-8107-edc12a9eda77" (UID: "1518fa27-bfea-4de2-8107-edc12a9eda77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.283100 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1518fa27-bfea-4de2-8107-edc12a9eda77-kube-api-access-jfl5t" (OuterVolumeSpecName: "kube-api-access-jfl5t") pod "1518fa27-bfea-4de2-8107-edc12a9eda77" (UID: "1518fa27-bfea-4de2-8107-edc12a9eda77"). InnerVolumeSpecName "kube-api-access-jfl5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.363313 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfl5t\" (UniqueName: \"kubernetes.io/projected/1518fa27-bfea-4de2-8107-edc12a9eda77-kube-api-access-jfl5t\") on node \"crc\" DevicePath \"\"" Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.363358 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.388803 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1518fa27-bfea-4de2-8107-edc12a9eda77" (UID: "1518fa27-bfea-4de2-8107-edc12a9eda77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:47:20 crc kubenswrapper[4756]: I1205 18:47:20.465017 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1518fa27-bfea-4de2-8107-edc12a9eda77-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:47:21 crc kubenswrapper[4756]: I1205 18:47:21.091316 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fvqgf" Dec 05 18:47:21 crc kubenswrapper[4756]: I1205 18:47:21.152390 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fvqgf"] Dec 05 18:47:21 crc kubenswrapper[4756]: I1205 18:47:21.172060 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fvqgf"] Dec 05 18:47:22 crc kubenswrapper[4756]: I1205 18:47:22.066319 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" path="/var/lib/kubelet/pods/1518fa27-bfea-4de2-8107-edc12a9eda77/volumes" Dec 05 18:47:22 crc kubenswrapper[4756]: I1205 18:47:22.295520 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:47:22 crc kubenswrapper[4756]: I1205 18:47:22.295577 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.498808 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gfbmd"] Dec 05 18:47:27 crc kubenswrapper[4756]: E1205 18:47:27.499841 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="registry-server" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.499856 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="registry-server" Dec 05 18:47:27 crc kubenswrapper[4756]: E1205 18:47:27.499868 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="extract-content" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.499875 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="extract-content" Dec 05 18:47:27 crc kubenswrapper[4756]: E1205 18:47:27.499908 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="extract-utilities" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.499915 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="extract-utilities" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.500114 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1518fa27-bfea-4de2-8107-edc12a9eda77" containerName="registry-server" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.501814 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.531487 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfbmd"] Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.634535 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-catalog-content\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.634815 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsttl\" (UniqueName: \"kubernetes.io/projected/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-kube-api-access-hsttl\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.634865 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-utilities\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.737177 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsttl\" (UniqueName: \"kubernetes.io/projected/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-kube-api-access-hsttl\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.737244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-utilities\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.737316 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-catalog-content\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.737900 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-utilities\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.737965 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-catalog-content\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.766499 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hsttl\" (UniqueName: \"kubernetes.io/projected/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-kube-api-access-hsttl\") pod \"redhat-marketplace-gfbmd\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:27 crc kubenswrapper[4756]: I1205 18:47:27.832189 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:28 crc kubenswrapper[4756]: I1205 18:47:28.329210 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfbmd"] Dec 05 18:47:29 crc kubenswrapper[4756]: I1205 18:47:29.211482 4756 generic.go:334] "Generic (PLEG): container finished" podID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerID="38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479" exitCode=0 Dec 05 18:47:29 crc kubenswrapper[4756]: I1205 18:47:29.211586 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfbmd" event={"ID":"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe","Type":"ContainerDied","Data":"38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479"} Dec 05 18:47:29 crc kubenswrapper[4756]: I1205 18:47:29.212140 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfbmd" event={"ID":"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe","Type":"ContainerStarted","Data":"31f418cc4af366e9259356aca6d0c55c6dbd08a33f748492de178282000ba8f7"} Dec 05 18:47:31 crc kubenswrapper[4756]: I1205 18:47:31.237212 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfbmd" event={"ID":"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe","Type":"ContainerStarted","Data":"b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3"} Dec 05 18:47:32 crc kubenswrapper[4756]: I1205 18:47:32.254860 4756 generic.go:334] "Generic (PLEG): container finished" podID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerID="b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3" exitCode=0 Dec 05 18:47:32 crc kubenswrapper[4756]: I1205 18:47:32.254936 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfbmd" event={"ID":"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe","Type":"ContainerDied","Data":"b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3"} Dec 05 18:47:33 crc kubenswrapper[4756]: I1205 18:47:33.267044 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfbmd" event={"ID":"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe","Type":"ContainerStarted","Data":"ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19"} Dec 05 18:47:33 crc kubenswrapper[4756]: I1205 18:47:33.298293 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gfbmd" podStartSLOduration=2.844302605 podStartE2EDuration="6.298273612s" podCreationTimestamp="2025-12-05 18:47:27 +0000 UTC" firstStartedPulling="2025-12-05 18:47:29.215142342 +0000 UTC m=+9109.453815894" lastFinishedPulling="2025-12-05 18:47:32.669113339 +0000 UTC m=+9112.907786901" observedRunningTime="2025-12-05 18:47:33.294396939 +0000 UTC m=+9113.533070491" watchObservedRunningTime="2025-12-05 18:47:33.298273612 +0000 UTC m=+9113.536947134" Dec 05 18:47:37 crc kubenswrapper[4756]: I1205 18:47:37.833231 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:37 crc kubenswrapper[4756]: I1205 18:47:37.833809 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:37 crc kubenswrapper[4756]: I1205 18:47:37.892340 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:38 crc kubenswrapper[4756]: I1205 18:47:38.412598 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:38 crc kubenswrapper[4756]: I1205 18:47:38.484765 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfbmd"] Dec 05 18:47:40 crc kubenswrapper[4756]: I1205 18:47:40.353275 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gfbmd" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerName="registry-server" containerID="cri-o://ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19" gracePeriod=2 Dec 05 18:47:40 crc kubenswrapper[4756]: I1205 18:47:40.922780 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:40 crc kubenswrapper[4756]: I1205 18:47:40.959638 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsttl\" (UniqueName: \"kubernetes.io/projected/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-kube-api-access-hsttl\") pod \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " Dec 05 18:47:40 crc kubenswrapper[4756]: I1205 18:47:40.959971 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-catalog-content\") pod \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " Dec 05 18:47:40 crc kubenswrapper[4756]: I1205 18:47:40.960040 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-utilities\") pod \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\" (UID: \"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe\") " Dec 05 18:47:40 crc kubenswrapper[4756]: I1205 18:47:40.962223 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-utilities" (OuterVolumeSpecName: "utilities") pod "8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" (UID: "8ec20d8d-4e50-4a70-bcb8-2d5372c023fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:47:40 crc kubenswrapper[4756]: I1205 18:47:40.986217 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" (UID: "8ec20d8d-4e50-4a70-bcb8-2d5372c023fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.062599 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.062636 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.372183 4756 generic.go:334] "Generic (PLEG): container finished" podID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerID="ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19" exitCode=0 Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.372257 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfbmd" event={"ID":"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe","Type":"ContainerDied","Data":"ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19"} Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.372349 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfbmd" event={"ID":"8ec20d8d-4e50-4a70-bcb8-2d5372c023fe","Type":"ContainerDied","Data":"31f418cc4af366e9259356aca6d0c55c6dbd08a33f748492de178282000ba8f7"} Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.372379 4756 scope.go:117] "RemoveContainer" containerID="ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.372280 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gfbmd" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.403934 4756 scope.go:117] "RemoveContainer" containerID="b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.630215 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-kube-api-access-hsttl" (OuterVolumeSpecName: "kube-api-access-hsttl") pod "8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" (UID: "8ec20d8d-4e50-4a70-bcb8-2d5372c023fe"). InnerVolumeSpecName "kube-api-access-hsttl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.646163 4756 scope.go:117] "RemoveContainer" containerID="38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.675105 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsttl\" (UniqueName: \"kubernetes.io/projected/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe-kube-api-access-hsttl\") on node \"crc\" DevicePath \"\"" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.776535 4756 scope.go:117] "RemoveContainer" containerID="ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19" Dec 05 18:47:41 crc kubenswrapper[4756]: E1205 18:47:41.777300 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19\": container with ID starting with ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19 not found: ID does not exist" containerID="ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.777358 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19"} err="failed to get container status \"ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19\": rpc error: code = NotFound desc = could not find container \"ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19\": container with ID starting with ba80078b6f0d13b06016350904291865fdd4921622a2763d6781b0f37c49ee19 not found: ID does not exist" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.777396 4756 scope.go:117] "RemoveContainer" containerID="b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3" Dec 05 18:47:41 crc kubenswrapper[4756]: E1205 18:47:41.778083 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3\": container with ID starting with b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3 not found: ID does not exist" containerID="b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.778123 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3"} err="failed to get container status \"b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3\": rpc error: code = NotFound desc = could not find container \"b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3\": container with ID starting with b805ca091e2890212f378fa6844ea3c64e16e2dc26f63538cd2e02555b836cf3 not found: ID does not exist" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.778152 4756 scope.go:117] "RemoveContainer" containerID="38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479" Dec 05 18:47:41 crc kubenswrapper[4756]: E1205 18:47:41.778684 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479\": container with ID starting with 38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479 not found: ID does not 
exist" containerID="38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.778713 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479"} err="failed to get container status \"38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479\": rpc error: code = NotFound desc = could not find container \"38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479\": container with ID starting with 38a9741d3d125d36e921b2769ad2a01e508e68c21d6ce32cbc7b2af8b8838479 not found: ID does not exist" Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.848907 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfbmd"] Dec 05 18:47:41 crc kubenswrapper[4756]: I1205 18:47:41.864902 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfbmd"] Dec 05 18:47:42 crc kubenswrapper[4756]: I1205 18:47:42.064497 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" path="/var/lib/kubelet/pods/8ec20d8d-4e50-4a70-bcb8-2d5372c023fe/volumes" Dec 05 18:47:52 crc kubenswrapper[4756]: I1205 18:47:52.295818 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:47:52 crc kubenswrapper[4756]: I1205 18:47:52.296461 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:47:52 crc kubenswrapper[4756]: I1205 18:47:52.296523 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 18:47:52 crc kubenswrapper[4756]: I1205 18:47:52.297699 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86da3e72fd2da091d3bf1ed9c7603b3e808b242b9c238035f7cb19bb181b9980"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 18:47:52 crc kubenswrapper[4756]: I1205 18:47:52.297796 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://86da3e72fd2da091d3bf1ed9c7603b3e808b242b9c238035f7cb19bb181b9980" gracePeriod=600 Dec 05 18:47:53 crc kubenswrapper[4756]: I1205 18:47:53.527982 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="86da3e72fd2da091d3bf1ed9c7603b3e808b242b9c238035f7cb19bb181b9980" exitCode=0 Dec 05 18:47:53 crc kubenswrapper[4756]: I1205 18:47:53.528048 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"86da3e72fd2da091d3bf1ed9c7603b3e808b242b9c238035f7cb19bb181b9980"} Dec 05 18:47:53 crc kubenswrapper[4756]: I1205 18:47:53.528101 4756 scope.go:117] "RemoveContainer" containerID="e6cf2e268c71d31fbba70dd03ba4d3f3c639338aa82965681743b976ca1e371a" Dec 05 18:47:54 crc kubenswrapper[4756]: I1205 18:47:54.538753 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f"} Dec 05 18:50:22 crc kubenswrapper[4756]: I1205 18:50:22.295319 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:50:22 crc kubenswrapper[4756]: I1205 18:50:22.296174 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:50:52 crc kubenswrapper[4756]: I1205 18:50:52.295405 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:50:52 crc kubenswrapper[4756]: I1205 18:50:52.296499 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.295384 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.295958 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.296015 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.297134 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 
18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.297222 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" gracePeriod=600 Dec 05 18:51:22 crc kubenswrapper[4756]: E1205 18:51:22.441462 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.653681 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" exitCode=0 Dec 05 18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.653726 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f"} Dec 05 18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.653794 4756 scope.go:117] "RemoveContainer" containerID="86da3e72fd2da091d3bf1ed9c7603b3e808b242b9c238035f7cb19bb181b9980" Dec 05 18:51:22 crc kubenswrapper[4756]: I1205 18:51:22.654332 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:51:22 crc kubenswrapper[4756]: E1205 18:51:22.654733 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:51:29 crc kubenswrapper[4756]: I1205 18:51:29.747405 4756 generic.go:334] "Generic (PLEG): container finished" podID="03be6150-fc37-4e08-979b-2806de3bf91b" containerID="4134df92c35ac8697e3984bf2b1ccd23eed4c0438ccd5f3aaa5ddab3fe6cb723" exitCode=0 Dec 05 18:51:29 crc kubenswrapper[4756]: I1205 18:51:29.747487 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" event={"ID":"03be6150-fc37-4e08-979b-2806de3bf91b","Type":"ContainerDied","Data":"4134df92c35ac8697e3984bf2b1ccd23eed4c0438ccd5f3aaa5ddab3fe6cb723"} Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.310359 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.471363 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-ssh-key\") pod \"03be6150-fc37-4e08-979b-2806de3bf91b\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.471444 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-inventory\") pod \"03be6150-fc37-4e08-979b-2806de3bf91b\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.471501 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfnx\" (UniqueName: \"kubernetes.io/projected/03be6150-fc37-4e08-979b-2806de3bf91b-kube-api-access-9xfnx\") pod \"03be6150-fc37-4e08-979b-2806de3bf91b\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.471540 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-agent-neutron-config-0\") pod \"03be6150-fc37-4e08-979b-2806de3bf91b\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.471591 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-combined-ca-bundle\") pod \"03be6150-fc37-4e08-979b-2806de3bf91b\" (UID: \"03be6150-fc37-4e08-979b-2806de3bf91b\") " Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.482893 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03be6150-fc37-4e08-979b-2806de3bf91b-kube-api-access-9xfnx" (OuterVolumeSpecName: "kube-api-access-9xfnx") pod "03be6150-fc37-4e08-979b-2806de3bf91b" (UID: "03be6150-fc37-4e08-979b-2806de3bf91b"). InnerVolumeSpecName "kube-api-access-9xfnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.485356 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "03be6150-fc37-4e08-979b-2806de3bf91b" (UID: "03be6150-fc37-4e08-979b-2806de3bf91b"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.521368 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "03be6150-fc37-4e08-979b-2806de3bf91b" (UID: "03be6150-fc37-4e08-979b-2806de3bf91b"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.524902 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-inventory" (OuterVolumeSpecName: "inventory") pod "03be6150-fc37-4e08-979b-2806de3bf91b" (UID: "03be6150-fc37-4e08-979b-2806de3bf91b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.526034 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "03be6150-fc37-4e08-979b-2806de3bf91b" (UID: "03be6150-fc37-4e08-979b-2806de3bf91b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.573709 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.573746 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.573761 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.573774 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfnx\" (UniqueName: \"kubernetes.io/projected/03be6150-fc37-4e08-979b-2806de3bf91b-kube-api-access-9xfnx\") on node \"crc\" DevicePath \"\"" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.573789 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/03be6150-fc37-4e08-979b-2806de3bf91b-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.799675 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" event={"ID":"03be6150-fc37-4e08-979b-2806de3bf91b","Type":"ContainerDied","Data":"ed6cb7329379558601b2c15bbc2431c175937e4e04dafe679ed54c93604f44ef"} Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.799736 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed6cb7329379558601b2c15bbc2431c175937e4e04dafe679ed54c93604f44ef" Dec 05 18:51:31 crc kubenswrapper[4756]: I1205 18:51:31.799779 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d6wl" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.052277 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:51:33 crc kubenswrapper[4756]: E1205 18:51:33.053718 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.165674 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6q7vf"] Dec 05 18:51:33 crc kubenswrapper[4756]: E1205 18:51:33.166866 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerName="registry-server" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.166903 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerName="registry-server" Dec 05 18:51:33 crc kubenswrapper[4756]: E1205 18:51:33.166940 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerName="extract-utilities" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.166955 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerName="extract-utilities" Dec 05 18:51:33 crc kubenswrapper[4756]: E1205 18:51:33.166988 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerName="extract-content" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.167002 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerName="extract-content" Dec 05 18:51:33 crc kubenswrapper[4756]: E1205 18:51:33.167048 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03be6150-fc37-4e08-979b-2806de3bf91b" containerName="neutron-sriov-openstack-openstack-cell1" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.167061 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="03be6150-fc37-4e08-979b-2806de3bf91b" containerName="neutron-sriov-openstack-openstack-cell1" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.167542 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="03be6150-fc37-4e08-979b-2806de3bf91b" containerName="neutron-sriov-openstack-openstack-cell1" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.167600 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec20d8d-4e50-4a70-bcb8-2d5372c023fe" containerName="registry-server" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.170594 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.179757 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6q7vf"] Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.326578 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-utilities\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.327779 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-catalog-content\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.328075 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc7xj\" (UniqueName: \"kubernetes.io/projected/64721f9c-7aeb-4e5b-9635-98f7f5547b11-kube-api-access-fc7xj\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.430645 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-catalog-content\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.430747 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7xj\" (UniqueName: \"kubernetes.io/projected/64721f9c-7aeb-4e5b-9635-98f7f5547b11-kube-api-access-fc7xj\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.430959 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-utilities\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.431708 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-utilities\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.431881 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-catalog-content\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:33 crc kubenswrapper[4756]: I1205 18:51:33.833184 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fc7xj\" (UniqueName: \"kubernetes.io/projected/64721f9c-7aeb-4e5b-9635-98f7f5547b11-kube-api-access-fc7xj\") pod \"certified-operators-6q7vf\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:34 crc kubenswrapper[4756]: I1205 18:51:34.126272 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:34 crc kubenswrapper[4756]: I1205 18:51:34.670190 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6q7vf"] Dec 05 18:51:34 crc kubenswrapper[4756]: I1205 18:51:34.833387 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6q7vf" event={"ID":"64721f9c-7aeb-4e5b-9635-98f7f5547b11","Type":"ContainerStarted","Data":"e6c1dec448ed174cc7b802ba88edc56018672ef47ddb18205f36ae30d59665eb"} Dec 05 18:51:35 crc kubenswrapper[4756]: I1205 18:51:35.848137 4756 generic.go:334] "Generic (PLEG): container finished" podID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerID="4a6beb215fb0b5bf44224207c3ffe26fd2584426ad60f2da2f2b80eb99b36a44" exitCode=0 Dec 05 18:51:35 crc kubenswrapper[4756]: I1205 18:51:35.848230 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6q7vf" event={"ID":"64721f9c-7aeb-4e5b-9635-98f7f5547b11","Type":"ContainerDied","Data":"4a6beb215fb0b5bf44224207c3ffe26fd2584426ad60f2da2f2b80eb99b36a44"} Dec 05 18:51:35 crc kubenswrapper[4756]: I1205 18:51:35.851937 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 18:51:37 crc kubenswrapper[4756]: I1205 18:51:37.885903 4756 generic.go:334] "Generic (PLEG): container finished" podID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerID="f64a57f7008d8ba0249b31e52a4a627df775d68e86ca70bf672a2f7900b229e4" exitCode=0 Dec 05 18:51:37 crc kubenswrapper[4756]: I1205 18:51:37.885968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6q7vf" event={"ID":"64721f9c-7aeb-4e5b-9635-98f7f5547b11","Type":"ContainerDied","Data":"f64a57f7008d8ba0249b31e52a4a627df775d68e86ca70bf672a2f7900b229e4"} Dec 05 18:51:38 crc kubenswrapper[4756]: I1205 18:51:38.905736 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6q7vf" event={"ID":"64721f9c-7aeb-4e5b-9635-98f7f5547b11","Type":"ContainerStarted","Data":"8747afd7e81e4532d6b097bef43de4f23d8b1482153bdd1e182ad25bdea5f950"} Dec 05 18:51:38 crc kubenswrapper[4756]: I1205 18:51:38.947213 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6q7vf" podStartSLOduration=3.466865526 podStartE2EDuration="5.947185331s" podCreationTimestamp="2025-12-05 18:51:33 +0000 UTC" firstStartedPulling="2025-12-05 18:51:35.85151157 +0000 UTC m=+9356.090185122" lastFinishedPulling="2025-12-05 18:51:38.331831405 +0000 UTC m=+9358.570504927" observedRunningTime="2025-12-05 18:51:38.930110386 +0000 UTC m=+9359.168783928" watchObservedRunningTime="2025-12-05 18:51:38.947185331 +0000 UTC m=+9359.185858883" Dec 05 18:51:44 crc kubenswrapper[4756]: I1205 18:51:44.051884 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:51:44 crc kubenswrapper[4756]: E1205 18:51:44.053283 4756 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:51:44 crc kubenswrapper[4756]: I1205 18:51:44.127016 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:44 crc kubenswrapper[4756]: I1205 18:51:44.127084 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:44 crc kubenswrapper[4756]: I1205 18:51:44.193524 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:45 crc kubenswrapper[4756]: I1205 18:51:45.062075 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:45 crc kubenswrapper[4756]: I1205 18:51:45.132915 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6q7vf"] Dec 05 18:51:47 crc kubenswrapper[4756]: I1205 18:51:47.008380 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6q7vf" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerName="registry-server" containerID="cri-o://8747afd7e81e4532d6b097bef43de4f23d8b1482153bdd1e182ad25bdea5f950" gracePeriod=2 Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.025747 4756 generic.go:334] "Generic (PLEG): container finished" podID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerID="8747afd7e81e4532d6b097bef43de4f23d8b1482153bdd1e182ad25bdea5f950" exitCode=0 Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.025832 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6q7vf" event={"ID":"64721f9c-7aeb-4e5b-9635-98f7f5547b11","Type":"ContainerDied","Data":"8747afd7e81e4532d6b097bef43de4f23d8b1482153bdd1e182ad25bdea5f950"} Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.601560 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.766046 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-catalog-content\") pod \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.766138 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-utilities\") pod \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.766177 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc7xj\" (UniqueName: \"kubernetes.io/projected/64721f9c-7aeb-4e5b-9635-98f7f5547b11-kube-api-access-fc7xj\") pod \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\" (UID: \"64721f9c-7aeb-4e5b-9635-98f7f5547b11\") " Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.768097 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-utilities" (OuterVolumeSpecName: "utilities") pod "64721f9c-7aeb-4e5b-9635-98f7f5547b11" (UID: "64721f9c-7aeb-4e5b-9635-98f7f5547b11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.775986 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64721f9c-7aeb-4e5b-9635-98f7f5547b11-kube-api-access-fc7xj" (OuterVolumeSpecName: "kube-api-access-fc7xj") pod "64721f9c-7aeb-4e5b-9635-98f7f5547b11" (UID: "64721f9c-7aeb-4e5b-9635-98f7f5547b11"). InnerVolumeSpecName "kube-api-access-fc7xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.839367 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64721f9c-7aeb-4e5b-9635-98f7f5547b11" (UID: "64721f9c-7aeb-4e5b-9635-98f7f5547b11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.871089 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.871182 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64721f9c-7aeb-4e5b-9635-98f7f5547b11-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:51:48 crc kubenswrapper[4756]: I1205 18:51:48.871269 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc7xj\" (UniqueName: \"kubernetes.io/projected/64721f9c-7aeb-4e5b-9635-98f7f5547b11-kube-api-access-fc7xj\") on node \"crc\" DevicePath \"\"" Dec 05 18:51:49 crc kubenswrapper[4756]: I1205 18:51:49.039879 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6q7vf" event={"ID":"64721f9c-7aeb-4e5b-9635-98f7f5547b11","Type":"ContainerDied","Data":"e6c1dec448ed174cc7b802ba88edc56018672ef47ddb18205f36ae30d59665eb"} Dec 05 18:51:49 crc kubenswrapper[4756]: I1205 18:51:49.040280 4756 scope.go:117] "RemoveContainer" containerID="8747afd7e81e4532d6b097bef43de4f23d8b1482153bdd1e182ad25bdea5f950" Dec 05 18:51:49 crc kubenswrapper[4756]: I1205 18:51:49.040025 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6q7vf" Dec 05 18:51:49 crc kubenswrapper[4756]: I1205 18:51:49.064243 4756 scope.go:117] "RemoveContainer" containerID="f64a57f7008d8ba0249b31e52a4a627df775d68e86ca70bf672a2f7900b229e4" Dec 05 18:51:49 crc kubenswrapper[4756]: I1205 18:51:49.097461 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6q7vf"] Dec 05 18:51:49 crc kubenswrapper[4756]: I1205 18:51:49.109051 4756 scope.go:117] "RemoveContainer" containerID="4a6beb215fb0b5bf44224207c3ffe26fd2584426ad60f2da2f2b80eb99b36a44" Dec 05 18:51:49 crc kubenswrapper[4756]: I1205 18:51:49.111857 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6q7vf"] Dec 05 18:51:50 crc kubenswrapper[4756]: I1205 18:51:50.077755 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" path="/var/lib/kubelet/pods/64721f9c-7aeb-4e5b-9635-98f7f5547b11/volumes" Dec 05 18:51:58 crc kubenswrapper[4756]: I1205 18:51:58.051199 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:51:58 crc kubenswrapper[4756]: E1205 18:51:58.053846 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:51:59 crc kubenswrapper[4756]: I1205 18:51:59.514608 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 18:51:59 crc kubenswrapper[4756]: I1205 18:51:59.515162 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" 
podUID="3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8ac544e6acb53e59340961cb347b29993be336ec7ebc42fa894e410a02fbe8ba" gracePeriod=30 Dec 05 18:51:59 crc kubenswrapper[4756]: I1205 18:51:59.564756 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 18:51:59 crc kubenswrapper[4756]: I1205 18:51:59.565005 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ca33974b6c799ac1fd07c5c1926a60110f1774b448b224e415720ed404ad30dc" gracePeriod=30 Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.364394 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.364945 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-log" containerID="cri-o://c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f" gracePeriod=30 Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.365079 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-api" containerID="cri-o://1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9" gracePeriod=30 Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.389583 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.389853 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a5bd65a0-3765-47ba-bde4-73f8429a2219" containerName="nova-scheduler-scheduler" containerID="cri-o://527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a" gracePeriod=30 Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.399330 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq"] Dec 05 18:52:00 crc kubenswrapper[4756]: E1205 18:52:00.399965 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerName="registry-server" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.399990 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerName="registry-server" Dec 05 18:52:00 crc kubenswrapper[4756]: E1205 18:52:00.400024 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerName="extract-utilities" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.400033 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerName="extract-utilities" Dec 05 18:52:00 crc kubenswrapper[4756]: E1205 18:52:00.400060 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerName="extract-content" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.400068 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerName="extract-content" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.400315 4756 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="64721f9c-7aeb-4e5b-9635-98f7f5547b11" containerName="registry-server" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.401133 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.404484 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.404676 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.404735 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.404803 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.404994 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dfk9j" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.405025 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.405750 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.408502 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq"] Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.455889 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.456136 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-log" containerID="cri-o://d76bef93df02b6312bb583f8ecb763cce117bdc6e900ecbb8aeec0283f903e37" gracePeriod=30 Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.456250 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-metadata" containerID="cri-o://4d1fcf3379d75e77c4561d70e283c1ed0d5037a8f285a1af30bb896cacbe3713" gracePeriod=30 Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.470708 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.470785 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.470821 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.470856 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.470884 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.470982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.471018 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92s54\" (UniqueName: \"kubernetes.io/projected/dcacd5e8-09f7-447e-8469-6a1462c5916c-kube-api-access-92s54\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.471121 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.471176 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.572968 
4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.573009 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92s54\" (UniqueName: \"kubernetes.io/projected/dcacd5e8-09f7-447e-8469-6a1462c5916c-kube-api-access-92s54\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.573095 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.573136 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.573199 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.573238 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.573264 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.573288 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-ssh-key\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.573307 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.576560 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.580174 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.581441 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.583809 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.584194 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.584670 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.593740 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92s54\" (UniqueName: \"kubernetes.io/projected/dcacd5e8-09f7-447e-8469-6a1462c5916c-kube-api-access-92s54\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.594788 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.595926 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:00 crc kubenswrapper[4756]: I1205 18:52:00.752475 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:52:01 crc kubenswrapper[4756]: I1205 18:52:01.267503 4756 generic.go:334] "Generic (PLEG): container finished" podID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerID="c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f" exitCode=143 Dec 05 18:52:01 crc kubenswrapper[4756]: I1205 18:52:01.267578 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4838409d-0cb6-459b-87d3-525ec6da0fca","Type":"ContainerDied","Data":"c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f"} Dec 05 18:52:01 crc kubenswrapper[4756]: I1205 18:52:01.271901 4756 generic.go:334] "Generic (PLEG): container finished" podID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerID="d76bef93df02b6312bb583f8ecb763cce117bdc6e900ecbb8aeec0283f903e37" exitCode=143 Dec 05 18:52:01 crc kubenswrapper[4756]: I1205 18:52:01.271963 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d2b6ac9-0cdf-4fed-836e-918827fd0d90","Type":"ContainerDied","Data":"d76bef93df02b6312bb583f8ecb763cce117bdc6e900ecbb8aeec0283f903e37"} Dec 05 18:52:01 crc kubenswrapper[4756]: I1205 18:52:01.336259 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq"] Dec 05 18:52:01 crc kubenswrapper[4756]: E1205 18:52:01.463094 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca33974b6c799ac1fd07c5c1926a60110f1774b448b224e415720ed404ad30dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 18:52:01 crc kubenswrapper[4756]: E1205 18:52:01.465239 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ca33974b6c799ac1fd07c5c1926a60110f1774b448b224e415720ed404ad30dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 18:52:01 crc kubenswrapper[4756]: E1205 18:52:01.466739 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca33974b6c799ac1fd07c5c1926a60110f1774b448b224e415720ed404ad30dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 05 18:52:01 crc kubenswrapper[4756]: E1205 18:52:01.466794 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" containerName="nova-cell1-conductor-conductor" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.286008 4756 generic.go:334] "Generic (PLEG): container finished" podID="3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" containerID="8ac544e6acb53e59340961cb347b29993be336ec7ebc42fa894e410a02fbe8ba" exitCode=0 Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.286272 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d","Type":"ContainerDied","Data":"8ac544e6acb53e59340961cb347b29993be336ec7ebc42fa894e410a02fbe8ba"} Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.289596 4756 generic.go:334] "Generic (PLEG): container finished" podID="a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" containerID="ca33974b6c799ac1fd07c5c1926a60110f1774b448b224e415720ed404ad30dc" exitCode=0 Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.289671 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d","Type":"ContainerDied","Data":"ca33974b6c799ac1fd07c5c1926a60110f1774b448b224e415720ed404ad30dc"} Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.292119 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" event={"ID":"dcacd5e8-09f7-447e-8469-6a1462c5916c","Type":"ContainerStarted","Data":"fcd077eb3374221ec9473dd7a7cb001a8c1d32c5c2c38b3d387de26ff213de62"} Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.660358 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.676690 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.834794 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vvdj\" (UniqueName: \"kubernetes.io/projected/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-kube-api-access-4vvdj\") pod \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.834990 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-config-data\") pod \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.835371 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-combined-ca-bundle\") pod \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.835448 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-combined-ca-bundle\") pod \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\" (UID: \"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d\") " Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.835524 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qqmf\" (UniqueName: \"kubernetes.io/projected/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-kube-api-access-6qqmf\") pod \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.835667 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-config-data\") pod \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\" (UID: \"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d\") " Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.842176 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-kube-api-access-4vvdj" (OuterVolumeSpecName: "kube-api-access-4vvdj") pod "a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" (UID: "a3e2ced8-3d41-4e67-afd7-f82da5f22f1d"). InnerVolumeSpecName "kube-api-access-4vvdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.843226 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-kube-api-access-6qqmf" (OuterVolumeSpecName: "kube-api-access-6qqmf") pod "3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" (UID: "3d5c272e-db66-4d1e-b5b6-9fa305a01c8d"). InnerVolumeSpecName "kube-api-access-6qqmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.868892 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" (UID: "3d5c272e-db66-4d1e-b5b6-9fa305a01c8d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.869577 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" (UID: "a3e2ced8-3d41-4e67-afd7-f82da5f22f1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.884142 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-config-data" (OuterVolumeSpecName: "config-data") pod "a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" (UID: "a3e2ced8-3d41-4e67-afd7-f82da5f22f1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.898563 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-config-data" (OuterVolumeSpecName: "config-data") pod "3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" (UID: "3d5c272e-db66-4d1e-b5b6-9fa305a01c8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.939077 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vvdj\" (UniqueName: \"kubernetes.io/projected/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-kube-api-access-4vvdj\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.939135 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.939157 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.939175 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.939194 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qqmf\" (UniqueName: \"kubernetes.io/projected/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-kube-api-access-6qqmf\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:02 crc kubenswrapper[4756]: I1205 18:52:02.939210 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.112589 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.244594 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxv7z\" (UniqueName: \"kubernetes.io/projected/a5bd65a0-3765-47ba-bde4-73f8429a2219-kube-api-access-fxv7z\") pod \"a5bd65a0-3765-47ba-bde4-73f8429a2219\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.244813 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-combined-ca-bundle\") pod \"a5bd65a0-3765-47ba-bde4-73f8429a2219\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.244871 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-config-data\") pod \"a5bd65a0-3765-47ba-bde4-73f8429a2219\" (UID: \"a5bd65a0-3765-47ba-bde4-73f8429a2219\") " Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.250953 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bd65a0-3765-47ba-bde4-73f8429a2219-kube-api-access-fxv7z" (OuterVolumeSpecName: "kube-api-access-fxv7z") pod "a5bd65a0-3765-47ba-bde4-73f8429a2219" (UID: "a5bd65a0-3765-47ba-bde4-73f8429a2219"). InnerVolumeSpecName "kube-api-access-fxv7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.284575 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-config-data" (OuterVolumeSpecName: "config-data") pod "a5bd65a0-3765-47ba-bde4-73f8429a2219" (UID: "a5bd65a0-3765-47ba-bde4-73f8429a2219"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.288134 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5bd65a0-3765-47ba-bde4-73f8429a2219" (UID: "a5bd65a0-3765-47ba-bde4-73f8429a2219"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.303726 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a3e2ced8-3d41-4e67-afd7-f82da5f22f1d","Type":"ContainerDied","Data":"48bbeacd7118fc16c61fdeaef7bed0b724b3619820ca80a3fe7c8a5c674e51d8"} Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.303781 4756 scope.go:117] "RemoveContainer" containerID="ca33974b6c799ac1fd07c5c1926a60110f1774b448b224e415720ed404ad30dc" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.303794 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.310319 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" event={"ID":"dcacd5e8-09f7-447e-8469-6a1462c5916c","Type":"ContainerStarted","Data":"facaf1066236adaf716136dba37aa1a69149929bdeb82f7051b7eec612cebbc5"} Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.315912 4756 generic.go:334] "Generic (PLEG): container finished" podID="a5bd65a0-3765-47ba-bde4-73f8429a2219" containerID="527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a" exitCode=0 Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.316118 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5bd65a0-3765-47ba-bde4-73f8429a2219","Type":"ContainerDied","Data":"527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a"} Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.316194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5bd65a0-3765-47ba-bde4-73f8429a2219","Type":"ContainerDied","Data":"21d06e9eed0b322bd18a009f159c865c967bd859c549993e86cd696488186904"} Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.316357 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.320936 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3d5c272e-db66-4d1e-b5b6-9fa305a01c8d","Type":"ContainerDied","Data":"0c1761025092ba7b5de0bd1ff74ff12a0dd22a01af8d2ca838a8efd0ed87e676"} Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.321393 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.347544 4756 scope.go:117] "RemoveContainer" containerID="527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.348244 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.348291 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bd65a0-3765-47ba-bde4-73f8429a2219-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.348312 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxv7z\" (UniqueName: \"kubernetes.io/projected/a5bd65a0-3765-47ba-bde4-73f8429a2219-kube-api-access-fxv7z\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.354394 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" podStartSLOduration=2.69560998 podStartE2EDuration="3.354372242s" podCreationTimestamp="2025-12-05 18:52:00 +0000 UTC" firstStartedPulling="2025-12-05 18:52:01.347325087 +0000 UTC m=+9381.585998639" lastFinishedPulling="2025-12-05 18:52:02.006087349 +0000 UTC m=+9382.244760901" observedRunningTime="2025-12-05 18:52:03.344942461 +0000 UTC m=+9383.583616013" watchObservedRunningTime="2025-12-05 18:52:03.354372242 +0000 UTC m=+9383.593045784" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.386949 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.397305 4756 scope.go:117] "RemoveContainer" containerID="527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a" Dec 05 18:52:03 crc kubenswrapper[4756]: E1205 18:52:03.398571 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a\": container with ID starting with 527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a not found: ID does not exist" containerID="527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.398609 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a"} err="failed to get container status \"527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a\": rpc error: code = NotFound desc = could not find container \"527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a\": container with ID starting with 527c9ca1484ba7a5a882cb1a20859194d3a358f0a0bd3018b858584d7650191a not found: ID does not exist" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.398632 4756 scope.go:117] "RemoveContainer" containerID="8ac544e6acb53e59340961cb347b29993be336ec7ebc42fa894e410a02fbe8ba" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.402811 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.422674 4756 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: E1205 18:52:03.423210 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bd65a0-3765-47ba-bde4-73f8429a2219" containerName="nova-scheduler-scheduler" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.423228 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bd65a0-3765-47ba-bde4-73f8429a2219" containerName="nova-scheduler-scheduler" Dec 05 18:52:03 crc kubenswrapper[4756]: E1205 18:52:03.423274 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" containerName="nova-cell0-conductor-conductor" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.423282 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" containerName="nova-cell0-conductor-conductor" Dec 05 18:52:03 crc kubenswrapper[4756]: E1205 18:52:03.423296 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" containerName="nova-cell1-conductor-conductor" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.423304 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" containerName="nova-cell1-conductor-conductor" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.423512 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bd65a0-3765-47ba-bde4-73f8429a2219" containerName="nova-scheduler-scheduler" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.423525 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" containerName="nova-cell1-conductor-conductor" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.423548 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" containerName="nova-cell0-conductor-conductor" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.424328 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.427521 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.442772 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.451085 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.462128 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.470814 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.482169 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.490653 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.492083 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.495474 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.506665 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.518088 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.520970 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.524805 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.549091 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.556584 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4056238e-c76d-4c30-af9b-62a5feb11624-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.556693 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgnt\" (UniqueName: \"kubernetes.io/projected/4056238e-c76d-4c30-af9b-62a5feb11624-kube-api-access-8pgnt\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.556786 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4056238e-c76d-4c30-af9b-62a5feb11624-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.603460 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.91:8775/\": read tcp 10.217.0.2:37796->10.217.1.91:8775: read: connection reset by peer" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.603450 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.91:8775/\": read tcp 10.217.0.2:37798->10.217.1.91:8775: read: connection reset by peer" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659128 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgnt\" (UniqueName: \"kubernetes.io/projected/4056238e-c76d-4c30-af9b-62a5feb11624-kube-api-access-8pgnt\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659186 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659215 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4204540-a721-43ea-bc43-b39baf51e18b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659282 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4056238e-c76d-4c30-af9b-62a5feb11624-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659315 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4204540-a721-43ea-bc43-b39baf51e18b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659357 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-config-data\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659450 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4056238e-c76d-4c30-af9b-62a5feb11624-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659473 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w978n\" (UniqueName: \"kubernetes.io/projected/b4204540-a721-43ea-bc43-b39baf51e18b-kube-api-access-w978n\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.659518 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddvh\" (UniqueName: \"kubernetes.io/projected/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-kube-api-access-xddvh\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.664906 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4056238e-c76d-4c30-af9b-62a5feb11624-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.664982 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4056238e-c76d-4c30-af9b-62a5feb11624-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.677302 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgnt\" (UniqueName: \"kubernetes.io/projected/4056238e-c76d-4c30-af9b-62a5feb11624-kube-api-access-8pgnt\") pod \"nova-cell1-conductor-0\" (UID: \"4056238e-c76d-4c30-af9b-62a5feb11624\") " pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.751554 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.761072 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4204540-a721-43ea-bc43-b39baf51e18b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.761150 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-config-data\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.761215 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w978n\" (UniqueName: \"kubernetes.io/projected/b4204540-a721-43ea-bc43-b39baf51e18b-kube-api-access-w978n\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.761253 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddvh\" (UniqueName: \"kubernetes.io/projected/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-kube-api-access-xddvh\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.761313 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.761336 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4204540-a721-43ea-bc43-b39baf51e18b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.767153 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4204540-a721-43ea-bc43-b39baf51e18b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.767227 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-config-data\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.772327 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4204540-a721-43ea-bc43-b39baf51e18b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.777690 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.780785 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddvh\" (UniqueName: \"kubernetes.io/projected/280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba-kube-api-access-xddvh\") pod \"nova-scheduler-0\" (UID: \"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba\") " pod="openstack/nova-scheduler-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.784803 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w978n\" (UniqueName: \"kubernetes.io/projected/b4204540-a721-43ea-bc43-b39baf51e18b-kube-api-access-w978n\") pod \"nova-cell0-conductor-0\" (UID: \"b4204540-a721-43ea-bc43-b39baf51e18b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.887312 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:03 crc kubenswrapper[4756]: I1205 18:52:03.891223 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 18:52:04 crc kubenswrapper[4756]: I1205 18:52:04.066050 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5c272e-db66-4d1e-b5b6-9fa305a01c8d" path="/var/lib/kubelet/pods/3d5c272e-db66-4d1e-b5b6-9fa305a01c8d/volumes" Dec 05 18:52:04 crc kubenswrapper[4756]: I1205 18:52:04.067378 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3e2ced8-3d41-4e67-afd7-f82da5f22f1d" path="/var/lib/kubelet/pods/a3e2ced8-3d41-4e67-afd7-f82da5f22f1d/volumes" Dec 05 18:52:04 crc kubenswrapper[4756]: I1205 18:52:04.068386 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bd65a0-3765-47ba-bde4-73f8429a2219" path="/var/lib/kubelet/pods/a5bd65a0-3765-47ba-bde4-73f8429a2219/volumes" Dec 05 18:52:04 crc kubenswrapper[4756]: I1205 18:52:04.284514 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 18:52:04 crc kubenswrapper[4756]: I1205 18:52:04.355835 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4056238e-c76d-4c30-af9b-62a5feb11624","Type":"ContainerStarted","Data":"5a9d2d4d175513c78fde617f902b5983a21d15aab9d6f15144fcf91b8f3ee770"} Dec 05 18:52:04 crc kubenswrapper[4756]: W1205 18:52:04.458370 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4204540_a721_43ea_bc43_b39baf51e18b.slice/crio-b775809932a8fd3c4bb638aa52b52897248fb978b3bfa87eb4529d8e91f3e817 WatchSource:0}: Error finding container b775809932a8fd3c4bb638aa52b52897248fb978b3bfa87eb4529d8e91f3e817: Status 404 returned error can't find the container with id b775809932a8fd3c4bb638aa52b52897248fb978b3bfa87eb4529d8e91f3e817 Dec 05 18:52:04 crc kubenswrapper[4756]: I1205 18:52:04.461211 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 18:52:04 crc kubenswrapper[4756]: I1205 18:52:04.562681 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 18:52:04 crc kubenswrapper[4756]: W1205 18:52:04.567037 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod280ceb24_00a7_4a73_aa0a_5ee4bc7d62ba.slice/crio-65ba7050f70d5a50a1cdab1b184fcc77ef925327af23d595beb852c01f324788 WatchSource:0}: Error finding container 65ba7050f70d5a50a1cdab1b184fcc77ef925327af23d595beb852c01f324788: Status 404 returned error can't find the container with id 65ba7050f70d5a50a1cdab1b184fcc77ef925327af23d595beb852c01f324788 Dec 05 18:52:05 crc kubenswrapper[4756]: I1205 18:52:05.374215 4756 generic.go:334] "Generic (PLEG): container finished" podID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerID="4d1fcf3379d75e77c4561d70e283c1ed0d5037a8f285a1af30bb896cacbe3713" exitCode=0 Dec 05 18:52:05 crc kubenswrapper[4756]: I1205 18:52:05.374371 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d2b6ac9-0cdf-4fed-836e-918827fd0d90","Type":"ContainerDied","Data":"4d1fcf3379d75e77c4561d70e283c1ed0d5037a8f285a1af30bb896cacbe3713"} Dec 05 18:52:05 crc kubenswrapper[4756]: I1205 18:52:05.376841 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba","Type":"ContainerStarted","Data":"65ba7050f70d5a50a1cdab1b184fcc77ef925327af23d595beb852c01f324788"} Dec 05 18:52:05 crc 
kubenswrapper[4756]: I1205 18:52:05.378701 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b4204540-a721-43ea-bc43-b39baf51e18b","Type":"ContainerStarted","Data":"b775809932a8fd3c4bb638aa52b52897248fb978b3bfa87eb4529d8e91f3e817"} Dec 05 18:52:05 crc kubenswrapper[4756]: I1205 18:52:05.824122 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.013485 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-config-data\") pod \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.013582 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-combined-ca-bundle\") pod \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.013720 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-logs\") pod \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.013752 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-nova-metadata-tls-certs\") pod \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.013832 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xr8g\" (UniqueName: \"kubernetes.io/projected/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-kube-api-access-7xr8g\") pod \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\" (UID: \"3d2b6ac9-0cdf-4fed-836e-918827fd0d90\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.015486 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-logs" (OuterVolumeSpecName: "logs") pod "3d2b6ac9-0cdf-4fed-836e-918827fd0d90" (UID: "3d2b6ac9-0cdf-4fed-836e-918827fd0d90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.020760 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-kube-api-access-7xr8g" (OuterVolumeSpecName: "kube-api-access-7xr8g") pod "3d2b6ac9-0cdf-4fed-836e-918827fd0d90" (UID: "3d2b6ac9-0cdf-4fed-836e-918827fd0d90"). InnerVolumeSpecName "kube-api-access-7xr8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.049911 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d2b6ac9-0cdf-4fed-836e-918827fd0d90" (UID: "3d2b6ac9-0cdf-4fed-836e-918827fd0d90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.054796 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-config-data" (OuterVolumeSpecName: "config-data") pod "3d2b6ac9-0cdf-4fed-836e-918827fd0d90" (UID: "3d2b6ac9-0cdf-4fed-836e-918827fd0d90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.100108 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3d2b6ac9-0cdf-4fed-836e-918827fd0d90" (UID: "3d2b6ac9-0cdf-4fed-836e-918827fd0d90"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.115773 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-logs\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.115808 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.115819 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xr8g\" (UniqueName: \"kubernetes.io/projected/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-kube-api-access-7xr8g\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.115831 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.115841 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d2b6ac9-0cdf-4fed-836e-918827fd0d90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.168293 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.323975 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-config-data\") pod \"4838409d-0cb6-459b-87d3-525ec6da0fca\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.324155 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-internal-tls-certs\") pod \"4838409d-0cb6-459b-87d3-525ec6da0fca\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.324276 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tflj\" (UniqueName: \"kubernetes.io/projected/4838409d-0cb6-459b-87d3-525ec6da0fca-kube-api-access-6tflj\") pod \"4838409d-0cb6-459b-87d3-525ec6da0fca\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.324311 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-combined-ca-bundle\") pod \"4838409d-0cb6-459b-87d3-525ec6da0fca\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.324354 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4838409d-0cb6-459b-87d3-525ec6da0fca-logs\") pod \"4838409d-0cb6-459b-87d3-525ec6da0fca\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.324431 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-public-tls-certs\") pod \"4838409d-0cb6-459b-87d3-525ec6da0fca\" (UID: \"4838409d-0cb6-459b-87d3-525ec6da0fca\") " Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.325602 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4838409d-0cb6-459b-87d3-525ec6da0fca-logs" (OuterVolumeSpecName: "logs") pod "4838409d-0cb6-459b-87d3-525ec6da0fca" (UID: "4838409d-0cb6-459b-87d3-525ec6da0fca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.327845 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4838409d-0cb6-459b-87d3-525ec6da0fca-kube-api-access-6tflj" (OuterVolumeSpecName: "kube-api-access-6tflj") pod "4838409d-0cb6-459b-87d3-525ec6da0fca" (UID: "4838409d-0cb6-459b-87d3-525ec6da0fca"). InnerVolumeSpecName "kube-api-access-6tflj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.358554 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-config-data" (OuterVolumeSpecName: "config-data") pod "4838409d-0cb6-459b-87d3-525ec6da0fca" (UID: "4838409d-0cb6-459b-87d3-525ec6da0fca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.376450 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4838409d-0cb6-459b-87d3-525ec6da0fca" (UID: "4838409d-0cb6-459b-87d3-525ec6da0fca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.383743 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4838409d-0cb6-459b-87d3-525ec6da0fca" (UID: "4838409d-0cb6-459b-87d3-525ec6da0fca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.392813 4756 generic.go:334] "Generic (PLEG): container finished" podID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerID="1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9" exitCode=0 Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.392919 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4838409d-0cb6-459b-87d3-525ec6da0fca","Type":"ContainerDied","Data":"1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9"} Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.392968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4838409d-0cb6-459b-87d3-525ec6da0fca","Type":"ContainerDied","Data":"0d3b31135b5bd4fcdf2f8a665c4d1798dcf9ba63d89f922642d7d3c52c8753b1"} Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.392986 4756 scope.go:117] "RemoveContainer" containerID="1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.393224 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.395307 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4056238e-c76d-4c30-af9b-62a5feb11624","Type":"ContainerStarted","Data":"c9ef3e3b62e460936b7b7b9f2a400e9aa2d038dbf7e9f2f8496f877f23339f6e"} Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.395458 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.397308 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b4204540-a721-43ea-bc43-b39baf51e18b","Type":"ContainerStarted","Data":"a3f083dfeb8f20ffa8a2ffd5679d0a054bb24fb9604fd4a9704e51d9f1f9fbbc"} Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.397344 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.400574 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.400569 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d2b6ac9-0cdf-4fed-836e-918827fd0d90","Type":"ContainerDied","Data":"b26bb72b3061c8d159373749196de355bac13480f901e13b622eeed5b5643890"} Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.405397 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba","Type":"ContainerStarted","Data":"94d5e240ae73048bc960172e0319a892ed55bfaaf5ff59d2590a6121d880a352"} Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.417614 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.417598949 podStartE2EDuration="3.417598949s" podCreationTimestamp="2025-12-05 18:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:52:06.410912631 +0000 UTC m=+9386.649586143" watchObservedRunningTime="2025-12-05 18:52:06.417598949 +0000 UTC m=+9386.656272461" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.426795 4756 scope.go:117] "RemoveContainer" containerID="c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.429027 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.429013383 podStartE2EDuration="3.429013383s" podCreationTimestamp="2025-12-05 18:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:52:06.426485776 +0000 UTC m=+9386.665159288" watchObservedRunningTime="2025-12-05 18:52:06.429013383 +0000 UTC m=+9386.667686895" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.429713 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tflj\" (UniqueName: \"kubernetes.io/projected/4838409d-0cb6-459b-87d3-525ec6da0fca-kube-api-access-6tflj\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.429760 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.429775 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4838409d-0cb6-459b-87d3-525ec6da0fca-logs\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.429786 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.429796 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.429734 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-internal-tls-certs" (OuterVolumeSpecName: 
"internal-tls-certs") pod "4838409d-0cb6-459b-87d3-525ec6da0fca" (UID: "4838409d-0cb6-459b-87d3-525ec6da0fca"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.451397 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.45029479 podStartE2EDuration="3.45029479s" podCreationTimestamp="2025-12-05 18:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:52:06.438786094 +0000 UTC m=+9386.677459616" watchObservedRunningTime="2025-12-05 18:52:06.45029479 +0000 UTC m=+9386.688968302" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.454346 4756 scope.go:117] "RemoveContainer" containerID="1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9" Dec 05 18:52:06 crc kubenswrapper[4756]: E1205 18:52:06.454837 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9\": container with ID starting with 1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9 not found: ID does not exist" containerID="1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.454872 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9"} err="failed to get container status \"1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9\": rpc error: code = NotFound desc = could not find container \"1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9\": container with ID starting with 1fae077c39cddbee5bc0d17bde93f808f74bf8c6b511cc9e54e16a3f01d360b9 not found: ID does not exist" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.454891 4756 scope.go:117] "RemoveContainer" containerID="c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f" Dec 05 18:52:06 crc kubenswrapper[4756]: E1205 18:52:06.455200 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f\": container with ID starting with c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f not found: ID does not exist" containerID="c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.455222 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f"} err="failed to get container status \"c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f\": rpc error: code = NotFound desc = could not find container \"c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f\": container with ID starting with c46b87ac77963258fe8935b2ed2881f4e04f76c200f665df70826623fd681e4f not found: ID does not exist" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.455236 4756 scope.go:117] "RemoveContainer" containerID="4d1fcf3379d75e77c4561d70e283c1ed0d5037a8f285a1af30bb896cacbe3713" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.474456 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.484633 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.503315 4756 scope.go:117] "RemoveContainer" containerID="d76bef93df02b6312bb583f8ecb763cce117bdc6e900ecbb8aeec0283f903e37" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.536293 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 18:52:06 crc kubenswrapper[4756]: E1205 18:52:06.537022 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-metadata" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.537043 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-metadata" Dec 05 18:52:06 crc kubenswrapper[4756]: E1205 18:52:06.537065 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-log" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.537073 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-log" Dec 05 18:52:06 crc kubenswrapper[4756]: E1205 18:52:06.537101 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-log" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.537108 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-log" Dec 05 18:52:06 crc kubenswrapper[4756]: E1205 18:52:06.537115 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-api" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.537121 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-api" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.537317 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-metadata" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.537341 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-log" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.537353 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" containerName="nova-metadata-log" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.537368 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" containerName="nova-api-api" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.538521 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4838409d-0cb6-459b-87d3-525ec6da0fca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.539563 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.545267 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.547431 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.547561 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.640081 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-logs\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.640138 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-config-data\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.640193 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.640273 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.640384 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbk2v\" (UniqueName: \"kubernetes.io/projected/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-kube-api-access-sbk2v\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.731363 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.744108 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-logs\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.744165 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-config-data\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.744196 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.744263 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.744323 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbk2v\" (UniqueName: \"kubernetes.io/projected/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-kube-api-access-sbk2v\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.744937 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-logs\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.749333 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-config-data\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.753105 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.761883 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.762112 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.764326 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbk2v\" (UniqueName: \"kubernetes.io/projected/14e7e8b8-2f32-4db3-a86c-a7a3abe55f93-kube-api-access-sbk2v\") pod \"nova-metadata-0\" (UID: \"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93\") " pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.765282 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.766992 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.774663 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.775067 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.775304 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.775655 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.846077 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g8nb\" (UniqueName: \"kubernetes.io/projected/f028654a-21d7-4fa5-b313-bf4e418a5420-kube-api-access-9g8nb\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.846364 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f028654a-21d7-4fa5-b313-bf4e418a5420-logs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.846471 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.846565 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.846692 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-config-data\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.846830 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-public-tls-certs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.863611 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 18:52:06 crc kubenswrapper[4756]: E1205 18:52:06.879552 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4838409d_0cb6_459b_87d3_525ec6da0fca.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4838409d_0cb6_459b_87d3_525ec6da0fca.slice/crio-0d3b31135b5bd4fcdf2f8a665c4d1798dcf9ba63d89f922642d7d3c52c8753b1\": RecentStats: unable to find data in memory cache]" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.951681 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.951971 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.952005 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-config-data\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.952049 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-public-tls-certs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.952089 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g8nb\" (UniqueName: \"kubernetes.io/projected/f028654a-21d7-4fa5-b313-bf4e418a5420-kube-api-access-9g8nb\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.952122 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f028654a-21d7-4fa5-b313-bf4e418a5420-logs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.952513 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f028654a-21d7-4fa5-b313-bf4e418a5420-logs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.957540 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.958662 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-config-data\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.962854 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-public-tls-certs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.964291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f028654a-21d7-4fa5-b313-bf4e418a5420-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:06 crc kubenswrapper[4756]: I1205 18:52:06.976232 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g8nb\" (UniqueName: \"kubernetes.io/projected/f028654a-21d7-4fa5-b313-bf4e418a5420-kube-api-access-9g8nb\") pod \"nova-api-0\" (UID: \"f028654a-21d7-4fa5-b313-bf4e418a5420\") " pod="openstack/nova-api-0" Dec 05 18:52:07 crc kubenswrapper[4756]: I1205 18:52:07.149139 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 18:52:07 crc kubenswrapper[4756]: W1205 18:52:07.333577 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14e7e8b8_2f32_4db3_a86c_a7a3abe55f93.slice/crio-992b3febc78949a362691d7ed66e9d121c33bc8ba4fb6c3e230efed391cd307f WatchSource:0}: Error finding container 992b3febc78949a362691d7ed66e9d121c33bc8ba4fb6c3e230efed391cd307f: Status 404 returned error can't find the container with id 992b3febc78949a362691d7ed66e9d121c33bc8ba4fb6c3e230efed391cd307f Dec 05 18:52:07 crc kubenswrapper[4756]: I1205 18:52:07.335376 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 18:52:07 crc kubenswrapper[4756]: I1205 18:52:07.429194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93","Type":"ContainerStarted","Data":"992b3febc78949a362691d7ed66e9d121c33bc8ba4fb6c3e230efed391cd307f"} Dec 05 18:52:07 crc kubenswrapper[4756]: I1205 18:52:07.691520 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.066005 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d2b6ac9-0cdf-4fed-836e-918827fd0d90" path="/var/lib/kubelet/pods/3d2b6ac9-0cdf-4fed-836e-918827fd0d90/volumes" Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.067656 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4838409d-0cb6-459b-87d3-525ec6da0fca" path="/var/lib/kubelet/pods/4838409d-0cb6-459b-87d3-525ec6da0fca/volumes" Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.438344 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f028654a-21d7-4fa5-b313-bf4e418a5420","Type":"ContainerStarted","Data":"65758fa98f048efbaa30f38404641247546d8024ef141bc8c33cf7c4958381e9"} Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.438648 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f028654a-21d7-4fa5-b313-bf4e418a5420","Type":"ContainerStarted","Data":"f63126149216467455e05709c65a29467ca457f9d8adb07ec25d11f0f75bce1b"} Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.438661 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f028654a-21d7-4fa5-b313-bf4e418a5420","Type":"ContainerStarted","Data":"6b07a966e21eb6b2f7ff41ff37d17d45c5711425cfb9d83c9a4eea9a0c0d0635"} Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.439805 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93","Type":"ContainerStarted","Data":"da7473bccf7978279f3e6f40c712725505372cbb54d12beafd32fa94ead7b778"} Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.439849 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14e7e8b8-2f32-4db3-a86c-a7a3abe55f93","Type":"ContainerStarted","Data":"3f21415b4232508e4558d744ff7ddc50b5b03d5748548c56ae317b86bc94519e"} Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.483545 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.483514383 podStartE2EDuration="2.483514383s" podCreationTimestamp="2025-12-05 18:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:52:08.458398683 +0000 UTC m=+9388.697072205" watchObservedRunningTime="2025-12-05 18:52:08.483514383 +0000 UTC m=+9388.722187925" Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.499475 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.499453647 podStartE2EDuration="2.499453647s" podCreationTimestamp="2025-12-05 18:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 18:52:08.482441984 +0000 UTC m=+9388.721115516" watchObservedRunningTime="2025-12-05 18:52:08.499453647 +0000 UTC m=+9388.738127169" Dec 05 18:52:08 crc kubenswrapper[4756]: I1205 18:52:08.892702 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 18:52:11 crc kubenswrapper[4756]: I1205 18:52:11.863784 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 18:52:11 crc kubenswrapper[4756]: I1205 18:52:11.864413 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 18:52:13 crc kubenswrapper[4756]: I1205 18:52:13.051273 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:52:13 crc kubenswrapper[4756]: E1205 18:52:13.052526 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:52:13 crc kubenswrapper[4756]: I1205 18:52:13.892803 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 18:52:14 crc kubenswrapper[4756]: I1205 
18:52:14.072629 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 18:52:14 crc kubenswrapper[4756]: I1205 18:52:14.078432 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 18:52:14 crc kubenswrapper[4756]: I1205 18:52:14.087595 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 18:52:14 crc kubenswrapper[4756]: I1205 18:52:14.567510 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 18:52:16 crc kubenswrapper[4756]: I1205 18:52:16.864572 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 18:52:16 crc kubenswrapper[4756]: I1205 18:52:16.865716 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 18:52:17 crc kubenswrapper[4756]: I1205 18:52:17.149917 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 18:52:17 crc kubenswrapper[4756]: I1205 18:52:17.149974 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 18:52:17 crc kubenswrapper[4756]: I1205 18:52:17.883623 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="14e7e8b8-2f32-4db3-a86c-a7a3abe55f93" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 18:52:17 crc kubenswrapper[4756]: I1205 18:52:17.883662 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="14e7e8b8-2f32-4db3-a86c-a7a3abe55f93" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 18:52:18 crc kubenswrapper[4756]: I1205 18:52:18.174685 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f028654a-21d7-4fa5-b313-bf4e418a5420" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 18:52:18 crc kubenswrapper[4756]: I1205 18:52:18.174731 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f028654a-21d7-4fa5-b313-bf4e418a5420" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 18:52:26 crc kubenswrapper[4756]: I1205 18:52:26.050182 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:52:26 crc kubenswrapper[4756]: E1205 18:52:26.051702 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:52:26 crc kubenswrapper[4756]: I1205 18:52:26.870894 4756 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 18:52:26 crc kubenswrapper[4756]: I1205 18:52:26.873855 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 18:52:26 crc kubenswrapper[4756]: I1205 18:52:26.881805 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 18:52:27 crc kubenswrapper[4756]: I1205 18:52:27.161766 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 18:52:27 crc kubenswrapper[4756]: I1205 18:52:27.162724 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 18:52:27 crc kubenswrapper[4756]: I1205 18:52:27.170994 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 18:52:27 crc kubenswrapper[4756]: I1205 18:52:27.171095 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 18:52:27 crc kubenswrapper[4756]: I1205 18:52:27.655854 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 18:52:27 crc kubenswrapper[4756]: I1205 18:52:27.659686 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 18:52:27 crc kubenswrapper[4756]: I1205 18:52:27.668059 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 18:52:41 crc kubenswrapper[4756]: I1205 18:52:41.050956 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:52:41 crc kubenswrapper[4756]: E1205 18:52:41.051893 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:52:53 crc kubenswrapper[4756]: I1205 18:52:53.051066 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:52:53 crc kubenswrapper[4756]: E1205 18:52:53.053284 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:53:04 crc kubenswrapper[4756]: I1205 18:53:04.049535 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:53:04 crc kubenswrapper[4756]: E1205 18:53:04.050241 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:53:17 crc kubenswrapper[4756]: I1205 18:53:17.051337 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:53:17 crc kubenswrapper[4756]: E1205 18:53:17.052812 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:53:20 crc kubenswrapper[4756]: I1205 18:53:20.291289 4756 scope.go:117] "RemoveContainer" containerID="348e064651a4dbdc718881464594f26fc395306d8862339ee85938638fca3697" Dec 05 18:53:20 crc kubenswrapper[4756]: I1205 18:53:20.319913 4756 scope.go:117] "RemoveContainer" containerID="096105dbe5fab9912033dff8ce0238f86b134acf0c9ef4f0a45c40695fa4b06e" Dec 05 18:53:20 crc kubenswrapper[4756]: I1205 18:53:20.357814 4756 scope.go:117] "RemoveContainer" containerID="25150eeeb22574d8777b47d82648cd486eebea96c425b3e76d4ff5b43ccf3d5c" Dec 05 18:53:30 crc kubenswrapper[4756]: I1205 18:53:30.057273 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:53:30 crc kubenswrapper[4756]: E1205 18:53:30.058370 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:53:43 crc kubenswrapper[4756]: I1205 18:53:43.050243 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:53:43 crc kubenswrapper[4756]: E1205 18:53:43.051461 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:53:57 crc kubenswrapper[4756]: I1205 18:53:57.050321 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:53:57 crc kubenswrapper[4756]: E1205 18:53:57.051364 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:54:08 crc kubenswrapper[4756]: I1205 18:54:08.050441 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:54:08 crc kubenswrapper[4756]: E1205 18:54:08.051493 4756 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:54:20 crc kubenswrapper[4756]: I1205 18:54:20.855668 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-shpwx"] Dec 05 18:54:20 crc kubenswrapper[4756]: I1205 18:54:20.859320 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:20 crc kubenswrapper[4756]: I1205 18:54:20.868997 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shpwx"] Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.028065 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-utilities\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.028129 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-catalog-content\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.028245 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrsll\" (UniqueName: \"kubernetes.io/projected/90b7f057-9087-424a-add1-419435702dfd-kube-api-access-vrsll\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.131096 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrsll\" (UniqueName: \"kubernetes.io/projected/90b7f057-9087-424a-add1-419435702dfd-kube-api-access-vrsll\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.131995 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-utilities\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.132219 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-catalog-content\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.132552 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-utilities\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.133014 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-catalog-content\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:21 crc kubenswrapper[4756]: I1205 18:54:21.834561 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrsll\" (UniqueName: \"kubernetes.io/projected/90b7f057-9087-424a-add1-419435702dfd-kube-api-access-vrsll\") pod \"community-operators-shpwx\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:22 crc kubenswrapper[4756]: I1205 18:54:22.049329 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:54:22 crc kubenswrapper[4756]: E1205 18:54:22.049628 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:54:22 crc kubenswrapper[4756]: I1205 18:54:22.101413 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:22 crc kubenswrapper[4756]: I1205 18:54:22.589975 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shpwx"] Dec 05 18:54:23 crc kubenswrapper[4756]: I1205 18:54:23.038655 4756 generic.go:334] "Generic (PLEG): container finished" podID="90b7f057-9087-424a-add1-419435702dfd" containerID="8c18c06f10a6d9237fbb79b9f4009f2c883eb4f159d2c421cb9d01696647bb1f" exitCode=0 Dec 05 18:54:23 crc kubenswrapper[4756]: I1205 18:54:23.038783 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shpwx" event={"ID":"90b7f057-9087-424a-add1-419435702dfd","Type":"ContainerDied","Data":"8c18c06f10a6d9237fbb79b9f4009f2c883eb4f159d2c421cb9d01696647bb1f"} Dec 05 18:54:23 crc kubenswrapper[4756]: I1205 18:54:23.039397 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shpwx" event={"ID":"90b7f057-9087-424a-add1-419435702dfd","Type":"ContainerStarted","Data":"68aedd2003f6fce30b9104d1693c400d47734a6ea6a0fe8eab1c3f1ca1079768"} Dec 05 18:54:26 crc kubenswrapper[4756]: I1205 18:54:26.099219 4756 generic.go:334] "Generic (PLEG): container finished" podID="90b7f057-9087-424a-add1-419435702dfd" containerID="66b238f49a724160cc80e8d99d0ea42c390ee5b5054449fd1ac4362f1f99d22d" exitCode=0 Dec 05 18:54:26 crc kubenswrapper[4756]: I1205 18:54:26.099348 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shpwx" event={"ID":"90b7f057-9087-424a-add1-419435702dfd","Type":"ContainerDied","Data":"66b238f49a724160cc80e8d99d0ea42c390ee5b5054449fd1ac4362f1f99d22d"} Dec 05 18:54:27 crc kubenswrapper[4756]: I1205 18:54:27.113817 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shpwx" event={"ID":"90b7f057-9087-424a-add1-419435702dfd","Type":"ContainerStarted","Data":"28aab3673c8a121a325ea455d181c002ffcf9afcca7c9e030da7c64afa6a3a47"} Dec 05 18:54:27 crc kubenswrapper[4756]: I1205 18:54:27.154040 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-shpwx" podStartSLOduration=3.699938736 podStartE2EDuration="7.154017346s" podCreationTimestamp="2025-12-05 18:54:20 +0000 UTC" firstStartedPulling="2025-12-05 18:54:23.041228225 +0000 UTC m=+9523.279901787" lastFinishedPulling="2025-12-05 18:54:26.495306885 +0000 UTC m=+9526.733980397" observedRunningTime="2025-12-05 18:54:27.136588281 +0000 UTC m=+9527.375261813" watchObservedRunningTime="2025-12-05 18:54:27.154017346 +0000 UTC m=+9527.392690868" Dec 05 18:54:32 crc kubenswrapper[4756]: I1205 18:54:32.102221 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:32 crc kubenswrapper[4756]: I1205 18:54:32.103009 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:32 crc kubenswrapper[4756]: I1205 18:54:32.306199 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:32 crc kubenswrapper[4756]: I1205 18:54:32.362936 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-shpwx"
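The pod_startup_latency_tracker entry above records two figures for community-operators-shpwx: podStartE2EDuration="7.154017346s" (pod creation to the watch observing it running) and podStartSLOduration=3.699938736. The gap is the image-pull window between firstStartedPulling and lastFinishedPulling; subtracting that window (taken from the monotonic m=+... readings) from the end-to-end duration reproduces the SLO figure exactly. A minimal sketch of the arithmetic, inferred from the logged values rather than taken from kubelet source:

```go
// Reproduces the arithmetic in the "Observed pod startup duration" entry
// above: the SLO duration is the end-to-end startup duration minus the
// image-pull window. The relationship is inferred from the logged values,
// not from kubelet source.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Duration {
	d, err := time.ParseDuration(s)
	if err != nil {
		panic(err)
	}
	return d
}

func main() {
	// Monotonic-clock readings from the log's "m=+..." suffixes.
	firstStartedPulling := mustParse("9523.279901787s")
	lastFinishedPulling := mustParse("9526.733980397s")
	e2e := mustParse("7.154017346s") // podStartE2EDuration

	pull := lastFinishedPulling - firstStartedPulling // image-pull window
	slo := e2e - pull
	fmt.Println(pull) // 3.45407861s
	fmt.Println(slo)  // 3.699938736s == logged podStartSLOduration
}
```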
Dec 05 18:54:32 crc kubenswrapper[4756]: I1205 18:54:32.546581 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shpwx"] Dec 05 18:54:34 crc kubenswrapper[4756]: I1205 18:54:34.051583 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:54:34 crc kubenswrapper[4756]: E1205 18:54:34.051850 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:54:34 crc kubenswrapper[4756]: I1205 18:54:34.205119 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-shpwx" podUID="90b7f057-9087-424a-add1-419435702dfd" containerName="registry-server" containerID="cri-o://28aab3673c8a121a325ea455d181c002ffcf9afcca7c9e030da7c64afa6a3a47" gracePeriod=2 Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.219636 4756 generic.go:334] "Generic (PLEG): container finished" podID="90b7f057-9087-424a-add1-419435702dfd" containerID="28aab3673c8a121a325ea455d181c002ffcf9afcca7c9e030da7c64afa6a3a47" exitCode=0 Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.219713 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shpwx" event={"ID":"90b7f057-9087-424a-add1-419435702dfd","Type":"ContainerDied","Data":"28aab3673c8a121a325ea455d181c002ffcf9afcca7c9e030da7c64afa6a3a47"} Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.220061 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shpwx" event={"ID":"90b7f057-9087-424a-add1-419435702dfd","Type":"ContainerDied","Data":"68aedd2003f6fce30b9104d1693c400d47734a6ea6a0fe8eab1c3f1ca1079768"} Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.220084 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68aedd2003f6fce30b9104d1693c400d47734a6ea6a0fe8eab1c3f1ca1079768" Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.255725 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.427876 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-utilities\") pod \"90b7f057-9087-424a-add1-419435702dfd\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.427976 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrsll\" (UniqueName: \"kubernetes.io/projected/90b7f057-9087-424a-add1-419435702dfd-kube-api-access-vrsll\") pod \"90b7f057-9087-424a-add1-419435702dfd\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.428199 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-catalog-content\") pod \"90b7f057-9087-424a-add1-419435702dfd\" (UID: \"90b7f057-9087-424a-add1-419435702dfd\") " Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.429090 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-utilities" (OuterVolumeSpecName: "utilities") pod "90b7f057-9087-424a-add1-419435702dfd" (UID: "90b7f057-9087-424a-add1-419435702dfd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.436713 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b7f057-9087-424a-add1-419435702dfd-kube-api-access-vrsll" (OuterVolumeSpecName: "kube-api-access-vrsll") pod "90b7f057-9087-424a-add1-419435702dfd" (UID: "90b7f057-9087-424a-add1-419435702dfd"). InnerVolumeSpecName "kube-api-access-vrsll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.501567 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90b7f057-9087-424a-add1-419435702dfd" (UID: "90b7f057-9087-424a-add1-419435702dfd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.530829 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.530866 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b7f057-9087-424a-add1-419435702dfd-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:54:35 crc kubenswrapper[4756]: I1205 18:54:35.530882 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrsll\" (UniqueName: \"kubernetes.io/projected/90b7f057-9087-424a-add1-419435702dfd-kube-api-access-vrsll\") on node \"crc\" DevicePath \"\""
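The UnmountVolume / "Volume detached" burst above is the kubelet volume manager reconciling desired state against actual state: once community-operators-shpwx is deleted, its utilities, catalog-content, and kube-api-access-vrsll volumes drop out of the desired set, so the reconciler tears down whatever is still mounted and reports each volume detached. A toy sketch of that set difference follows; the types and names are illustrative, not the kubelet's real volume-manager API.

```go
// Toy model of the desired-state/actual-state reconciliation behind the
// MountVolume / UnmountVolume / "Volume detached" entries in this log.
// Once a pod is deleted, its volumes leave the desired set and the
// reconciler unmounts whatever is left over. Illustrative only.
package main

import "fmt"

func main() {
	// Volumes still referenced by pods assigned to the node.
	desired := map[string]bool{
		"nova-metadata-0/config-data": true,
		"nova-metadata-0/logs":        true,
	}
	// Volumes actually mounted; the deleted catalog pod's are still here.
	actual := map[string]bool{
		"nova-metadata-0/config-data":               true,
		"nova-metadata-0/logs":                      true,
		"community-operators-shpwx/utilities":       true,
		"community-operators-shpwx/catalog-content": true,
	}
	for v := range desired {
		if !actual[v] {
			fmt.Println("operationExecutor.MountVolume started for", v)
		}
	}
	for v := range actual {
		if !desired[v] {
			// Followed by UnmountVolume.TearDown and "Volume detached".
			fmt.Println("operationExecutor.UnmountVolume started for", v)
		}
	}
}
```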
Dec 05 18:54:36 crc kubenswrapper[4756]: I1205 18:54:36.235279 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shpwx" Dec 05 18:54:36 crc kubenswrapper[4756]: I1205 18:54:36.284505 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shpwx"] Dec 05 18:54:36 crc kubenswrapper[4756]: I1205 18:54:36.298020 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-shpwx"] Dec 05 18:54:38 crc kubenswrapper[4756]: I1205 18:54:38.069138 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b7f057-9087-424a-add1-419435702dfd" path="/var/lib/kubelet/pods/90b7f057-9087-424a-add1-419435702dfd/volumes" Dec 05 18:54:47 crc kubenswrapper[4756]: I1205 18:54:47.050945 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:54:47 crc kubenswrapper[4756]: E1205 18:54:47.051951 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:55:00 crc kubenswrapper[4756]: I1205 18:55:00.056452 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:55:00 crc kubenswrapper[4756]: E1205 18:55:00.057472 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:55:15 crc kubenswrapper[4756]: I1205 18:55:15.050501 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:55:15 crc kubenswrapper[4756]: E1205 18:55:15.051438 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:55:27 crc kubenswrapper[4756]: I1205 18:55:27.054212 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:55:27 crc kubenswrapper[4756]: E1205 18:55:27.055323 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:55:41 crc kubenswrapper[4756]: I1205 18:55:41.050235 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f"
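Each "Error syncing pod, skipping ... CrashLoopBackOff" entry above is a pod sync rejected because machine-config-daemon's restart back-off, already at its "back-off 5m0s" ceiling, has not yet expired; the container finally restarts at 18:56:29. Kubelet's restart back-off doubles per crash up to a cap; the 10s initial delay and 5m cap in the sketch below are the commonly documented kubelet defaults, assumed here rather than read from this node's configuration.

```go
// Sketch of the CrashLoopBackOff delay schedule behind the repeated
// "back-off 5m0s restarting failed container" errors above: the restart
// delay doubles per crash until it reaches a cap. 10s initial / 5m cap
// are assumed defaults, not values taken from this cluster.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initialDelay = 10 * time.Second
		maxDelay     = 5 * time.Minute
	)
	delay := initialDelay
	for crash := 1; crash <= 8; crash++ {
		fmt.Printf("crash #%d: next restart attempt in %v\n", crash, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // pinned at 5m0s, as in the log above
		}
	}
	// Until the delay expires, every pod sync logs
	// "Error syncing pod, skipping ... CrashLoopBackOff".
}
```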
Dec 05 18:55:41 crc kubenswrapper[4756]: E1205 18:55:41.051937 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:55:52 crc kubenswrapper[4756]: I1205 18:55:52.050308 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:55:52 crc kubenswrapper[4756]: E1205 18:55:52.051442 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:56:04 crc kubenswrapper[4756]: I1205 18:56:04.050354 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:56:04 crc kubenswrapper[4756]: E1205 18:56:04.051870 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:56:18 crc kubenswrapper[4756]: I1205 18:56:18.050470 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:56:18 crc kubenswrapper[4756]: E1205 18:56:18.053683 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 18:56:29 crc kubenswrapper[4756]: I1205 18:56:29.050507 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:56:29 crc kubenswrapper[4756]: I1205 18:56:29.750933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"3b487db91a35b43b2ad7f9ec84f421006fecc871f530c2ac8a41abd36649097e"} Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.361401 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9f44k"] Dec 05 18:57:18 crc kubenswrapper[4756]: E1205 18:57:18.363216 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b7f057-9087-424a-add1-419435702dfd" containerName="registry-server" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.363251 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b7f057-9087-424a-add1-419435702dfd" 
containerName="registry-server" Dec 05 18:57:18 crc kubenswrapper[4756]: E1205 18:57:18.363296 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b7f057-9087-424a-add1-419435702dfd" containerName="extract-utilities" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.363314 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b7f057-9087-424a-add1-419435702dfd" containerName="extract-utilities" Dec 05 18:57:18 crc kubenswrapper[4756]: E1205 18:57:18.363376 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b7f057-9087-424a-add1-419435702dfd" containerName="extract-content" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.363395 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b7f057-9087-424a-add1-419435702dfd" containerName="extract-content" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.364046 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b7f057-9087-424a-add1-419435702dfd" containerName="registry-server" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.366993 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.380953 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9f44k"] Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.435982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-utilities\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.436719 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bvqb\" (UniqueName: \"kubernetes.io/projected/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-kube-api-access-5bvqb\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.436785 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-catalog-content\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.539837 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-utilities\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.540030 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bvqb\" (UniqueName: \"kubernetes.io/projected/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-kube-api-access-5bvqb\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.540095 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-catalog-content\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.540399 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-utilities\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.540611 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-catalog-content\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.561325 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bvqb\" (UniqueName: \"kubernetes.io/projected/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-kube-api-access-5bvqb\") pod \"redhat-operators-9f44k\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:18 crc kubenswrapper[4756]: I1205 18:57:18.687215 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:19 crc kubenswrapper[4756]: I1205 18:57:19.125154 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9f44k"] Dec 05 18:57:19 crc kubenswrapper[4756]: I1205 18:57:19.382987 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f44k" event={"ID":"8dc1a51d-3d67-4287-ae12-f1241a2c45fa","Type":"ContainerStarted","Data":"3e0e29c21460770bc0356d0052a3cd93ccf4368a2e70d6bebed63f4ffa445d78"} Dec 05 18:57:21 crc kubenswrapper[4756]: I1205 18:57:21.406621 4756 generic.go:334] "Generic (PLEG): container finished" podID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerID="1c2a1ea7744db8eaebb94fa9198124ecd81f78ed4e94f5b88dcaa3ae4f618518" exitCode=0 Dec 05 18:57:21 crc kubenswrapper[4756]: I1205 18:57:21.406745 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f44k" event={"ID":"8dc1a51d-3d67-4287-ae12-f1241a2c45fa","Type":"ContainerDied","Data":"1c2a1ea7744db8eaebb94fa9198124ecd81f78ed4e94f5b88dcaa3ae4f618518"} Dec 05 18:57:21 crc kubenswrapper[4756]: I1205 18:57:21.409405 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 18:57:23 crc kubenswrapper[4756]: I1205 18:57:23.434036 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f44k" event={"ID":"8dc1a51d-3d67-4287-ae12-f1241a2c45fa","Type":"ContainerStarted","Data":"8beb3fc3727a233704a314e682fb613d8602827c29d8c9676ef033d558a8d572"} Dec 05 18:57:32 crc kubenswrapper[4756]: I1205 18:57:32.555746 4756 generic.go:334] "Generic (PLEG): container finished" podID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerID="8beb3fc3727a233704a314e682fb613d8602827c29d8c9676ef033d558a8d572" exitCode=0 Dec 05 18:57:32 crc kubenswrapper[4756]: I1205 18:57:32.555975 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9f44k" event={"ID":"8dc1a51d-3d67-4287-ae12-f1241a2c45fa","Type":"ContainerDied","Data":"8beb3fc3727a233704a314e682fb613d8602827c29d8c9676ef033d558a8d572"} Dec 05 18:57:34 crc kubenswrapper[4756]: I1205 18:57:34.601589 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f44k" event={"ID":"8dc1a51d-3d67-4287-ae12-f1241a2c45fa","Type":"ContainerStarted","Data":"510fa4cfef0257254ce77b6d17ea85ec2cf3528b68e08b59008f4fa7dd196bc3"} Dec 05 18:57:34 crc kubenswrapper[4756]: I1205 18:57:34.632158 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9f44k" podStartSLOduration=4.030664606 podStartE2EDuration="16.632141738s" podCreationTimestamp="2025-12-05 18:57:18 +0000 UTC" firstStartedPulling="2025-12-05 18:57:21.408930211 +0000 UTC m=+9701.647603763" lastFinishedPulling="2025-12-05 18:57:34.010407373 +0000 UTC m=+9714.249080895" observedRunningTime="2025-12-05 18:57:34.621442463 +0000 UTC m=+9714.860115975" watchObservedRunningTime="2025-12-05 18:57:34.632141738 +0000 UTC m=+9714.870815250" Dec 05 18:57:38 crc kubenswrapper[4756]: I1205 18:57:38.688215 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:38 crc kubenswrapper[4756]: I1205 18:57:38.689222 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:39 crc kubenswrapper[4756]: I1205 18:57:39.753369 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9f44k" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="registry-server" probeResult="failure" output=< Dec 05 18:57:39 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 18:57:39 crc kubenswrapper[4756]: > Dec 05 18:57:49 crc kubenswrapper[4756]: I1205 18:57:49.385597 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:49 crc kubenswrapper[4756]: I1205 18:57:49.445963 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:49 crc kubenswrapper[4756]: I1205 18:57:49.629820 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9f44k"] Dec 05 18:57:50 crc kubenswrapper[4756]: I1205 18:57:50.795172 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9f44k" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="registry-server" containerID="cri-o://510fa4cfef0257254ce77b6d17ea85ec2cf3528b68e08b59008f4fa7dd196bc3" gracePeriod=2 Dec 05 18:57:51 crc kubenswrapper[4756]: I1205 18:57:51.815862 4756 generic.go:334] "Generic (PLEG): container finished" podID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerID="510fa4cfef0257254ce77b6d17ea85ec2cf3528b68e08b59008f4fa7dd196bc3" exitCode=0 Dec 05 18:57:51 crc kubenswrapper[4756]: I1205 18:57:51.815958 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f44k" event={"ID":"8dc1a51d-3d67-4287-ae12-f1241a2c45fa","Type":"ContainerDied","Data":"510fa4cfef0257254ce77b6d17ea85ec2cf3528b68e08b59008f4fa7dd196bc3"}
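The startup-probe sequence above for registry-server shows the kubelet's gating at work: the probe fails with `timeout: failed to connect service ":50051" within 1s` shortly after the container starts, flips to status="started" at 18:57:49, and only then does the readiness probe report ready. A minimal sketch of that gating, assuming a connect-style check with a 1s timeout; the period and failure threshold are illustrative, not values read from the catalog pod's spec, and this is not the kubelet's actual prober code.

```go
// Sketch of startup-probe gating: a TCP connect check against the gRPC
// port with a 1s timeout, retried each period until it succeeds, after
// which readiness probing takes over. Period/failureThreshold are assumed.
package main

import (
	"fmt"
	"net"
	"time"
)

func probeOnce(addr string, timeout time.Duration) bool {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		// Surfaces in the log as:
		//   timeout: failed to connect service ":50051" within 1s
		return false
	}
	conn.Close()
	return true
}

func main() {
	const (
		period           = 10 * time.Second // assumed periodSeconds
		timeout          = 1 * time.Second  // matches "within 1s" above
		failureThreshold = 6                // assumed failureThreshold
	)
	for failures := 0; failures < failureThreshold; failures++ {
		if probeOnce("127.0.0.1:50051", timeout) {
			fmt.Println(`startup probe: status="started"; readiness probes begin`)
			return
		}
		fmt.Println(`startup probe: status="unhealthy"`)
		time.Sleep(period)
	}
	fmt.Println("failureThreshold exceeded: kubelet would restart the container")
}
```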
Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.109948 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.289822 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bvqb\" (UniqueName: \"kubernetes.io/projected/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-kube-api-access-5bvqb\") pod \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.289943 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-catalog-content\") pod \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.291334 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-utilities\") pod \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\" (UID: \"8dc1a51d-3d67-4287-ae12-f1241a2c45fa\") " Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.292084 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-utilities" (OuterVolumeSpecName: "utilities") pod "8dc1a51d-3d67-4287-ae12-f1241a2c45fa" (UID: "8dc1a51d-3d67-4287-ae12-f1241a2c45fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.292758 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.297921 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-kube-api-access-5bvqb" (OuterVolumeSpecName: "kube-api-access-5bvqb") pod "8dc1a51d-3d67-4287-ae12-f1241a2c45fa" (UID: "8dc1a51d-3d67-4287-ae12-f1241a2c45fa"). InnerVolumeSpecName "kube-api-access-5bvqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.394609 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bvqb\" (UniqueName: \"kubernetes.io/projected/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-kube-api-access-5bvqb\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.409438 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8dc1a51d-3d67-4287-ae12-f1241a2c45fa" (UID: "8dc1a51d-3d67-4287-ae12-f1241a2c45fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.497469 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc1a51d-3d67-4287-ae12-f1241a2c45fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.828532 4756 generic.go:334] "Generic (PLEG): container finished" podID="dcacd5e8-09f7-447e-8469-6a1462c5916c" containerID="facaf1066236adaf716136dba37aa1a69149929bdeb82f7051b7eec612cebbc5" exitCode=0 Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.828611 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" event={"ID":"dcacd5e8-09f7-447e-8469-6a1462c5916c","Type":"ContainerDied","Data":"facaf1066236adaf716136dba37aa1a69149929bdeb82f7051b7eec612cebbc5"} Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.830881 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f44k" event={"ID":"8dc1a51d-3d67-4287-ae12-f1241a2c45fa","Type":"ContainerDied","Data":"3e0e29c21460770bc0356d0052a3cd93ccf4368a2e70d6bebed63f4ffa445d78"} Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.830936 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9f44k" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.830944 4756 scope.go:117] "RemoveContainer" containerID="510fa4cfef0257254ce77b6d17ea85ec2cf3528b68e08b59008f4fa7dd196bc3" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.863910 4756 scope.go:117] "RemoveContainer" containerID="8beb3fc3727a233704a314e682fb613d8602827c29d8c9676ef033d558a8d572" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.896858 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9f44k"] Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.903107 4756 scope.go:117] "RemoveContainer" containerID="1c2a1ea7744db8eaebb94fa9198124ecd81f78ed4e94f5b88dcaa3ae4f618518" Dec 05 18:57:52 crc kubenswrapper[4756]: I1205 18:57:52.907248 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9f44k"] Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.060644 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" path="/var/lib/kubelet/pods/8dc1a51d-3d67-4287-ae12-f1241a2c45fa/volumes" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.349001 4756 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.440146 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-0\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.440599 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-1\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.441114 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92s54\" (UniqueName: \"kubernetes.io/projected/dcacd5e8-09f7-447e-8469-6a1462c5916c-kube-api-access-92s54\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.441336 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-combined-ca-bundle\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.441503 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-ssh-key\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.441647 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-1\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.441841 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-0\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.441989 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cells-global-config-0\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.442194 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-inventory\") pod \"dcacd5e8-09f7-447e-8469-6a1462c5916c\" (UID: \"dcacd5e8-09f7-447e-8469-6a1462c5916c\") "
Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.447290 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcacd5e8-09f7-447e-8469-6a1462c5916c-kube-api-access-92s54" (OuterVolumeSpecName: "kube-api-access-92s54") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "kube-api-access-92s54". PluginName "kubernetes.io/projected", VolumeGidValue ""
"kubernetes.io/projected/dcacd5e8-09f7-447e-8469-6a1462c5916c-kube-api-access-92s54" (OuterVolumeSpecName: "kube-api-access-92s54") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "kube-api-access-92s54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.448633 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.476725 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.483450 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.485218 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.488626 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-inventory" (OuterVolumeSpecName: "inventory") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.499758 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.503548 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.512487 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "dcacd5e8-09f7-447e-8469-6a1462c5916c" (UID: "dcacd5e8-09f7-447e-8469-6a1462c5916c"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.545869 4756 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.545915 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.545932 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.545945 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.545960 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.545972 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92s54\" (UniqueName: \"kubernetes.io/projected/dcacd5e8-09f7-447e-8469-6a1462c5916c-kube-api-access-92s54\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.545984 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.545997 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.546010 4756 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dcacd5e8-09f7-447e-8469-6a1462c5916c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.859373 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" event={"ID":"dcacd5e8-09f7-447e-8469-6a1462c5916c","Type":"ContainerDied","Data":"fcd077eb3374221ec9473dd7a7cb001a8c1d32c5c2c38b3d387de26ff213de62"} Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.859497 4756 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fcd077eb3374221ec9473dd7a7cb001a8c1d32c5c2c38b3d387de26ff213de62" Dec 05 18:57:54 crc kubenswrapper[4756]: I1205 18:57:54.859588 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq" Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.443351 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tmcvf"] Dec 05 18:58:19 crc kubenswrapper[4756]: E1205 18:58:19.444434 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="extract-content" Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.444447 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="extract-content" Dec 05 18:58:19 crc kubenswrapper[4756]: E1205 18:58:19.444489 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcacd5e8-09f7-447e-8469-6a1462c5916c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.444496 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcacd5e8-09f7-447e-8469-6a1462c5916c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 05 18:58:19 crc kubenswrapper[4756]: E1205 18:58:19.444520 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="registry-server" Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.444526 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="registry-server" Dec 05 18:58:19 crc kubenswrapper[4756]: E1205 18:58:19.444540 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="extract-utilities" Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.444546 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="extract-utilities" Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.444781 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc1a51d-3d67-4287-ae12-f1241a2c45fa" containerName="registry-server" Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.444819 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcacd5e8-09f7-447e-8469-6a1462c5916c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.446684 4756 util.go:30] "No sandbox for pod can be found. 
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.460568 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmcvf"]
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.650710 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-catalog-content\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.650843 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh7jh\" (UniqueName: \"kubernetes.io/projected/16d00b6a-9cc6-442c-ba75-121ff4decb42-kube-api-access-wh7jh\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.650986 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-utilities\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.752530 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-utilities\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.752653 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-catalog-content\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.752777 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh7jh\" (UniqueName: \"kubernetes.io/projected/16d00b6a-9cc6-442c-ba75-121ff4decb42-kube-api-access-wh7jh\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.753888 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-utilities\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.754077 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-catalog-content\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:19 crc kubenswrapper[4756]: I1205 18:58:19.778887 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh7jh\" (UniqueName: \"kubernetes.io/projected/16d00b6a-9cc6-442c-ba75-121ff4decb42-kube-api-access-wh7jh\") pod \"redhat-marketplace-tmcvf\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:20 crc kubenswrapper[4756]: I1205 18:58:20.076141 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tmcvf"
Dec 05 18:58:20 crc kubenswrapper[4756]: I1205 18:58:20.574695 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmcvf"]
Dec 05 18:58:21 crc kubenswrapper[4756]: I1205 18:58:21.205568 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmcvf" event={"ID":"16d00b6a-9cc6-442c-ba75-121ff4decb42","Type":"ContainerStarted","Data":"582f15c11db56020de1f809b8c46c6fef10be2d1696e4aab688a7ca0710f986c"}
Dec 05 18:58:27 crc kubenswrapper[4756]: I1205 18:58:27.281522 4756 generic.go:334] "Generic (PLEG): container finished" podID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerID="3718ae0d8f433e70801d626ef546376d4d72fca966cabae452b34839136371e2" exitCode=0
Dec 05 18:58:27 crc kubenswrapper[4756]: I1205 18:58:27.282154 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmcvf" event={"ID":"16d00b6a-9cc6-442c-ba75-121ff4decb42","Type":"ContainerDied","Data":"3718ae0d8f433e70801d626ef546376d4d72fca966cabae452b34839136371e2"}
Dec 05 18:58:28 crc kubenswrapper[4756]: I1205 18:58:28.675056 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-99cc989cd-9j2nn" podUID="37480e17-7f37-4df4-9587-3933a1fc902b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Dec 05 18:58:29 crc kubenswrapper[4756]: I1205 18:58:29.306891 4756 generic.go:334] "Generic (PLEG): container finished" podID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerID="2ef4b6ac2f4d3b93a2fbded21145f6b09be9fc02bf852a9777e08134df09a547" exitCode=0
Dec 05 18:58:29 crc kubenswrapper[4756]: I1205 18:58:29.306986 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmcvf" event={"ID":"16d00b6a-9cc6-442c-ba75-121ff4decb42","Type":"ContainerDied","Data":"2ef4b6ac2f4d3b93a2fbded21145f6b09be9fc02bf852a9777e08134df09a547"}
Dec 05 18:58:32 crc kubenswrapper[4756]: I1205 18:58:32.345366 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmcvf" event={"ID":"16d00b6a-9cc6-442c-ba75-121ff4decb42","Type":"ContainerStarted","Data":"a15ce1bf69671fed98f2fb31a5697c08ee72519679e8d51d3a7c4340560ec42e"}
Dec 05 18:58:32 crc kubenswrapper[4756]: I1205 18:58:32.384717 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tmcvf" podStartSLOduration=9.883015978 podStartE2EDuration="13.384691776s" podCreationTimestamp="2025-12-05 18:58:19 +0000 UTC" firstStartedPulling="2025-12-05 18:58:27.286897811 +0000 UTC m=+9767.525571333" lastFinishedPulling="2025-12-05 18:58:30.788573609 +0000 UTC m=+9771.027247131" observedRunningTime="2025-12-05 18:58:32.3810849 +0000 UTC m=+9772.619758412" watchObservedRunningTime="2025-12-05 18:58:32.384691776 +0000 UTC m=+9772.623365308"
Dec 05 18:58:40 crc kubenswrapper[4756]: I1205 18:58:40.076602 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tmcvf"
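The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, 18:58:32.384691776 - 18:58:19 = 13.384691776s, and podStartSLOduration subtracts the image-pull window measured on the monotonic clock, 13.384691776 - (9771.027247131 - 9767.525571333) = 13.384691776 - 3.501675798 = 9.883015978s. The must-gather and crc-debug trackers later in this log satisfy the same identity.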
pod="openshift-marketplace/redhat-marketplace-tmcvf" Dec 05 18:58:40 crc kubenswrapper[4756]: I1205 18:58:40.077165 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tmcvf" Dec 05 18:58:40 crc kubenswrapper[4756]: I1205 18:58:40.800300 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tmcvf" Dec 05 18:58:40 crc kubenswrapper[4756]: I1205 18:58:40.860784 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tmcvf" Dec 05 18:58:41 crc kubenswrapper[4756]: I1205 18:58:41.048939 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmcvf"] Dec 05 18:58:42 crc kubenswrapper[4756]: I1205 18:58:42.458550 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tmcvf" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerName="registry-server" containerID="cri-o://a15ce1bf69671fed98f2fb31a5697c08ee72519679e8d51d3a7c4340560ec42e" gracePeriod=2 Dec 05 18:58:43 crc kubenswrapper[4756]: I1205 18:58:43.471833 4756 generic.go:334] "Generic (PLEG): container finished" podID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerID="a15ce1bf69671fed98f2fb31a5697c08ee72519679e8d51d3a7c4340560ec42e" exitCode=0 Dec 05 18:58:43 crc kubenswrapper[4756]: I1205 18:58:43.471871 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmcvf" event={"ID":"16d00b6a-9cc6-442c-ba75-121ff4decb42","Type":"ContainerDied","Data":"a15ce1bf69671fed98f2fb31a5697c08ee72519679e8d51d3a7c4340560ec42e"} Dec 05 18:58:43 crc kubenswrapper[4756]: I1205 18:58:43.905254 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tmcvf" Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.066774 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-catalog-content\") pod \"16d00b6a-9cc6-442c-ba75-121ff4decb42\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.066851 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh7jh\" (UniqueName: \"kubernetes.io/projected/16d00b6a-9cc6-442c-ba75-121ff4decb42-kube-api-access-wh7jh\") pod \"16d00b6a-9cc6-442c-ba75-121ff4decb42\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.066912 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-utilities\") pod \"16d00b6a-9cc6-442c-ba75-121ff4decb42\" (UID: \"16d00b6a-9cc6-442c-ba75-121ff4decb42\") " Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.069523 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-utilities" (OuterVolumeSpecName: "utilities") pod "16d00b6a-9cc6-442c-ba75-121ff4decb42" (UID: "16d00b6a-9cc6-442c-ba75-121ff4decb42"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.083115 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d00b6a-9cc6-442c-ba75-121ff4decb42-kube-api-access-wh7jh" (OuterVolumeSpecName: "kube-api-access-wh7jh") pod "16d00b6a-9cc6-442c-ba75-121ff4decb42" (UID: "16d00b6a-9cc6-442c-ba75-121ff4decb42"). InnerVolumeSpecName "kube-api-access-wh7jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.100139 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16d00b6a-9cc6-442c-ba75-121ff4decb42" (UID: "16d00b6a-9cc6-442c-ba75-121ff4decb42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.171455 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.171496 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh7jh\" (UniqueName: \"kubernetes.io/projected/16d00b6a-9cc6-442c-ba75-121ff4decb42-kube-api-access-wh7jh\") on node \"crc\" DevicePath \"\"" Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.171512 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d00b6a-9cc6-442c-ba75-121ff4decb42-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.487502 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmcvf" event={"ID":"16d00b6a-9cc6-442c-ba75-121ff4decb42","Type":"ContainerDied","Data":"582f15c11db56020de1f809b8c46c6fef10be2d1696e4aab688a7ca0710f986c"} Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.487580 4756 scope.go:117] "RemoveContainer" containerID="a15ce1bf69671fed98f2fb31a5697c08ee72519679e8d51d3a7c4340560ec42e" Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.487617 4756 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.523548 4756 scope.go:117] "RemoveContainer" containerID="2ef4b6ac2f4d3b93a2fbded21145f6b09be9fc02bf852a9777e08134df09a547"
Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.541136 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmcvf"]
Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.551978 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmcvf"]
Dec 05 18:58:44 crc kubenswrapper[4756]: I1205 18:58:44.562251 4756 scope.go:117] "RemoveContainer" containerID="3718ae0d8f433e70801d626ef546376d4d72fca966cabae452b34839136371e2"
Dec 05 18:58:46 crc kubenswrapper[4756]: I1205 18:58:46.075806 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" path="/var/lib/kubelet/pods/16d00b6a-9cc6-442c-ba75-121ff4decb42/volumes"
Dec 05 18:58:52 crc kubenswrapper[4756]: I1205 18:58:52.299202 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:58:52 crc kubenswrapper[4756]: I1205 18:58:52.299901 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:59:22 crc kubenswrapper[4756]: I1205 18:59:22.295171 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:59:22 crc kubenswrapper[4756]: I1205 18:59:22.295878 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.678657 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zfcqk/must-gather-qk6sq"]
Dec 05 18:59:45 crc kubenswrapper[4756]: E1205 18:59:45.679668 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerName="extract-content"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.679680 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerName="extract-content"
Dec 05 18:59:45 crc kubenswrapper[4756]: E1205 18:59:45.679695 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerName="extract-utilities"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.679702 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerName="extract-utilities"
Dec 05 18:59:45 crc kubenswrapper[4756]: E1205 18:59:45.679722 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerName="registry-server"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.679729 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerName="registry-server"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.679936 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d00b6a-9cc6-442c-ba75-121ff4decb42" containerName="registry-server"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.681138 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/must-gather-qk6sq"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.687390 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zfcqk"/"openshift-service-ca.crt"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.687485 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zfcqk/must-gather-qk6sq"]
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.687632 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zfcqk"/"default-dockercfg-nw2mv"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.687748 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zfcqk"/"kube-root-ca.crt"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.788444 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe24f4b8-5eb6-4027-9445-6c57b390876e-must-gather-output\") pod \"must-gather-qk6sq\" (UID: \"fe24f4b8-5eb6-4027-9445-6c57b390876e\") " pod="openshift-must-gather-zfcqk/must-gather-qk6sq"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.788613 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rd6h\" (UniqueName: \"kubernetes.io/projected/fe24f4b8-5eb6-4027-9445-6c57b390876e-kube-api-access-2rd6h\") pod \"must-gather-qk6sq\" (UID: \"fe24f4b8-5eb6-4027-9445-6c57b390876e\") " pod="openshift-must-gather-zfcqk/must-gather-qk6sq"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.890126 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rd6h\" (UniqueName: \"kubernetes.io/projected/fe24f4b8-5eb6-4027-9445-6c57b390876e-kube-api-access-2rd6h\") pod \"must-gather-qk6sq\" (UID: \"fe24f4b8-5eb6-4027-9445-6c57b390876e\") " pod="openshift-must-gather-zfcqk/must-gather-qk6sq"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.890288 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe24f4b8-5eb6-4027-9445-6c57b390876e-must-gather-output\") pod \"must-gather-qk6sq\" (UID: \"fe24f4b8-5eb6-4027-9445-6c57b390876e\") " pod="openshift-must-gather-zfcqk/must-gather-qk6sq"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.890682 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe24f4b8-5eb6-4027-9445-6c57b390876e-must-gather-output\") pod \"must-gather-qk6sq\" (UID: \"fe24f4b8-5eb6-4027-9445-6c57b390876e\") " pod="openshift-must-gather-zfcqk/must-gather-qk6sq"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.922078 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rd6h\" (UniqueName: \"kubernetes.io/projected/fe24f4b8-5eb6-4027-9445-6c57b390876e-kube-api-access-2rd6h\") pod \"must-gather-qk6sq\" (UID: \"fe24f4b8-5eb6-4027-9445-6c57b390876e\") " pod="openshift-must-gather-zfcqk/must-gather-qk6sq"
Dec 05 18:59:45 crc kubenswrapper[4756]: I1205 18:59:45.998585 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/must-gather-qk6sq"
Dec 05 18:59:46 crc kubenswrapper[4756]: I1205 18:59:46.535863 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zfcqk/must-gather-qk6sq"]
Dec 05 18:59:46 crc kubenswrapper[4756]: W1205 18:59:46.542030 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe24f4b8_5eb6_4027_9445_6c57b390876e.slice/crio-a866465798f589ff0df3819431ae501550b05418f447e19bc840bd411735f2a2 WatchSource:0}: Error finding container a866465798f589ff0df3819431ae501550b05418f447e19bc840bd411735f2a2: Status 404 returned error can't find the container with id a866465798f589ff0df3819431ae501550b05418f447e19bc840bd411735f2a2
Dec 05 18:59:47 crc kubenswrapper[4756]: I1205 18:59:47.180534 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/must-gather-qk6sq" event={"ID":"fe24f4b8-5eb6-4027-9445-6c57b390876e","Type":"ContainerStarted","Data":"a866465798f589ff0df3819431ae501550b05418f447e19bc840bd411735f2a2"}
Dec 05 18:59:52 crc kubenswrapper[4756]: I1205 18:59:52.296248 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 18:59:52 crc kubenswrapper[4756]: I1205 18:59:52.296824 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 18:59:52 crc kubenswrapper[4756]: I1205 18:59:52.296884 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn"
Dec 05 18:59:52 crc kubenswrapper[4756]: I1205 18:59:52.298025 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3b487db91a35b43b2ad7f9ec84f421006fecc871f530c2ac8a41abd36649097e"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 18:59:52 crc kubenswrapper[4756]: I1205 18:59:52.298115 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://3b487db91a35b43b2ad7f9ec84f421006fecc871f530c2ac8a41abd36649097e" gracePeriod=600
Dec 05 18:59:53 crc kubenswrapper[4756]: I1205 18:59:53.249114 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/must-gather-qk6sq" event={"ID":"fe24f4b8-5eb6-4027-9445-6c57b390876e","Type":"ContainerStarted","Data":"5f7b75922371ffef2dc56a71d59daa7bacbc06c86a493d49a422f23b0c2db981"}
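The machine-config-daemon entries above complete a textbook liveness-restart path: probes against http://127.0.0.1:8798/health fail with "connection refused" (nothing listening) at 18:58:52, 18:59:22 and 18:59:52, consistent with a 30s probe period and a failure threshold of three, after which kubelet logs "will be restarted" and kills the container with gracePeriod=600. For reference, an HTTP liveness endpoint only needs to answer with a 2xx status; a minimal sketch, with the port taken from the log and everything else assumed:

package main

import (
	"fmt"
	"net/http"
)

func main() {
	// A kubelet HTTP liveness probe passes on any 2xx response. The
	// "connection refused" in the log means no listener was bound to
	// 127.0.0.1:8798 at probe time.
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "ok")
	})
	if err := http.ListenAndServe("127.0.0.1:8798", nil); err != nil {
		panic(err)
	}
}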
event={"ID":"fe24f4b8-5eb6-4027-9445-6c57b390876e","Type":"ContainerStarted","Data":"5f7b75922371ffef2dc56a71d59daa7bacbc06c86a493d49a422f23b0c2db981"} Dec 05 18:59:53 crc kubenswrapper[4756]: I1205 18:59:53.249804 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/must-gather-qk6sq" event={"ID":"fe24f4b8-5eb6-4027-9445-6c57b390876e","Type":"ContainerStarted","Data":"f79cec67d91fe39767030ff5e224f289188866c4bbdb07e0444862c45c5c3d89"} Dec 05 18:59:53 crc kubenswrapper[4756]: I1205 18:59:53.251580 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="3b487db91a35b43b2ad7f9ec84f421006fecc871f530c2ac8a41abd36649097e" exitCode=0 Dec 05 18:59:53 crc kubenswrapper[4756]: I1205 18:59:53.251629 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"3b487db91a35b43b2ad7f9ec84f421006fecc871f530c2ac8a41abd36649097e"} Dec 05 18:59:53 crc kubenswrapper[4756]: I1205 18:59:53.251659 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb"} Dec 05 18:59:53 crc kubenswrapper[4756]: I1205 18:59:53.251680 4756 scope.go:117] "RemoveContainer" containerID="8e95bdb7bce22103d5caf5b042590a345f34c5af9bac46f008073236849ec31f" Dec 05 18:59:53 crc kubenswrapper[4756]: I1205 18:59:53.272791 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zfcqk/must-gather-qk6sq" podStartSLOduration=2.527273207 podStartE2EDuration="8.272772837s" podCreationTimestamp="2025-12-05 18:59:45 +0000 UTC" firstStartedPulling="2025-12-05 18:59:46.555475902 +0000 UTC m=+9846.794149414" lastFinishedPulling="2025-12-05 18:59:52.300975522 +0000 UTC m=+9852.539649044" observedRunningTime="2025-12-05 18:59:53.268119235 +0000 UTC m=+9853.506792747" watchObservedRunningTime="2025-12-05 18:59:53.272772837 +0000 UTC m=+9853.511446339" Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.458455 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-zqqlv"] Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.461798 4756 util.go:30] "No sandbox for pod can be found. 
Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.581878 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/649b6b56-a315-4cc7-9759-890c0a6838c4-host\") pod \"crc-debug-zqqlv\" (UID: \"649b6b56-a315-4cc7-9759-890c0a6838c4\") " pod="openshift-must-gather-zfcqk/crc-debug-zqqlv"
Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.582102 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7jjb\" (UniqueName: \"kubernetes.io/projected/649b6b56-a315-4cc7-9759-890c0a6838c4-kube-api-access-z7jjb\") pod \"crc-debug-zqqlv\" (UID: \"649b6b56-a315-4cc7-9759-890c0a6838c4\") " pod="openshift-must-gather-zfcqk/crc-debug-zqqlv"
Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.684174 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/649b6b56-a315-4cc7-9759-890c0a6838c4-host\") pod \"crc-debug-zqqlv\" (UID: \"649b6b56-a315-4cc7-9759-890c0a6838c4\") " pod="openshift-must-gather-zfcqk/crc-debug-zqqlv"
Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.684324 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/649b6b56-a315-4cc7-9759-890c0a6838c4-host\") pod \"crc-debug-zqqlv\" (UID: \"649b6b56-a315-4cc7-9759-890c0a6838c4\") " pod="openshift-must-gather-zfcqk/crc-debug-zqqlv"
Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.684527 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7jjb\" (UniqueName: \"kubernetes.io/projected/649b6b56-a315-4cc7-9759-890c0a6838c4-kube-api-access-z7jjb\") pod \"crc-debug-zqqlv\" (UID: \"649b6b56-a315-4cc7-9759-890c0a6838c4\") " pod="openshift-must-gather-zfcqk/crc-debug-zqqlv"
Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.713244 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7jjb\" (UniqueName: \"kubernetes.io/projected/649b6b56-a315-4cc7-9759-890c0a6838c4-kube-api-access-z7jjb\") pod \"crc-debug-zqqlv\" (UID: \"649b6b56-a315-4cc7-9759-890c0a6838c4\") " pod="openshift-must-gather-zfcqk/crc-debug-zqqlv"
Dec 05 18:59:56 crc kubenswrapper[4756]: I1205 18:59:56.785040 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv"
Dec 05 18:59:57 crc kubenswrapper[4756]: I1205 18:59:57.342172 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv" event={"ID":"649b6b56-a315-4cc7-9759-890c0a6838c4","Type":"ContainerStarted","Data":"527988abc66180d0b7e01afbc707ab9da33b7a67efd17b1b5196b66f8d526e9b"}
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.149308 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"]
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.151462 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
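crc-debug-zqqlv mounts exactly two volumes: the node filesystem through a hostPath volume named "host" and the API-server-injected projected token kube-api-access-z7jjb. Expressed with client-go types, the declared part of that volume section corresponds roughly to the sketch below; the "/" path and "/host" mount point are assumptions based on how debug pods are commonly built, not values read from the log.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// The "kube-api-access-*" projected token volume is injected by the API
	// server, so a debug pod spec only has to declare the hostPath mount.
	vol := corev1.Volume{
		Name: "host",
		VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/"},
		},
	}
	mnt := corev1.VolumeMount{Name: "host", MountPath: "/host"}
	fmt.Printf("volume=%+v mount=%+v\n", vol, mnt)
}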
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.153078 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.154282 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.165858 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"]
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.200229 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c86db5e-d92f-402c-a7e0-05c32cf87059-secret-volume\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.200353 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c86db5e-d92f-402c-a7e0-05c32cf87059-config-volume\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.200474 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ggbh\" (UniqueName: \"kubernetes.io/projected/2c86db5e-d92f-402c-a7e0-05c32cf87059-kube-api-access-4ggbh\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.302282 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c86db5e-d92f-402c-a7e0-05c32cf87059-secret-volume\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.302405 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c86db5e-d92f-402c-a7e0-05c32cf87059-config-volume\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.302522 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ggbh\" (UniqueName: \"kubernetes.io/projected/2c86db5e-d92f-402c-a7e0-05c32cf87059-kube-api-access-4ggbh\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.303353 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c86db5e-d92f-402c-a7e0-05c32cf87059-config-volume\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
\"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x" Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.327347 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ggbh\" (UniqueName: \"kubernetes.io/projected/2c86db5e-d92f-402c-a7e0-05c32cf87059-kube-api-access-4ggbh\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x" Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.328774 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c86db5e-d92f-402c-a7e0-05c32cf87059-secret-volume\") pod \"collect-profiles-29416020-pxf5x\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x" Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.471090 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x" Dec 05 19:00:00 crc kubenswrapper[4756]: I1205 19:00:00.961394 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"] Dec 05 19:00:00 crc kubenswrapper[4756]: W1205 19:00:00.969307 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c86db5e_d92f_402c_a7e0_05c32cf87059.slice/crio-cb71a97639b9939e0b4a711988874071ff7a988798330dae7a776b8570b1538c WatchSource:0}: Error finding container cb71a97639b9939e0b4a711988874071ff7a988798330dae7a776b8570b1538c: Status 404 returned error can't find the container with id cb71a97639b9939e0b4a711988874071ff7a988798330dae7a776b8570b1538c Dec 05 19:00:01 crc kubenswrapper[4756]: I1205 19:00:01.452042 4756 generic.go:334] "Generic (PLEG): container finished" podID="2c86db5e-d92f-402c-a7e0-05c32cf87059" containerID="b393e38faf25c14737a562c58420ae7946d6c4011b9de61d321368054644057e" exitCode=0 Dec 05 19:00:01 crc kubenswrapper[4756]: I1205 19:00:01.452145 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x" event={"ID":"2c86db5e-d92f-402c-a7e0-05c32cf87059","Type":"ContainerDied","Data":"b393e38faf25c14737a562c58420ae7946d6c4011b9de61d321368054644057e"} Dec 05 19:00:01 crc kubenswrapper[4756]: I1205 19:00:01.452274 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x" event={"ID":"2c86db5e-d92f-402c-a7e0-05c32cf87059","Type":"ContainerStarted","Data":"cb71a97639b9939e0b4a711988874071ff7a988798330dae7a776b8570b1538c"} Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.288299 4756 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.381186 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ggbh\" (UniqueName: \"kubernetes.io/projected/2c86db5e-d92f-402c-a7e0-05c32cf87059-kube-api-access-4ggbh\") pod \"2c86db5e-d92f-402c-a7e0-05c32cf87059\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") "
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.381264 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c86db5e-d92f-402c-a7e0-05c32cf87059-secret-volume\") pod \"2c86db5e-d92f-402c-a7e0-05c32cf87059\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") "
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.381288 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c86db5e-d92f-402c-a7e0-05c32cf87059-config-volume\") pod \"2c86db5e-d92f-402c-a7e0-05c32cf87059\" (UID: \"2c86db5e-d92f-402c-a7e0-05c32cf87059\") "
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.382739 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c86db5e-d92f-402c-a7e0-05c32cf87059-config-volume" (OuterVolumeSpecName: "config-volume") pod "2c86db5e-d92f-402c-a7e0-05c32cf87059" (UID: "2c86db5e-d92f-402c-a7e0-05c32cf87059"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.403561 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c86db5e-d92f-402c-a7e0-05c32cf87059-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2c86db5e-d92f-402c-a7e0-05c32cf87059" (UID: "2c86db5e-d92f-402c-a7e0-05c32cf87059"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.417047 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c86db5e-d92f-402c-a7e0-05c32cf87059-kube-api-access-4ggbh" (OuterVolumeSpecName: "kube-api-access-4ggbh") pod "2c86db5e-d92f-402c-a7e0-05c32cf87059" (UID: "2c86db5e-d92f-402c-a7e0-05c32cf87059"). InnerVolumeSpecName "kube-api-access-4ggbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.472774 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x" event={"ID":"2c86db5e-d92f-402c-a7e0-05c32cf87059","Type":"ContainerDied","Data":"cb71a97639b9939e0b4a711988874071ff7a988798330dae7a776b8570b1538c"}
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.472811 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb71a97639b9939e0b4a711988874071ff7a988798330dae7a776b8570b1538c"
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.472814 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416020-pxf5x"
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.484082 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ggbh\" (UniqueName: \"kubernetes.io/projected/2c86db5e-d92f-402c-a7e0-05c32cf87059-kube-api-access-4ggbh\") on node \"crc\" DevicePath \"\""
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.484126 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c86db5e-d92f-402c-a7e0-05c32cf87059-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 19:00:03 crc kubenswrapper[4756]: I1205 19:00:03.484141 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c86db5e-d92f-402c-a7e0-05c32cf87059-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 19:00:04 crc kubenswrapper[4756]: I1205 19:00:04.370233 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"]
Dec 05 19:00:04 crc kubenswrapper[4756]: I1205 19:00:04.381823 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415975-lng7z"]
Dec 05 19:00:06 crc kubenswrapper[4756]: I1205 19:00:06.062497 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f7b0c7b-38ad-488b-9561-7c44304160c2" path="/var/lib/kubelet/pods/5f7b0c7b-38ad-488b-9561-7c44304160c2/volumes"
Dec 05 19:00:11 crc kubenswrapper[4756]: I1205 19:00:11.547328 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv" event={"ID":"649b6b56-a315-4cc7-9759-890c0a6838c4","Type":"ContainerStarted","Data":"890fbd4c2065cf82ed5270cb81cc69b9f03a35dd273827399f893504ea415f1f"}
Dec 05 19:00:11 crc kubenswrapper[4756]: I1205 19:00:11.571854 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv" podStartSLOduration=1.6454005760000001 podStartE2EDuration="15.571834223s" podCreationTimestamp="2025-12-05 18:59:56 +0000 UTC" firstStartedPulling="2025-12-05 18:59:56.825761489 +0000 UTC m=+9857.064435021" lastFinishedPulling="2025-12-05 19:00:10.752195156 +0000 UTC m=+9870.990868668" observedRunningTime="2025-12-05 19:00:11.560643481 +0000 UTC m=+9871.799317013" watchObservedRunningTime="2025-12-05 19:00:11.571834223 +0000 UTC m=+9871.810507755"
Dec 05 19:00:20 crc kubenswrapper[4756]: I1205 19:00:20.663663 4756 scope.go:117] "RemoveContainer" containerID="20d997a5255c2ad47b02407528d6e6a7c5eeac8f8d564a6d13604cc590dd6f25"
Dec 05 19:00:53 crc kubenswrapper[4756]: E1205 19:00:53.510172 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod649b6b56_a315_4cc7_9759_890c0a6838c4.slice/crio-conmon-890fbd4c2065cf82ed5270cb81cc69b9f03a35dd273827399f893504ea415f1f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod649b6b56_a315_4cc7_9759_890c0a6838c4.slice/crio-890fbd4c2065cf82ed5270cb81cc69b9f03a35dd273827399f893504ea415f1f.scope\": RecentStats: unable to find data in memory cache]"
Dec 05 19:00:54 crc kubenswrapper[4756]: I1205 19:00:54.032057 4756 generic.go:334] "Generic (PLEG): container finished" podID="649b6b56-a315-4cc7-9759-890c0a6838c4" containerID="890fbd4c2065cf82ed5270cb81cc69b9f03a35dd273827399f893504ea415f1f" exitCode=0
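The cadvisor_stats_provider "partial failures" at 19:00:53 name the conmon and container scopes of the crc-debug container that exits immediately afterwards; stats for a just-terminated container can drop out of cadvisor's in-memory cache before the next housekeeping pass, so this error is typically transient. Note that the slice name embeds the pod UID with its dashes flattened to underscores; a tiny illustrative converter:

package main

import (
	"fmt"
	"regexp"
	"strings"
)

// kubepods slice names embed the pod UID with '-' flattened to '_':
//   kubepods-besteffort-pod649b6b56_a315_4cc7_9759_890c0a6838c4.slice
var sliceRe = regexp.MustCompile(`pod([0-9a-f_]{36})\.slice`)

func podUIDFromCgroup(path string) (string, bool) {
	m := sliceRe.FindStringSubmatch(path)
	if m == nil {
		return "", false
	}
	return strings.ReplaceAll(m[1], "_", "-"), true
}

func main() {
	p := "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod649b6b56_a315_4cc7_9759_890c0a6838c4.slice/crio-890fbd4c2065cf82ed5270cb81cc69b9f03a35dd273827399f893504ea415f1f.scope"
	if uid, ok := podUIDFromCgroup(p); ok {
		fmt.Println(uid) // 649b6b56-a315-4cc7-9759-890c0a6838c4
	}
}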
containerID="890fbd4c2065cf82ed5270cb81cc69b9f03a35dd273827399f893504ea415f1f" exitCode=0 Dec 05 19:00:54 crc kubenswrapper[4756]: I1205 19:00:54.032186 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv" event={"ID":"649b6b56-a315-4cc7-9759-890c0a6838c4","Type":"ContainerDied","Data":"890fbd4c2065cf82ed5270cb81cc69b9f03a35dd273827399f893504ea415f1f"} Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.200835 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv" Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.238333 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-zqqlv"] Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.247784 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-zqqlv"] Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.350640 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7jjb\" (UniqueName: \"kubernetes.io/projected/649b6b56-a315-4cc7-9759-890c0a6838c4-kube-api-access-z7jjb\") pod \"649b6b56-a315-4cc7-9759-890c0a6838c4\" (UID: \"649b6b56-a315-4cc7-9759-890c0a6838c4\") " Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.350773 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/649b6b56-a315-4cc7-9759-890c0a6838c4-host\") pod \"649b6b56-a315-4cc7-9759-890c0a6838c4\" (UID: \"649b6b56-a315-4cc7-9759-890c0a6838c4\") " Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.351002 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/649b6b56-a315-4cc7-9759-890c0a6838c4-host" (OuterVolumeSpecName: "host") pod "649b6b56-a315-4cc7-9759-890c0a6838c4" (UID: "649b6b56-a315-4cc7-9759-890c0a6838c4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.351317 4756 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/649b6b56-a315-4cc7-9759-890c0a6838c4-host\") on node \"crc\" DevicePath \"\"" Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.357390 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649b6b56-a315-4cc7-9759-890c0a6838c4-kube-api-access-z7jjb" (OuterVolumeSpecName: "kube-api-access-z7jjb") pod "649b6b56-a315-4cc7-9759-890c0a6838c4" (UID: "649b6b56-a315-4cc7-9759-890c0a6838c4"). InnerVolumeSpecName "kube-api-access-z7jjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:00:55 crc kubenswrapper[4756]: I1205 19:00:55.454246 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7jjb\" (UniqueName: \"kubernetes.io/projected/649b6b56-a315-4cc7-9759-890c0a6838c4-kube-api-access-z7jjb\") on node \"crc\" DevicePath \"\"" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.058906 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.063122 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649b6b56-a315-4cc7-9759-890c0a6838c4" path="/var/lib/kubelet/pods/649b6b56-a315-4cc7-9759-890c0a6838c4/volumes" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.063810 4756 scope.go:117] "RemoveContainer" containerID="890fbd4c2065cf82ed5270cb81cc69b9f03a35dd273827399f893504ea415f1f" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.456101 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-vfgj5"] Dec 05 19:00:56 crc kubenswrapper[4756]: E1205 19:00:56.456532 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649b6b56-a315-4cc7-9759-890c0a6838c4" containerName="container-00" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.456543 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="649b6b56-a315-4cc7-9759-890c0a6838c4" containerName="container-00" Dec 05 19:00:56 crc kubenswrapper[4756]: E1205 19:00:56.456568 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c86db5e-d92f-402c-a7e0-05c32cf87059" containerName="collect-profiles" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.456574 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c86db5e-d92f-402c-a7e0-05c32cf87059" containerName="collect-profiles" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.456790 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c86db5e-d92f-402c-a7e0-05c32cf87059" containerName="collect-profiles" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.456810 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="649b6b56-a315-4cc7-9759-890c0a6838c4" containerName="container-00" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.457520 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.575767 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc870694-18c1-455a-bdde-d1393acfd433-host\") pod \"crc-debug-vfgj5\" (UID: \"fc870694-18c1-455a-bdde-d1393acfd433\") " pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.576049 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4jr\" (UniqueName: \"kubernetes.io/projected/fc870694-18c1-455a-bdde-d1393acfd433-kube-api-access-vd4jr\") pod \"crc-debug-vfgj5\" (UID: \"fc870694-18c1-455a-bdde-d1393acfd433\") " pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.679087 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4jr\" (UniqueName: \"kubernetes.io/projected/fc870694-18c1-455a-bdde-d1393acfd433-kube-api-access-vd4jr\") pod \"crc-debug-vfgj5\" (UID: \"fc870694-18c1-455a-bdde-d1393acfd433\") " pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.679858 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc870694-18c1-455a-bdde-d1393acfd433-host\") pod \"crc-debug-vfgj5\" (UID: \"fc870694-18c1-455a-bdde-d1393acfd433\") " pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.680033 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc870694-18c1-455a-bdde-d1393acfd433-host\") pod \"crc-debug-vfgj5\" (UID: \"fc870694-18c1-455a-bdde-d1393acfd433\") " pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.701048 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4jr\" (UniqueName: \"kubernetes.io/projected/fc870694-18c1-455a-bdde-d1393acfd433-kube-api-access-vd4jr\") pod \"crc-debug-vfgj5\" (UID: \"fc870694-18c1-455a-bdde-d1393acfd433\") " pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:56 crc kubenswrapper[4756]: I1205 19:00:56.786404 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:57 crc kubenswrapper[4756]: I1205 19:00:57.069079 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" event={"ID":"fc870694-18c1-455a-bdde-d1393acfd433","Type":"ContainerStarted","Data":"efc3f14b2e8cb670006d64ba4d10b25b335b585e94492cad350f5f3387fe8ced"} Dec 05 19:00:57 crc kubenswrapper[4756]: I1205 19:00:57.069447 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" event={"ID":"fc870694-18c1-455a-bdde-d1393acfd433","Type":"ContainerStarted","Data":"e84f3b283cff0d9104a3381e0706366887af2ce89f44908efaba5aebabf02133"} Dec 05 19:00:57 crc kubenswrapper[4756]: I1205 19:00:57.095449 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" podStartSLOduration=1.095385128 podStartE2EDuration="1.095385128s" podCreationTimestamp="2025-12-05 19:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 19:00:57.091835495 +0000 UTC m=+9917.330509017" watchObservedRunningTime="2025-12-05 19:00:57.095385128 +0000 UTC m=+9917.334058670" Dec 05 19:00:58 crc kubenswrapper[4756]: I1205 19:00:58.091695 4756 generic.go:334] "Generic (PLEG): container finished" podID="fc870694-18c1-455a-bdde-d1393acfd433" containerID="efc3f14b2e8cb670006d64ba4d10b25b335b585e94492cad350f5f3387fe8ced" exitCode=0 Dec 05 19:00:58 crc kubenswrapper[4756]: I1205 19:00:58.091744 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" event={"ID":"fc870694-18c1-455a-bdde-d1393acfd433","Type":"ContainerDied","Data":"efc3f14b2e8cb670006d64ba4d10b25b335b585e94492cad350f5f3387fe8ced"} Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.222460 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.281478 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-vfgj5"] Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.303887 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-vfgj5"] Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.341262 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc870694-18c1-455a-bdde-d1393acfd433-host\") pod \"fc870694-18c1-455a-bdde-d1393acfd433\" (UID: \"fc870694-18c1-455a-bdde-d1393acfd433\") " Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.341344 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4jr\" (UniqueName: \"kubernetes.io/projected/fc870694-18c1-455a-bdde-d1393acfd433-kube-api-access-vd4jr\") pod \"fc870694-18c1-455a-bdde-d1393acfd433\" (UID: \"fc870694-18c1-455a-bdde-d1393acfd433\") " Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.342583 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc870694-18c1-455a-bdde-d1393acfd433-host" (OuterVolumeSpecName: "host") pod "fc870694-18c1-455a-bdde-d1393acfd433" (UID: "fc870694-18c1-455a-bdde-d1393acfd433"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.371634 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc870694-18c1-455a-bdde-d1393acfd433-kube-api-access-vd4jr" (OuterVolumeSpecName: "kube-api-access-vd4jr") pod "fc870694-18c1-455a-bdde-d1393acfd433" (UID: "fc870694-18c1-455a-bdde-d1393acfd433"). InnerVolumeSpecName "kube-api-access-vd4jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.443916 4756 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc870694-18c1-455a-bdde-d1393acfd433-host\") on node \"crc\" DevicePath \"\"" Dec 05 19:00:59 crc kubenswrapper[4756]: I1205 19:00:59.443949 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4jr\" (UniqueName: \"kubernetes.io/projected/fc870694-18c1-455a-bdde-d1393acfd433-kube-api-access-vd4jr\") on node \"crc\" DevicePath \"\"" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.064077 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc870694-18c1-455a-bdde-d1393acfd433" path="/var/lib/kubelet/pods/fc870694-18c1-455a-bdde-d1393acfd433/volumes" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.113186 4756 scope.go:117] "RemoveContainer" containerID="efc3f14b2e8cb670006d64ba4d10b25b335b585e94492cad350f5f3387fe8ced" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.113405 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-vfgj5" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.169704 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416021-s4hr4"] Dec 05 19:01:00 crc kubenswrapper[4756]: E1205 19:01:00.170147 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc870694-18c1-455a-bdde-d1393acfd433" containerName="container-00" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.170163 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc870694-18c1-455a-bdde-d1393acfd433" containerName="container-00" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.170464 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc870694-18c1-455a-bdde-d1393acfd433" containerName="container-00" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.171251 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.181256 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416021-s4hr4"] Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.259982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-combined-ca-bundle\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.261895 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-fernet-keys\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.262116 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-config-data\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.262278 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bqvz\" (UniqueName: \"kubernetes.io/projected/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-kube-api-access-5bqvz\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.364191 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-combined-ca-bundle\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.364237 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-fernet-keys\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.364283 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-config-data\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.364310 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bqvz\" (UniqueName: \"kubernetes.io/projected/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-kube-api-access-5bqvz\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.370334 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-config-data\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.370574 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-fernet-keys\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.376564 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-combined-ca-bundle\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.382649 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bqvz\" (UniqueName: \"kubernetes.io/projected/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-kube-api-access-5bqvz\") pod \"keystone-cron-29416021-s4hr4\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.500624 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.582255 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-vcmvf"] Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.586335 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.671116 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsrst\" (UniqueName: \"kubernetes.io/projected/fd61acc5-1866-4e92-b030-335f67f8faa5-kube-api-access-lsrst\") pod \"crc-debug-vcmvf\" (UID: \"fd61acc5-1866-4e92-b030-335f67f8faa5\") " pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.671177 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd61acc5-1866-4e92-b030-335f67f8faa5-host\") pod \"crc-debug-vcmvf\" (UID: \"fd61acc5-1866-4e92-b030-335f67f8faa5\") " pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.773633 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsrst\" (UniqueName: \"kubernetes.io/projected/fd61acc5-1866-4e92-b030-335f67f8faa5-kube-api-access-lsrst\") pod \"crc-debug-vcmvf\" (UID: \"fd61acc5-1866-4e92-b030-335f67f8faa5\") " pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.773998 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd61acc5-1866-4e92-b030-335f67f8faa5-host\") pod \"crc-debug-vcmvf\" (UID: \"fd61acc5-1866-4e92-b030-335f67f8faa5\") " pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.774115 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd61acc5-1866-4e92-b030-335f67f8faa5-host\") pod \"crc-debug-vcmvf\" (UID: \"fd61acc5-1866-4e92-b030-335f67f8faa5\") " pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:00 crc kubenswrapper[4756]: I1205 19:01:00.963002 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416021-s4hr4"] Dec 05 19:01:01 crc kubenswrapper[4756]: I1205 19:01:01.730077 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsrst\" (UniqueName: \"kubernetes.io/projected/fd61acc5-1866-4e92-b030-335f67f8faa5-kube-api-access-lsrst\") pod \"crc-debug-vcmvf\" (UID: \"fd61acc5-1866-4e92-b030-335f67f8faa5\") " pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:01 crc kubenswrapper[4756]: I1205 19:01:01.837019 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:01 crc kubenswrapper[4756]: W1205 19:01:01.885035 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd61acc5_1866_4e92_b030_335f67f8faa5.slice/crio-922bd555fc199701faa4e61b4ee5e430bdceb3ddecbec3419cea63523f209010 WatchSource:0}: Error finding container 922bd555fc199701faa4e61b4ee5e430bdceb3ddecbec3419cea63523f209010: Status 404 returned error can't find the container with id 922bd555fc199701faa4e61b4ee5e430bdceb3ddecbec3419cea63523f209010 Dec 05 19:01:02 crc kubenswrapper[4756]: I1205 19:01:02.141587 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416021-s4hr4" event={"ID":"ecde8bb8-34a3-4765-9bf2-0522fd81c49d","Type":"ContainerStarted","Data":"869070b983c11d78543a6fc9dd40ad4a826dad26e2c17ff6290a6f5b3633b1b1"} Dec 05 19:01:02 crc kubenswrapper[4756]: I1205 19:01:02.143223 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" event={"ID":"fd61acc5-1866-4e92-b030-335f67f8faa5","Type":"ContainerStarted","Data":"922bd555fc199701faa4e61b4ee5e430bdceb3ddecbec3419cea63523f209010"} Dec 05 19:01:03 crc kubenswrapper[4756]: I1205 19:01:03.172384 4756 generic.go:334] "Generic (PLEG): container finished" podID="fd61acc5-1866-4e92-b030-335f67f8faa5" containerID="585e278ead9f9a33200732cec8197aaf4fd04e9c20e58d746eb9fd0d22ac045b" exitCode=0 Dec 05 19:01:03 crc kubenswrapper[4756]: I1205 19:01:03.172557 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" event={"ID":"fd61acc5-1866-4e92-b030-335f67f8faa5","Type":"ContainerDied","Data":"585e278ead9f9a33200732cec8197aaf4fd04e9c20e58d746eb9fd0d22ac045b"} Dec 05 19:01:03 crc kubenswrapper[4756]: I1205 19:01:03.177386 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416021-s4hr4" event={"ID":"ecde8bb8-34a3-4765-9bf2-0522fd81c49d","Type":"ContainerStarted","Data":"c92a116354ddc9dd65234c5d197de00cbdab622af33ad0a8457e883502eeac7e"} Dec 05 19:01:03 crc kubenswrapper[4756]: I1205 19:01:03.242686 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-vcmvf"] Dec 05 19:01:03 crc kubenswrapper[4756]: I1205 19:01:03.245632 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416021-s4hr4" podStartSLOduration=3.245611999 podStartE2EDuration="3.245611999s" podCreationTimestamp="2025-12-05 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 19:01:03.218474829 +0000 UTC m=+9923.457148371" watchObservedRunningTime="2025-12-05 19:01:03.245611999 +0000 UTC m=+9923.484285521" Dec 05 19:01:03 crc kubenswrapper[4756]: I1205 19:01:03.291315 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zfcqk/crc-debug-vcmvf"] Dec 05 19:01:04 crc kubenswrapper[4756]: I1205 19:01:04.320874 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:04 crc kubenswrapper[4756]: I1205 19:01:04.480411 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd61acc5-1866-4e92-b030-335f67f8faa5-host\") pod \"fd61acc5-1866-4e92-b030-335f67f8faa5\" (UID: \"fd61acc5-1866-4e92-b030-335f67f8faa5\") " Dec 05 19:01:04 crc kubenswrapper[4756]: I1205 19:01:04.480492 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsrst\" (UniqueName: \"kubernetes.io/projected/fd61acc5-1866-4e92-b030-335f67f8faa5-kube-api-access-lsrst\") pod \"fd61acc5-1866-4e92-b030-335f67f8faa5\" (UID: \"fd61acc5-1866-4e92-b030-335f67f8faa5\") " Dec 05 19:01:04 crc kubenswrapper[4756]: I1205 19:01:04.480506 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd61acc5-1866-4e92-b030-335f67f8faa5-host" (OuterVolumeSpecName: "host") pod "fd61acc5-1866-4e92-b030-335f67f8faa5" (UID: "fd61acc5-1866-4e92-b030-335f67f8faa5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 19:01:04 crc kubenswrapper[4756]: I1205 19:01:04.481933 4756 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd61acc5-1866-4e92-b030-335f67f8faa5-host\") on node \"crc\" DevicePath \"\"" Dec 05 19:01:04 crc kubenswrapper[4756]: I1205 19:01:04.487868 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd61acc5-1866-4e92-b030-335f67f8faa5-kube-api-access-lsrst" (OuterVolumeSpecName: "kube-api-access-lsrst") pod "fd61acc5-1866-4e92-b030-335f67f8faa5" (UID: "fd61acc5-1866-4e92-b030-335f67f8faa5"). InnerVolumeSpecName "kube-api-access-lsrst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:01:04 crc kubenswrapper[4756]: I1205 19:01:04.584410 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsrst\" (UniqueName: \"kubernetes.io/projected/fd61acc5-1866-4e92-b030-335f67f8faa5-kube-api-access-lsrst\") on node \"crc\" DevicePath \"\"" Dec 05 19:01:05 crc kubenswrapper[4756]: I1205 19:01:05.211132 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-vcmvf" Dec 05 19:01:05 crc kubenswrapper[4756]: I1205 19:01:05.211103 4756 scope.go:117] "RemoveContainer" containerID="585e278ead9f9a33200732cec8197aaf4fd04e9c20e58d746eb9fd0d22ac045b" Dec 05 19:01:05 crc kubenswrapper[4756]: I1205 19:01:05.216452 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416021-s4hr4" event={"ID":"ecde8bb8-34a3-4765-9bf2-0522fd81c49d","Type":"ContainerDied","Data":"c92a116354ddc9dd65234c5d197de00cbdab622af33ad0a8457e883502eeac7e"} Dec 05 19:01:05 crc kubenswrapper[4756]: I1205 19:01:05.216340 4756 generic.go:334] "Generic (PLEG): container finished" podID="ecde8bb8-34a3-4765-9bf2-0522fd81c49d" containerID="c92a116354ddc9dd65234c5d197de00cbdab622af33ad0a8457e883502eeac7e" exitCode=0 Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.063966 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd61acc5-1866-4e92-b030-335f67f8faa5" path="/var/lib/kubelet/pods/fd61acc5-1866-4e92-b030-335f67f8faa5/volumes" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.714962 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.738796 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-config-data\") pod \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.739791 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-fernet-keys\") pod \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.739910 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-combined-ca-bundle\") pod \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.739990 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bqvz\" (UniqueName: \"kubernetes.io/projected/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-kube-api-access-5bqvz\") pod \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\" (UID: \"ecde8bb8-34a3-4765-9bf2-0522fd81c49d\") " Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.745495 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ecde8bb8-34a3-4765-9bf2-0522fd81c49d" (UID: "ecde8bb8-34a3-4765-9bf2-0522fd81c49d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.755300 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-kube-api-access-5bqvz" (OuterVolumeSpecName: "kube-api-access-5bqvz") pod "ecde8bb8-34a3-4765-9bf2-0522fd81c49d" (UID: "ecde8bb8-34a3-4765-9bf2-0522fd81c49d"). InnerVolumeSpecName "kube-api-access-5bqvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.801133 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecde8bb8-34a3-4765-9bf2-0522fd81c49d" (UID: "ecde8bb8-34a3-4765-9bf2-0522fd81c49d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.805305 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-config-data" (OuterVolumeSpecName: "config-data") pod "ecde8bb8-34a3-4765-9bf2-0522fd81c49d" (UID: "ecde8bb8-34a3-4765-9bf2-0522fd81c49d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.843032 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.843056 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.843065 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 19:01:06 crc kubenswrapper[4756]: I1205 19:01:06.843079 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bqvz\" (UniqueName: \"kubernetes.io/projected/ecde8bb8-34a3-4765-9bf2-0522fd81c49d-kube-api-access-5bqvz\") on node \"crc\" DevicePath \"\"" Dec 05 19:01:07 crc kubenswrapper[4756]: I1205 19:01:07.254027 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416021-s4hr4" event={"ID":"ecde8bb8-34a3-4765-9bf2-0522fd81c49d","Type":"ContainerDied","Data":"869070b983c11d78543a6fc9dd40ad4a826dad26e2c17ff6290a6f5b3633b1b1"} Dec 05 19:01:07 crc kubenswrapper[4756]: I1205 19:01:07.254699 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="869070b983c11d78543a6fc9dd40ad4a826dad26e2c17ff6290a6f5b3633b1b1" Dec 05 19:01:07 crc kubenswrapper[4756]: I1205 19:01:07.254584 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416021-s4hr4" Dec 05 19:01:23 crc kubenswrapper[4756]: I1205 19:01:23.265512 4756 scope.go:117] "RemoveContainer" containerID="28aab3673c8a121a325ea455d181c002ffcf9afcca7c9e030da7c64afa6a3a47" Dec 05 19:01:23 crc kubenswrapper[4756]: I1205 19:01:23.303588 4756 scope.go:117] "RemoveContainer" containerID="8c18c06f10a6d9237fbb79b9f4009f2c883eb4f159d2c421cb9d01696647bb1f" Dec 05 19:01:23 crc kubenswrapper[4756]: I1205 19:01:23.335665 4756 scope.go:117] "RemoveContainer" containerID="66b238f49a724160cc80e8d99d0ea42c390ee5b5054449fd1ac4362f1f99d22d" Dec 05 19:01:26 crc kubenswrapper[4756]: I1205 19:01:26.079366 4756 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod649b6b56-a315-4cc7-9759-890c0a6838c4"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod649b6b56-a315-4cc7-9759-890c0a6838c4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod649b6b56_a315_4cc7_9759_890c0a6838c4.slice" Dec 05 19:01:26 crc kubenswrapper[4756]: E1205 19:01:26.079724 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod649b6b56-a315-4cc7-9759-890c0a6838c4] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod649b6b56-a315-4cc7-9759-890c0a6838c4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod649b6b56_a315_4cc7_9759_890c0a6838c4.slice" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv" podUID="649b6b56-a315-4cc7-9759-890c0a6838c4" Dec 05 19:01:26 crc kubenswrapper[4756]: I1205 19:01:26.521186 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zfcqk/crc-debug-zqqlv" Dec 05 19:01:52 crc kubenswrapper[4756]: I1205 19:01:52.295848 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:01:52 crc kubenswrapper[4756]: I1205 19:01:52.296750 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:02:22 crc kubenswrapper[4756]: I1205 19:02:22.294915 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:02:22 crc kubenswrapper[4756]: I1205 19:02:22.295468 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.296346 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.297096 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.297183 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.298644 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.298774 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" gracePeriod=600 Dec 05 19:02:52 crc kubenswrapper[4756]: E1205 19:02:52.458100 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.567124 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" exitCode=0 Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.567472 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb"} Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.567509 4756 scope.go:117] "RemoveContainer" containerID="3b487db91a35b43b2ad7f9ec84f421006fecc871f530c2ac8a41abd36649097e" Dec 05 19:02:52 crc kubenswrapper[4756]: I1205 19:02:52.568242 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:02:52 crc kubenswrapper[4756]: E1205 19:02:52.568565 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:03:05 crc kubenswrapper[4756]: I1205 19:03:05.051708 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:03:05 crc kubenswrapper[4756]: E1205 19:03:05.052597 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:03:20 crc kubenswrapper[4756]: I1205 19:03:20.062691 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:03:20 crc kubenswrapper[4756]: E1205 19:03:20.065311 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:03:35 crc kubenswrapper[4756]: I1205 19:03:35.049999 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:03:35 crc kubenswrapper[4756]: E1205 19:03:35.051039 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:03:47 crc kubenswrapper[4756]: I1205 19:03:47.050398 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:03:47 crc kubenswrapper[4756]: E1205 19:03:47.051394 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:04:01 crc kubenswrapper[4756]: I1205 19:04:01.049478 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:04:01 crc kubenswrapper[4756]: E1205 19:04:01.068943 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:04:16 crc kubenswrapper[4756]: I1205 19:04:16.050413 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:04:16 crc kubenswrapper[4756]: E1205 19:04:16.051506 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:04:27 crc kubenswrapper[4756]: I1205 19:04:27.050940 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:04:27 crc kubenswrapper[4756]: E1205 19:04:27.052130 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:04:40 crc kubenswrapper[4756]: I1205 19:04:40.059066 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:04:40 crc kubenswrapper[4756]: E1205 19:04:40.059989 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:04:55 crc kubenswrapper[4756]: I1205 19:04:55.050448 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:04:55 crc kubenswrapper[4756]: E1205 19:04:55.051229 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:05:08 crc kubenswrapper[4756]: I1205 19:05:08.050934 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:05:08 crc kubenswrapper[4756]: E1205 19:05:08.052287 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:05:20 crc kubenswrapper[4756]: I1205 19:05:20.066905 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:05:20 crc kubenswrapper[4756]: E1205 19:05:20.068033 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.812805 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wzxw4"] Dec 05 19:05:24 crc kubenswrapper[4756]: E1205 19:05:24.813808 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd61acc5-1866-4e92-b030-335f67f8faa5" containerName="container-00" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.813821 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd61acc5-1866-4e92-b030-335f67f8faa5" containerName="container-00" Dec 05 19:05:24 crc kubenswrapper[4756]: E1205 19:05:24.813837 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecde8bb8-34a3-4765-9bf2-0522fd81c49d" containerName="keystone-cron" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.813842 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecde8bb8-34a3-4765-9bf2-0522fd81c49d" containerName="keystone-cron" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.814046 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd61acc5-1866-4e92-b030-335f67f8faa5" containerName="container-00" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.814070 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecde8bb8-34a3-4765-9bf2-0522fd81c49d" containerName="keystone-cron" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.815680 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.827245 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzxw4"] Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.909277 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-utilities\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.909597 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cscr4\" (UniqueName: \"kubernetes.io/projected/d99e5e16-b2df-45c8-963d-6fded723f937-kube-api-access-cscr4\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:24 crc kubenswrapper[4756]: I1205 19:05:24.909707 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-catalog-content\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:25 crc kubenswrapper[4756]: I1205 19:05:25.012401 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cscr4\" (UniqueName: \"kubernetes.io/projected/d99e5e16-b2df-45c8-963d-6fded723f937-kube-api-access-cscr4\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:25 crc kubenswrapper[4756]: I1205 19:05:25.012481 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-catalog-content\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:25 crc kubenswrapper[4756]: I1205 19:05:25.012655 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-utilities\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:25 crc kubenswrapper[4756]: I1205 19:05:25.013200 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-catalog-content\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:25 crc kubenswrapper[4756]: I1205 19:05:25.013224 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-utilities\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:25 crc kubenswrapper[4756]: I1205 19:05:25.033235 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cscr4\" (UniqueName: \"kubernetes.io/projected/d99e5e16-b2df-45c8-963d-6fded723f937-kube-api-access-cscr4\") pod \"community-operators-wzxw4\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:25 crc kubenswrapper[4756]: I1205 19:05:25.161523 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:25 crc kubenswrapper[4756]: I1205 19:05:25.711877 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzxw4"] Dec 05 19:05:26 crc kubenswrapper[4756]: I1205 19:05:26.605212 4756 generic.go:334] "Generic (PLEG): container finished" podID="d99e5e16-b2df-45c8-963d-6fded723f937" containerID="82b4245a6aee62d925149458ee3ed1bd8ecc5fac2e4eb4e0134b6ea52b532c2b" exitCode=0 Dec 05 19:05:26 crc kubenswrapper[4756]: I1205 19:05:26.605708 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxw4" event={"ID":"d99e5e16-b2df-45c8-963d-6fded723f937","Type":"ContainerDied","Data":"82b4245a6aee62d925149458ee3ed1bd8ecc5fac2e4eb4e0134b6ea52b532c2b"} Dec 05 19:05:26 crc kubenswrapper[4756]: I1205 19:05:26.605764 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxw4" event={"ID":"d99e5e16-b2df-45c8-963d-6fded723f937","Type":"ContainerStarted","Data":"b3fedb6ed2de0f10766bf2a9ec0c148f292bc586f28f3063a2a8c7407fc058e1"} Dec 05 19:05:26 crc kubenswrapper[4756]: I1205 19:05:26.609054 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 19:05:28 crc kubenswrapper[4756]: I1205 19:05:28.649194 4756 generic.go:334] "Generic (PLEG): container finished" podID="d99e5e16-b2df-45c8-963d-6fded723f937" containerID="0319e081ba047c1c70af92238f19321209a5e6d3fad9d17dfc8f97ae47fc03dd" exitCode=0 Dec 05 19:05:28 crc kubenswrapper[4756]: I1205 19:05:28.649677 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxw4" event={"ID":"d99e5e16-b2df-45c8-963d-6fded723f937","Type":"ContainerDied","Data":"0319e081ba047c1c70af92238f19321209a5e6d3fad9d17dfc8f97ae47fc03dd"} Dec 05 19:05:30 crc kubenswrapper[4756]: I1205 19:05:30.673536 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxw4" event={"ID":"d99e5e16-b2df-45c8-963d-6fded723f937","Type":"ContainerStarted","Data":"14e74c6597f4ff423990e2235f81507ab69a444020771d5d06f6b91e4e04dc53"} Dec 05 19:05:30 crc kubenswrapper[4756]: I1205 19:05:30.702461 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wzxw4" podStartSLOduration=3.786987742 podStartE2EDuration="6.702441028s" podCreationTimestamp="2025-12-05 19:05:24 +0000 UTC" firstStartedPulling="2025-12-05 19:05:26.608664838 +0000 UTC m=+10186.847338380" lastFinishedPulling="2025-12-05 19:05:29.524118144 +0000 UTC m=+10189.762791666" observedRunningTime="2025-12-05 19:05:30.692300842 +0000 UTC m=+10190.930974374" watchObservedRunningTime="2025-12-05 19:05:30.702441028 +0000 UTC m=+10190.941114540" Dec 05 19:05:33 crc kubenswrapper[4756]: I1205 19:05:33.050438 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:05:33 crc kubenswrapper[4756]: E1205 19:05:33.051289 4756 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:05:35 crc kubenswrapper[4756]: I1205 19:05:35.162085 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:35 crc kubenswrapper[4756]: I1205 19:05:35.162478 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:35 crc kubenswrapper[4756]: I1205 19:05:35.216402 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:35 crc kubenswrapper[4756]: I1205 19:05:35.800592 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:35 crc kubenswrapper[4756]: I1205 19:05:35.866204 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzxw4"] Dec 05 19:05:37 crc kubenswrapper[4756]: I1205 19:05:37.757126 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wzxw4" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" containerName="registry-server" containerID="cri-o://14e74c6597f4ff423990e2235f81507ab69a444020771d5d06f6b91e4e04dc53" gracePeriod=2 Dec 05 19:05:38 crc kubenswrapper[4756]: I1205 19:05:38.837704 4756 generic.go:334] "Generic (PLEG): container finished" podID="d99e5e16-b2df-45c8-963d-6fded723f937" containerID="14e74c6597f4ff423990e2235f81507ab69a444020771d5d06f6b91e4e04dc53" exitCode=0 Dec 05 19:05:38 crc kubenswrapper[4756]: I1205 19:05:38.838031 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxw4" event={"ID":"d99e5e16-b2df-45c8-963d-6fded723f937","Type":"ContainerDied","Data":"14e74c6597f4ff423990e2235f81507ab69a444020771d5d06f6b91e4e04dc53"} Dec 05 19:05:38 crc kubenswrapper[4756]: I1205 19:05:38.975861 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.156438 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-utilities\") pod \"d99e5e16-b2df-45c8-963d-6fded723f937\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.156526 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cscr4\" (UniqueName: \"kubernetes.io/projected/d99e5e16-b2df-45c8-963d-6fded723f937-kube-api-access-cscr4\") pod \"d99e5e16-b2df-45c8-963d-6fded723f937\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.156845 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-catalog-content\") pod \"d99e5e16-b2df-45c8-963d-6fded723f937\" (UID: \"d99e5e16-b2df-45c8-963d-6fded723f937\") " Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.157242 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-utilities" (OuterVolumeSpecName: "utilities") pod "d99e5e16-b2df-45c8-963d-6fded723f937" (UID: "d99e5e16-b2df-45c8-963d-6fded723f937"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.158089 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.163744 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d99e5e16-b2df-45c8-963d-6fded723f937-kube-api-access-cscr4" (OuterVolumeSpecName: "kube-api-access-cscr4") pod "d99e5e16-b2df-45c8-963d-6fded723f937" (UID: "d99e5e16-b2df-45c8-963d-6fded723f937"). InnerVolumeSpecName "kube-api-access-cscr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.209767 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d99e5e16-b2df-45c8-963d-6fded723f937" (UID: "d99e5e16-b2df-45c8-963d-6fded723f937"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.261054 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d99e5e16-b2df-45c8-963d-6fded723f937-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.261084 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cscr4\" (UniqueName: \"kubernetes.io/projected/d99e5e16-b2df-45c8-963d-6fded723f937-kube-api-access-cscr4\") on node \"crc\" DevicePath \"\"" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.857471 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxw4" event={"ID":"d99e5e16-b2df-45c8-963d-6fded723f937","Type":"ContainerDied","Data":"b3fedb6ed2de0f10766bf2a9ec0c148f292bc586f28f3063a2a8c7407fc058e1"} Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.857532 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzxw4" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.857547 4756 scope.go:117] "RemoveContainer" containerID="14e74c6597f4ff423990e2235f81507ab69a444020771d5d06f6b91e4e04dc53" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.909044 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzxw4"] Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.909697 4756 scope.go:117] "RemoveContainer" containerID="0319e081ba047c1c70af92238f19321209a5e6d3fad9d17dfc8f97ae47fc03dd" Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.926919 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wzxw4"] Dec 05 19:05:39 crc kubenswrapper[4756]: I1205 19:05:39.947026 4756 scope.go:117] "RemoveContainer" containerID="82b4245a6aee62d925149458ee3ed1bd8ecc5fac2e4eb4e0134b6ea52b532c2b" Dec 05 19:05:40 crc kubenswrapper[4756]: I1205 19:05:40.064778 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" path="/var/lib/kubelet/pods/d99e5e16-b2df-45c8-963d-6fded723f937/volumes" Dec 05 19:05:45 crc kubenswrapper[4756]: I1205 19:05:45.050737 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:05:45 crc kubenswrapper[4756]: E1205 19:05:45.051652 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:05:57 crc kubenswrapper[4756]: I1205 19:05:57.049858 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:05:57 crc kubenswrapper[4756]: E1205 19:05:57.050859 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:06:12 crc kubenswrapper[4756]: I1205 19:06:12.051147 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:06:12 crc kubenswrapper[4756]: E1205 19:06:12.052412 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:06:27 crc kubenswrapper[4756]: I1205 19:06:27.049626 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:06:27 crc kubenswrapper[4756]: E1205 19:06:27.050577 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:06:41 crc kubenswrapper[4756]: I1205 19:06:41.050536 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:06:41 crc kubenswrapper[4756]: E1205 19:06:41.051794 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:06:54 crc kubenswrapper[4756]: I1205 19:06:54.050177 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:06:54 crc kubenswrapper[4756]: E1205 19:06:54.051338 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:07:09 crc kubenswrapper[4756]: I1205 19:07:09.050403 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:07:09 crc kubenswrapper[4756]: E1205 19:07:09.051278 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:07:20 crc kubenswrapper[4756]: I1205 19:07:20.060336 4756 
scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:07:20 crc kubenswrapper[4756]: E1205 19:07:20.061173 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:07:22 crc kubenswrapper[4756]: I1205 19:07:22.980178 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bk795"] Dec 05 19:07:22 crc kubenswrapper[4756]: E1205 19:07:22.981039 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" containerName="registry-server" Dec 05 19:07:22 crc kubenswrapper[4756]: I1205 19:07:22.981054 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" containerName="registry-server" Dec 05 19:07:22 crc kubenswrapper[4756]: E1205 19:07:22.981093 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" containerName="extract-utilities" Dec 05 19:07:22 crc kubenswrapper[4756]: I1205 19:07:22.981101 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" containerName="extract-utilities" Dec 05 19:07:22 crc kubenswrapper[4756]: E1205 19:07:22.981112 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" containerName="extract-content" Dec 05 19:07:22 crc kubenswrapper[4756]: I1205 19:07:22.981120 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" containerName="extract-content" Dec 05 19:07:22 crc kubenswrapper[4756]: I1205 19:07:22.981376 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d99e5e16-b2df-45c8-963d-6fded723f937" containerName="registry-server"
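The paired "RemoveContainer" / "Error syncing pod, skipping ... CrashLoopBackOff" entries repeating above (19:05:45, 19:05:57, 19:06:12, through 19:07:20) are the kubelet's crash-loop back-off for machine-config-daemon-zrvcn: each sync is skipped because the container is still inside its restart back-off window, which doubles per restart from a 10s base up to the 5m0s cap quoted in the error string. A minimal sketch of that capped doubling, assuming the default kubelet constants; the function name and structure are illustrative, not kubelet's actual code:

package main

import (
	"fmt"
	"time"
)

// Illustrative only: the kubelet starts a crash-looping container's back-off
// at a base delay and doubles it per restart, capping at a maximum
// (10s and 5m0s by default, matching the "back-off 5m0s" in the log above).
func crashLoopDelay(restarts int) time.Duration {
	const base = 10 * time.Second
	const maxDelay = 5 * time.Minute
	delay := base
	for i := 0; i < restarts; i++ {
		delay *= 2
		if delay >= maxDelay {
			return maxDelay
		}
	}
	return delay
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> back-off %v\n", r, crashLoopDelay(r))
	}
}

Once the window lapses the retry goes through for real, which is what the 19:07:58 "RemoveContainer" followed by the 19:07:59 ContainerStarted event further below shows.

Dec 05 19:07:22 crc kubenswrapper[4756]: I1205 19:07:22.983341 4756 util.go:30] "No sandbox for pod can be found.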
Need to start a new one" pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.020538 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bk795"] Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.060719 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-catalog-content\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.060847 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bnzc\" (UniqueName: \"kubernetes.io/projected/7fcbfb54-907c-462e-9cfe-30ef869c0fde-kube-api-access-8bnzc\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.060907 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-utilities\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.163195 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bnzc\" (UniqueName: \"kubernetes.io/projected/7fcbfb54-907c-462e-9cfe-30ef869c0fde-kube-api-access-8bnzc\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.163295 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-utilities\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.163466 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-catalog-content\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.164196 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-catalog-content\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.164316 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-utilities\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.193516 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8bnzc\" (UniqueName: \"kubernetes.io/projected/7fcbfb54-907c-462e-9cfe-30ef869c0fde-kube-api-access-8bnzc\") pod \"redhat-operators-bk795\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.317390 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:23 crc kubenswrapper[4756]: I1205 19:07:23.864229 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bk795"] Dec 05 19:07:24 crc kubenswrapper[4756]: I1205 19:07:24.421647 4756 generic.go:334] "Generic (PLEG): container finished" podID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerID="262f2951748338577e925ae1c17d7e7877b89fa5e4fef32d1544b351f7bc3ac9" exitCode=0 Dec 05 19:07:24 crc kubenswrapper[4756]: I1205 19:07:24.421695 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk795" event={"ID":"7fcbfb54-907c-462e-9cfe-30ef869c0fde","Type":"ContainerDied","Data":"262f2951748338577e925ae1c17d7e7877b89fa5e4fef32d1544b351f7bc3ac9"} Dec 05 19:07:24 crc kubenswrapper[4756]: I1205 19:07:24.421999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk795" event={"ID":"7fcbfb54-907c-462e-9cfe-30ef869c0fde","Type":"ContainerStarted","Data":"ad5e987d608e700886910401c2d0162d5251d54bf0c3016484cdbc70fbfa5ec7"} Dec 05 19:07:25 crc kubenswrapper[4756]: I1205 19:07:25.435502 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk795" event={"ID":"7fcbfb54-907c-462e-9cfe-30ef869c0fde","Type":"ContainerStarted","Data":"dfb3a8d403316fd57636a34bb44aa9287dd23e8df782d947e6bb5354d7b72ccd"} Dec 05 19:07:29 crc kubenswrapper[4756]: I1205 19:07:29.484948 4756 generic.go:334] "Generic (PLEG): container finished" podID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerID="dfb3a8d403316fd57636a34bb44aa9287dd23e8df782d947e6bb5354d7b72ccd" exitCode=0 Dec 05 19:07:29 crc kubenswrapper[4756]: I1205 19:07:29.485153 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk795" event={"ID":"7fcbfb54-907c-462e-9cfe-30ef869c0fde","Type":"ContainerDied","Data":"dfb3a8d403316fd57636a34bb44aa9287dd23e8df782d947e6bb5354d7b72ccd"} Dec 05 19:07:30 crc kubenswrapper[4756]: I1205 19:07:30.502800 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk795" event={"ID":"7fcbfb54-907c-462e-9cfe-30ef869c0fde","Type":"ContainerStarted","Data":"1abe787243546af159cf93c0d32003925e5d13038606d51f1771bf149624fd06"} Dec 05 19:07:30 crc kubenswrapper[4756]: I1205 19:07:30.534266 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bk795" podStartSLOduration=2.923518855 podStartE2EDuration="8.534244602s" podCreationTimestamp="2025-12-05 19:07:22 +0000 UTC" firstStartedPulling="2025-12-05 19:07:24.423881874 +0000 UTC m=+10304.662555386" lastFinishedPulling="2025-12-05 19:07:30.034607611 +0000 UTC m=+10310.273281133" observedRunningTime="2025-12-05 19:07:30.52920556 +0000 UTC m=+10310.767879102" watchObservedRunningTime="2025-12-05 19:07:30.534244602 +0000 UTC m=+10310.772918144" Dec 05 19:07:32 crc kubenswrapper[4756]: I1205 19:07:32.050453 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 
19:07:32 crc kubenswrapper[4756]: E1205 19:07:32.051757 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:07:33 crc kubenswrapper[4756]: I1205 19:07:33.318466 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:33 crc kubenswrapper[4756]: I1205 19:07:33.319221 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:34 crc kubenswrapper[4756]: I1205 19:07:34.385162 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bk795" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="registry-server" probeResult="failure" output=< Dec 05 19:07:34 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Dec 05 19:07:34 crc kubenswrapper[4756]: > Dec 05 19:07:43 crc kubenswrapper[4756]: I1205 19:07:43.418361 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:43 crc kubenswrapper[4756]: I1205 19:07:43.483901 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:43 crc kubenswrapper[4756]: I1205 19:07:43.685555 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bk795"] Dec 05 19:07:44 crc kubenswrapper[4756]: I1205 19:07:44.698281 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bk795" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="registry-server" containerID="cri-o://1abe787243546af159cf93c0d32003925e5d13038606d51f1771bf149624fd06" gracePeriod=2 Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.049876 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:07:45 crc kubenswrapper[4756]: E1205 19:07:45.050555 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.710933 4756 generic.go:334] "Generic (PLEG): container finished" podID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerID="1abe787243546af159cf93c0d32003925e5d13038606d51f1771bf149624fd06" exitCode=0 Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.710984 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk795" event={"ID":"7fcbfb54-907c-462e-9cfe-30ef869c0fde","Type":"ContainerDied","Data":"1abe787243546af159cf93c0d32003925e5d13038606d51f1771bf149624fd06"} Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.711015 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-bk795" event={"ID":"7fcbfb54-907c-462e-9cfe-30ef869c0fde","Type":"ContainerDied","Data":"ad5e987d608e700886910401c2d0162d5251d54bf0c3016484cdbc70fbfa5ec7"} Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.711030 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad5e987d608e700886910401c2d0162d5251d54bf0c3016484cdbc70fbfa5ec7" Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.822901 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.874690 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-catalog-content\") pod \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.874795 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-utilities\") pod \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.874853 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bnzc\" (UniqueName: \"kubernetes.io/projected/7fcbfb54-907c-462e-9cfe-30ef869c0fde-kube-api-access-8bnzc\") pod \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\" (UID: \"7fcbfb54-907c-462e-9cfe-30ef869c0fde\") " Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.876276 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-utilities" (OuterVolumeSpecName: "utilities") pod "7fcbfb54-907c-462e-9cfe-30ef869c0fde" (UID: "7fcbfb54-907c-462e-9cfe-30ef869c0fde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.882721 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcbfb54-907c-462e-9cfe-30ef869c0fde-kube-api-access-8bnzc" (OuterVolumeSpecName: "kube-api-access-8bnzc") pod "7fcbfb54-907c-462e-9cfe-30ef869c0fde" (UID: "7fcbfb54-907c-462e-9cfe-30ef869c0fde"). InnerVolumeSpecName "kube-api-access-8bnzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.979049 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.979088 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bnzc\" (UniqueName: \"kubernetes.io/projected/7fcbfb54-907c-462e-9cfe-30ef869c0fde-kube-api-access-8bnzc\") on node \"crc\" DevicePath \"\"" Dec 05 19:07:45 crc kubenswrapper[4756]: I1205 19:07:45.987541 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fcbfb54-907c-462e-9cfe-30ef869c0fde" (UID: "7fcbfb54-907c-462e-9cfe-30ef869c0fde"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:07:46 crc kubenswrapper[4756]: I1205 19:07:46.081983 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fcbfb54-907c-462e-9cfe-30ef869c0fde-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 19:07:46 crc kubenswrapper[4756]: I1205 19:07:46.724904 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bk795" Dec 05 19:07:46 crc kubenswrapper[4756]: I1205 19:07:46.753979 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bk795"] Dec 05 19:07:46 crc kubenswrapper[4756]: I1205 19:07:46.765085 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bk795"] Dec 05 19:07:48 crc kubenswrapper[4756]: I1205 19:07:48.071527 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" path="/var/lib/kubelet/pods/7fcbfb54-907c-462e-9cfe-30ef869c0fde/volumes" Dec 05 19:07:58 crc kubenswrapper[4756]: I1205 19:07:58.049879 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:07:59 crc kubenswrapper[4756]: I1205 19:07:59.243737 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"5b79640ea6e442aff2813d6696ab699c40b5373571ba83575e20b9a20b3e543c"} Dec 05 19:09:32 crc kubenswrapper[4756]: I1205 19:09:32.729756 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="969a4b86-3398-4066-9515-34123347f863" containerName="galera" probeResult="failure" output="command timed out" Dec 05 19:09:32 crc kubenswrapper[4756]: I1205 19:09:32.729963 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="969a4b86-3398-4066-9515-34123347f863" containerName="galera" probeResult="failure" output="command timed out" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.756965 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xskhb"] Dec 05 19:09:39 crc kubenswrapper[4756]: E1205 19:09:39.758128 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="extract-content" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.758145 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="extract-content" Dec 05 19:09:39 crc kubenswrapper[4756]: E1205 19:09:39.758174 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="registry-server" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.758183 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="registry-server" Dec 05 19:09:39 crc kubenswrapper[4756]: E1205 19:09:39.758222 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="extract-utilities" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.758234 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="extract-utilities" Dec 05 19:09:39 
crc kubenswrapper[4756]: I1205 19:09:39.758560 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fcbfb54-907c-462e-9cfe-30ef869c0fde" containerName="registry-server" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.760599 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.783666 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xskhb"] Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.897291 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-catalog-content\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.897520 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xj7t\" (UniqueName: \"kubernetes.io/projected/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-kube-api-access-9xj7t\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.897567 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-utilities\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.998998 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-catalog-content\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.999086 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xj7t\" (UniqueName: \"kubernetes.io/projected/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-kube-api-access-9xj7t\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.999117 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-utilities\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.999529 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-catalog-content\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:39 crc kubenswrapper[4756]: I1205 19:09:39.999547 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-utilities\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:40 crc kubenswrapper[4756]: I1205 19:09:40.019778 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xj7t\" (UniqueName: \"kubernetes.io/projected/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-kube-api-access-9xj7t\") pod \"certified-operators-xskhb\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:40 crc kubenswrapper[4756]: I1205 19:09:40.088321 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:40 crc kubenswrapper[4756]: I1205 19:09:40.643035 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xskhb"] Dec 05 19:09:41 crc kubenswrapper[4756]: I1205 19:09:41.581541 4756 generic.go:334] "Generic (PLEG): container finished" podID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerID="68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e" exitCode=0 Dec 05 19:09:41 crc kubenswrapper[4756]: I1205 19:09:41.581608 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskhb" event={"ID":"8b75c6bf-d2f3-4f8c-972f-790f20da62b4","Type":"ContainerDied","Data":"68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e"} Dec 05 19:09:41 crc kubenswrapper[4756]: I1205 19:09:41.581907 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskhb" event={"ID":"8b75c6bf-d2f3-4f8c-972f-790f20da62b4","Type":"ContainerStarted","Data":"3ed345506dde07e95e6993d4ee4ce440242f99dd923d2a299723e284fe238fe6"} Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.540835 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6jwkn"] Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.543496 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.563303 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jwkn"] Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.607863 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskhb" event={"ID":"8b75c6bf-d2f3-4f8c-972f-790f20da62b4","Type":"ContainerStarted","Data":"34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac"} Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.698385 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-utilities\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.698562 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp96f\" (UniqueName: \"kubernetes.io/projected/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-kube-api-access-xp96f\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.698595 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-catalog-content\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.800690 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-utilities\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.800850 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp96f\" (UniqueName: \"kubernetes.io/projected/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-kube-api-access-xp96f\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.800884 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-catalog-content\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.801712 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-catalog-content\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:43 crc kubenswrapper[4756]: I1205 19:09:43.801891 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-utilities\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:44 crc kubenswrapper[4756]: I1205 19:09:44.132467 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp96f\" (UniqueName: \"kubernetes.io/projected/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-kube-api-access-xp96f\") pod \"redhat-marketplace-6jwkn\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:44 crc kubenswrapper[4756]: I1205 19:09:44.163983 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:44 crc kubenswrapper[4756]: I1205 19:09:44.623186 4756 generic.go:334] "Generic (PLEG): container finished" podID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerID="34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac" exitCode=0 Dec 05 19:09:44 crc kubenswrapper[4756]: I1205 19:09:44.623541 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskhb" event={"ID":"8b75c6bf-d2f3-4f8c-972f-790f20da62b4","Type":"ContainerDied","Data":"34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac"} Dec 05 19:09:44 crc kubenswrapper[4756]: W1205 19:09:44.650975 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d1c0bb_35e4_461d_bfc4_b52c43cd8cf5.slice/crio-055689943deecc814eaf9da97de9717b85e61950ea80f4b6768f39b4ffb2792e WatchSource:0}: Error finding container 055689943deecc814eaf9da97de9717b85e61950ea80f4b6768f39b4ffb2792e: Status 404 returned error can't find the container with id 055689943deecc814eaf9da97de9717b85e61950ea80f4b6768f39b4ffb2792e Dec 05 19:09:44 crc kubenswrapper[4756]: I1205 19:09:44.663681 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jwkn"] Dec 05 19:09:45 crc kubenswrapper[4756]: I1205 19:09:45.637796 4756 generic.go:334] "Generic (PLEG): container finished" podID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerID="7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759" exitCode=0 Dec 05 19:09:45 crc kubenswrapper[4756]: I1205 19:09:45.637887 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jwkn" event={"ID":"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5","Type":"ContainerDied","Data":"7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759"} Dec 05 19:09:45 crc kubenswrapper[4756]: I1205 19:09:45.638576 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jwkn" event={"ID":"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5","Type":"ContainerStarted","Data":"055689943deecc814eaf9da97de9717b85e61950ea80f4b6768f39b4ffb2792e"} Dec 05 19:09:45 crc kubenswrapper[4756]: I1205 19:09:45.646792 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskhb" event={"ID":"8b75c6bf-d2f3-4f8c-972f-790f20da62b4","Type":"ContainerStarted","Data":"94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6"} Dec 05 19:09:46 crc kubenswrapper[4756]: I1205 19:09:46.658674 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jwkn" 
event={"ID":"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5","Type":"ContainerStarted","Data":"566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882"} Dec 05 19:09:46 crc kubenswrapper[4756]: I1205 19:09:46.692057 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xskhb" podStartSLOduration=4.229219588 podStartE2EDuration="7.692039373s" podCreationTimestamp="2025-12-05 19:09:39 +0000 UTC" firstStartedPulling="2025-12-05 19:09:41.584846831 +0000 UTC m=+10441.823520383" lastFinishedPulling="2025-12-05 19:09:45.047666606 +0000 UTC m=+10445.286340168" observedRunningTime="2025-12-05 19:09:45.692784411 +0000 UTC m=+10445.931457953" watchObservedRunningTime="2025-12-05 19:09:46.692039373 +0000 UTC m=+10446.930712885" Dec 05 19:09:47 crc kubenswrapper[4756]: I1205 19:09:47.669816 4756 generic.go:334] "Generic (PLEG): container finished" podID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerID="566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882" exitCode=0 Dec 05 19:09:47 crc kubenswrapper[4756]: I1205 19:09:47.669861 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jwkn" event={"ID":"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5","Type":"ContainerDied","Data":"566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882"} Dec 05 19:09:49 crc kubenswrapper[4756]: I1205 19:09:49.700963 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jwkn" event={"ID":"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5","Type":"ContainerStarted","Data":"ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7"} Dec 05 19:09:49 crc kubenswrapper[4756]: I1205 19:09:49.722615 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6jwkn" podStartSLOduration=3.467414541 podStartE2EDuration="6.722593409s" podCreationTimestamp="2025-12-05 19:09:43 +0000 UTC" firstStartedPulling="2025-12-05 19:09:45.639887968 +0000 UTC m=+10445.878561520" lastFinishedPulling="2025-12-05 19:09:48.895066876 +0000 UTC m=+10449.133740388" observedRunningTime="2025-12-05 19:09:49.722509907 +0000 UTC m=+10449.961183419" watchObservedRunningTime="2025-12-05 19:09:49.722593409 +0000 UTC m=+10449.961266921" Dec 05 19:09:50 crc kubenswrapper[4756]: I1205 19:09:50.089121 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:50 crc kubenswrapper[4756]: I1205 19:09:50.089631 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:50 crc kubenswrapper[4756]: I1205 19:09:50.162877 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:50 crc kubenswrapper[4756]: I1205 19:09:50.778973 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:51 crc kubenswrapper[4756]: I1205 19:09:51.130894 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xskhb"] Dec 05 19:09:52 crc kubenswrapper[4756]: I1205 19:09:52.745462 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xskhb" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerName="registry-server" 
containerID="cri-o://94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6" gracePeriod=2 Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.371893 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.474624 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xj7t\" (UniqueName: \"kubernetes.io/projected/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-kube-api-access-9xj7t\") pod \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.474876 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-utilities\") pod \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.474939 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-catalog-content\") pod \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\" (UID: \"8b75c6bf-d2f3-4f8c-972f-790f20da62b4\") " Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.476334 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-utilities" (OuterVolumeSpecName: "utilities") pod "8b75c6bf-d2f3-4f8c-972f-790f20da62b4" (UID: "8b75c6bf-d2f3-4f8c-972f-790f20da62b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.483612 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-kube-api-access-9xj7t" (OuterVolumeSpecName: "kube-api-access-9xj7t") pod "8b75c6bf-d2f3-4f8c-972f-790f20da62b4" (UID: "8b75c6bf-d2f3-4f8c-972f-790f20da62b4"). InnerVolumeSpecName "kube-api-access-9xj7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.536227 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b75c6bf-d2f3-4f8c-972f-790f20da62b4" (UID: "8b75c6bf-d2f3-4f8c-972f-790f20da62b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.578248 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xj7t\" (UniqueName: \"kubernetes.io/projected/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-kube-api-access-9xj7t\") on node \"crc\" DevicePath \"\"" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.578277 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.578286 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b75c6bf-d2f3-4f8c-972f-790f20da62b4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.757525 4756 generic.go:334] "Generic (PLEG): container finished" podID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerID="94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6" exitCode=0 Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.757575 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskhb" event={"ID":"8b75c6bf-d2f3-4f8c-972f-790f20da62b4","Type":"ContainerDied","Data":"94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6"} Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.757625 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskhb" event={"ID":"8b75c6bf-d2f3-4f8c-972f-790f20da62b4","Type":"ContainerDied","Data":"3ed345506dde07e95e6993d4ee4ce440242f99dd923d2a299723e284fe238fe6"} Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.757632 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xskhb" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.757642 4756 scope.go:117] "RemoveContainer" containerID="94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.812650 4756 scope.go:117] "RemoveContainer" containerID="34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.834616 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xskhb"] Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.839326 4756 scope.go:117] "RemoveContainer" containerID="68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.853266 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xskhb"] Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.911025 4756 scope.go:117] "RemoveContainer" containerID="94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6" Dec 05 19:09:53 crc kubenswrapper[4756]: E1205 19:09:53.914731 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6\": container with ID starting with 94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6 not found: ID does not exist" containerID="94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.914883 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6"} err="failed to get container status \"94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6\": rpc error: code = NotFound desc = could not find container \"94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6\": container with ID starting with 94aa2443acd7398e2d924e2b31cd83f76c1cdc2d79aefc9803edb50900400ce6 not found: ID does not exist" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.914929 4756 scope.go:117] "RemoveContainer" containerID="34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac" Dec 05 19:09:53 crc kubenswrapper[4756]: E1205 19:09:53.915996 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac\": container with ID starting with 34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac not found: ID does not exist" containerID="34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.916162 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac"} err="failed to get container status \"34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac\": rpc error: code = NotFound desc = could not find container \"34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac\": container with ID starting with 34543a8a7965f2115bcc0d21c54f84f1ae0cb589a2f5532d785033c25ff3feac not found: ID does not exist" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.916190 4756 scope.go:117] "RemoveContainer" 
containerID="68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e" Dec 05 19:09:53 crc kubenswrapper[4756]: E1205 19:09:53.916576 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e\": container with ID starting with 68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e not found: ID does not exist" containerID="68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e" Dec 05 19:09:53 crc kubenswrapper[4756]: I1205 19:09:53.916612 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e"} err="failed to get container status \"68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e\": rpc error: code = NotFound desc = could not find container \"68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e\": container with ID starting with 68b6d1f85a37a613cd375b952fa64673ab206bc6c7f330e3097ea19ba46b4f1e not found: ID does not exist" Dec 05 19:09:54 crc kubenswrapper[4756]: I1205 19:09:54.062860 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" path="/var/lib/kubelet/pods/8b75c6bf-d2f3-4f8c-972f-790f20da62b4/volumes" Dec 05 19:09:54 crc kubenswrapper[4756]: I1205 19:09:54.164659 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:54 crc kubenswrapper[4756]: I1205 19:09:54.165743 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:54 crc kubenswrapper[4756]: I1205 19:09:54.266370 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:54 crc kubenswrapper[4756]: I1205 19:09:54.840154 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:56 crc kubenswrapper[4756]: I1205 19:09:56.529131 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jwkn"] Dec 05 19:09:57 crc kubenswrapper[4756]: I1205 19:09:57.807061 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6jwkn" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerName="registry-server" containerID="cri-o://ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7" gracePeriod=2 Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.834960 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.835595 4756 generic.go:334] "Generic (PLEG): container finished" podID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerID="ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7" exitCode=0 Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.835640 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jwkn" event={"ID":"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5","Type":"ContainerDied","Data":"ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7"} Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.835677 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jwkn" event={"ID":"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5","Type":"ContainerDied","Data":"055689943deecc814eaf9da97de9717b85e61950ea80f4b6768f39b4ffb2792e"} Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.835694 4756 scope.go:117] "RemoveContainer" containerID="ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7" Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.870599 4756 scope.go:117] "RemoveContainer" containerID="566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882" Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.911111 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-catalog-content\") pod \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.911236 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-utilities\") pod \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.911342 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp96f\" (UniqueName: \"kubernetes.io/projected/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-kube-api-access-xp96f\") pod \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\" (UID: \"76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5\") " Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.914598 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-utilities" (OuterVolumeSpecName: "utilities") pod "76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" (UID: "76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.919088 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-kube-api-access-xp96f" (OuterVolumeSpecName: "kube-api-access-xp96f") pod "76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" (UID: "76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5"). InnerVolumeSpecName "kube-api-access-xp96f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.931337 4756 scope.go:117] "RemoveContainer" containerID="7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759" Dec 05 19:09:58 crc kubenswrapper[4756]: I1205 19:09:58.949543 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" (UID: "76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.014433 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.014463 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.014475 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp96f\" (UniqueName: \"kubernetes.io/projected/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5-kube-api-access-xp96f\") on node \"crc\" DevicePath \"\"" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.026012 4756 scope.go:117] "RemoveContainer" containerID="ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7" Dec 05 19:09:59 crc kubenswrapper[4756]: E1205 19:09:59.026475 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7\": container with ID starting with ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7 not found: ID does not exist" containerID="ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.026521 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7"} err="failed to get container status \"ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7\": rpc error: code = NotFound desc = could not find container \"ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7\": container with ID starting with ed897b5b18ed702ae436f8c159244983809b84d77f2fe6d89acbe6ccc313c2c7 not found: ID does not exist" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.026555 4756 scope.go:117] "RemoveContainer" containerID="566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882" Dec 05 19:09:59 crc kubenswrapper[4756]: E1205 19:09:59.026873 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882\": container with ID starting with 566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882 not found: ID does not exist" containerID="566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.026895 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882"} err="failed to get container status \"566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882\": rpc error: code = NotFound desc = could not find container \"566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882\": container with ID starting with 566d0eeec0ef6ad5daf229856b1190f1c78cd7da29939ab1673c6baf2fed5882 not found: ID does not exist" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.026907 4756 scope.go:117] "RemoveContainer" containerID="7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759" Dec 05 19:09:59 crc kubenswrapper[4756]: E1205 19:09:59.027220 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759\": container with ID starting with 7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759 not found: ID does not exist" containerID="7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.027252 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759"} err="failed to get container status \"7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759\": rpc error: code = NotFound desc = could not find container \"7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759\": container with ID starting with 7b6844bf860bbc02bf307d5cb0bf22f0c6a56d68cbf26809474486707d5dd759 not found: ID does not exist" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.868604 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jwkn" Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.902868 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jwkn"] Dec 05 19:09:59 crc kubenswrapper[4756]: I1205 19:09:59.918612 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jwkn"] Dec 05 19:10:00 crc kubenswrapper[4756]: I1205 19:10:00.061830 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" path="/var/lib/kubelet/pods/76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5/volumes" Dec 05 19:10:22 crc kubenswrapper[4756]: I1205 19:10:22.296178 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:10:22 crc kubenswrapper[4756]: I1205 19:10:22.297262 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:10:52 crc kubenswrapper[4756]: I1205 19:10:52.295809 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:10:52 crc kubenswrapper[4756]: I1205 19:10:52.296757 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:11:05 crc kubenswrapper[4756]: I1205 19:11:05.695261 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2421f4e9-7c0c-4600-bc22-54c6802397b2/init-config-reloader/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.005396 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2421f4e9-7c0c-4600-bc22-54c6802397b2/init-config-reloader/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.007608 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2421f4e9-7c0c-4600-bc22-54c6802397b2/config-reloader/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.042364 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2421f4e9-7c0c-4600-bc22-54c6802397b2/alertmanager/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.215357 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46fbc0c7-e619-4b43-bd9f-1af28c6321b5/aodh-api/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.282187 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46fbc0c7-e619-4b43-bd9f-1af28c6321b5/aodh-evaluator/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.402964 4756 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46fbc0c7-e619-4b43-bd9f-1af28c6321b5/aodh-listener/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.498643 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d79f4c8f4-xpgcm_7fbcc7b8-bcb8-4bba-9302-d22bc187e559/barbican-api/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.506384 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46fbc0c7-e619-4b43-bd9f-1af28c6321b5/aodh-notifier/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.606513 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d79f4c8f4-xpgcm_7fbcc7b8-bcb8-4bba-9302-d22bc187e559/barbican-api-log/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.719519 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c694485c4-5xm56_c2add0aa-4a0d-464f-8ce0-c51a983392bd/barbican-keystone-listener/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.767746 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c694485c4-5xm56_c2add0aa-4a0d-464f-8ce0-c51a983392bd/barbican-keystone-listener-log/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.953457 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-746459ccc9-jrtlj_93adb284-0238-4295-b819-3d51b171ee6a/barbican-worker-log/0.log" Dec 05 19:11:06 crc kubenswrapper[4756]: I1205 19:11:06.971871 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-746459ccc9-jrtlj_93adb284-0238-4295-b819-3d51b171ee6a/barbican-worker/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.066804 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-59lvc_c2204483-1509-4302-84cc-15270057dda4/bootstrap-openstack-openstack-cell1/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.204226 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_40ae1a02-0032-4316-8fa4-826e47163b78/ceilometer-central-agent/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.246042 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_40ae1a02-0032-4316-8fa4-826e47163b78/ceilometer-notification-agent/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.324777 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_40ae1a02-0032-4316-8fa4-826e47163b78/proxy-httpd/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.444774 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_40ae1a02-0032-4316-8fa4-826e47163b78/sg-core/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.558578 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_48912894-5075-4468-b597-ea47a1099fbb/cinder-api/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.560131 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_48912894-5075-4468-b597-ea47a1099fbb/cinder-api-log/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.793461 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cf6a65a9-f08a-4a67-b139-099756a8563b/cinder-scheduler/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 
19:11:07.896271 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cf6a65a9-f08a-4a67-b139-099756a8563b/probe/0.log" Dec 05 19:11:07 crc kubenswrapper[4756]: I1205 19:11:07.922474 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-fgh87_454b8b51-3072-47be-be52-4665332c05ee/configure-network-openstack-openstack-cell1/0.log" Dec 05 19:11:08 crc kubenswrapper[4756]: I1205 19:11:08.088362 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-wkhtn_6261e620-19f8-4624-87eb-97a8fddd6283/configure-os-openstack-openstack-cell1/0.log" Dec 05 19:11:08 crc kubenswrapper[4756]: I1205 19:11:08.155113 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c97695b55-7nkqx_a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e/init/0.log" Dec 05 19:11:08 crc kubenswrapper[4756]: I1205 19:11:08.322872 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c97695b55-7nkqx_a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e/init/0.log" Dec 05 19:11:08 crc kubenswrapper[4756]: I1205 19:11:08.348873 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7c97695b55-7nkqx_a3cff30f-57b2-4b5d-a1fc-8e8d40f2630e/dnsmasq-dns/0.log" Dec 05 19:11:08 crc kubenswrapper[4756]: I1205 19:11:08.411320 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-sz96x_8232bee5-aac4-45a3-8321-8fb9161c6566/download-cache-openstack-openstack-cell1/0.log" Dec 05 19:11:09 crc kubenswrapper[4756]: I1205 19:11:09.362299 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_60360fa1-9e1f-4cb0-b41d-1f4701bef286/glance-httpd/0.log" Dec 05 19:11:09 crc kubenswrapper[4756]: I1205 19:11:09.402718 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_60360fa1-9e1f-4cb0-b41d-1f4701bef286/glance-log/0.log" Dec 05 19:11:09 crc kubenswrapper[4756]: I1205 19:11:09.463564 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c997de36-5bb1-4e49-860f-b1e90480b111/glance-httpd/0.log" Dec 05 19:11:09 crc kubenswrapper[4756]: I1205 19:11:09.564242 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c997de36-5bb1-4e49-860f-b1e90480b111/glance-log/0.log" Dec 05 19:11:09 crc kubenswrapper[4756]: I1205 19:11:09.911480 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-c5f6496c-9fc66_ee77eb77-55a3-42e6-bec8-3c0203881b39/heat-engine/0.log" Dec 05 19:11:10 crc kubenswrapper[4756]: I1205 19:11:10.244938 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6599c94766-cv4cn_6d180dae-962c-4f30-8d6e-49ba2182673e/horizon/0.log" Dec 05 19:11:10 crc kubenswrapper[4756]: I1205 19:11:10.294736 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-747f9c6d84-vtx8n_7438581b-1216-4129-b06f-953462d6572b/heat-api/0.log" Dec 05 19:11:10 crc kubenswrapper[4756]: I1205 19:11:10.422375 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-647989cdb6-nq2w9_74c364df-da60-4faf-a193-eb859f541a7d/heat-cfnapi/0.log" Dec 05 19:11:10 crc kubenswrapper[4756]: I1205 19:11:10.446447 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-k4w9p_80f46aab-07ca-4d94-be4b-87d3598beb35/install-certs-openstack-openstack-cell1/0.log" Dec 05 19:11:10 crc kubenswrapper[4756]: I1205 19:11:10.511605 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-pqvr5_ad9215e6-b603-4771-b2a8-962d2160e2f6/install-os-openstack-openstack-cell1/0.log" Dec 05 19:11:10 crc kubenswrapper[4756]: I1205 19:11:10.767627 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6599c94766-cv4cn_6d180dae-962c-4f30-8d6e-49ba2182673e/horizon-log/0.log" Dec 05 19:11:11 crc kubenswrapper[4756]: I1205 19:11:11.219430 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29415961-hgbt4_3faa564c-7313-40f9-9cad-ed666212dca9/keystone-cron/0.log" Dec 05 19:11:11 crc kubenswrapper[4756]: I1205 19:11:11.282323 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-667b884ddd-gsppx_14f3c83c-c658-4f94-8ec6-dfd4916be5c1/keystone-api/0.log" Dec 05 19:11:11 crc kubenswrapper[4756]: I1205 19:11:11.629121 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_468d379d-4396-4212-8a9b-20e95ec098dd/kube-state-metrics/0.log" Dec 05 19:11:11 crc kubenswrapper[4756]: I1205 19:11:11.629829 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416021-s4hr4_ecde8bb8-34a3-4765-9bf2-0522fd81c49d/keystone-cron/0.log" Dec 05 19:11:11 crc kubenswrapper[4756]: I1205 19:11:11.647665 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-wxn8n_d7987f1c-bd96-4436-9fc1-505da115c1ae/libvirt-openstack-openstack-cell1/0.log" Dec 05 19:11:11 crc kubenswrapper[4756]: I1205 19:11:11.830230 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_a7d9dfae-05d0-49c7-85d5-360812e2abb5/adoption/0.log" Dec 05 19:11:12 crc kubenswrapper[4756]: I1205 19:11:12.186595 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db4968d85-bhsbf_f15acbfb-ac97-4d70-8a46-3d08cf7a53c2/neutron-httpd/0.log" Dec 05 19:11:12 crc kubenswrapper[4756]: I1205 19:11:12.194096 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db4968d85-bhsbf_f15acbfb-ac97-4d70-8a46-3d08cf7a53c2/neutron-api/0.log" Dec 05 19:11:12 crc kubenswrapper[4756]: I1205 19:11:12.348688 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-djz95_268ef7e0-0883-4119-a1cd-e1e9f9d061fe/neutron-metadata-openstack-openstack-cell1/0.log" Dec 05 19:11:12 crc kubenswrapper[4756]: I1205 19:11:12.511106 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-5d6wl_03be6150-fc37-4e08-979b-2806de3bf91b/neutron-sriov-openstack-openstack-cell1/0.log" Dec 05 19:11:12 crc kubenswrapper[4756]: I1205 19:11:12.847673 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f028654a-21d7-4fa5-b313-bf4e418a5420/nova-api-log/0.log" Dec 05 19:11:12 crc kubenswrapper[4756]: I1205 19:11:12.875684 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f028654a-21d7-4fa5-b313-bf4e418a5420/nova-api-api/0.log" Dec 05 19:11:12 crc kubenswrapper[4756]: I1205 19:11:12.930400 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_b4204540-a721-43ea-bc43-b39baf51e18b/nova-cell0-conductor-conductor/0.log" Dec 05 19:11:13 crc kubenswrapper[4756]: I1205 19:11:13.143797 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4056238e-c76d-4c30-af9b-62a5feb11624/nova-cell1-conductor-conductor/0.log" Dec 05 19:11:13 crc kubenswrapper[4756]: I1205 19:11:13.326247 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2151170c-3a66-44a7-9840-b1f78aba5db4/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 19:11:13 crc kubenswrapper[4756]: I1205 19:11:13.474749 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell289vq_dcacd5e8-09f7-447e-8469-6a1462c5916c/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 05 19:11:13 crc kubenswrapper[4756]: I1205 19:11:13.671179 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-hzf9c_98a7241c-4bda-4548-810c-7aadfc51ebf1/nova-cell1-openstack-openstack-cell1/0.log" Dec 05 19:11:13 crc kubenswrapper[4756]: I1205 19:11:13.776658 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_14e7e8b8-2f32-4db3-a86c-a7a3abe55f93/nova-metadata-log/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.048462 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_280ceb24-00a7-4a73-aa0a-5ee4bc7d62ba/nova-scheduler-scheduler/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.128475 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-854956d6cd-q45tl_bbec8eb6-ea30-43de-b333-4a650978d5dd/init/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.291663 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_14e7e8b8-2f32-4db3-a86c-a7a3abe55f93/nova-metadata-metadata/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.345923 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-854956d6cd-q45tl_bbec8eb6-ea30-43de-b333-4a650978d5dd/init/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.371341 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-854956d6cd-q45tl_bbec8eb6-ea30-43de-b333-4a650978d5dd/octavia-api-provider-agent/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.569826 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jjj9n_57852886-8c8f-4e90-b7bd-0bc1c7ffc235/init/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.607208 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-854956d6cd-q45tl_bbec8eb6-ea30-43de-b333-4a650978d5dd/octavia-api/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.814156 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-9rjkh_4c3a5437-0d84-49fd-99ba-2c3d14d15b40/init/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.859601 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jjj9n_57852886-8c8f-4e90-b7bd-0bc1c7ffc235/init/0.log" Dec 05 19:11:14 crc kubenswrapper[4756]: I1205 19:11:14.939048 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-healthmanager-jjj9n_57852886-8c8f-4e90-b7bd-0bc1c7ffc235/octavia-healthmanager/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.104794 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-9rjkh_4c3a5437-0d84-49fd-99ba-2c3d14d15b40/init/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.114984 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-9rjkh_4c3a5437-0d84-49fd-99ba-2c3d14d15b40/octavia-housekeeping/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.175955 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-l4hvk_dbdeb570-6b89-46ff-9f82-9c70fb24bc44/init/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.475985 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-l4hvk_dbdeb570-6b89-46ff-9f82-9c70fb24bc44/init/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.571998 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-l4hvk_dbdeb570-6b89-46ff-9f82-9c70fb24bc44/octavia-amphora-httpd/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.576385 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-nwvnw_4bc062ab-b968-48e4-b201-b320bf7af1c2/init/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.839137 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-nwvnw_4bc062ab-b968-48e4-b201-b320bf7af1c2/init/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.857810 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-nwvnw_4bc062ab-b968-48e4-b201-b320bf7af1c2/octavia-rsyslog/0.log" Dec 05 19:11:15 crc kubenswrapper[4756]: I1205 19:11:15.929568 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-tl26l_f1a1e01d-72dd-43eb-912c-dab63cca1cfb/init/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.113253 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-tl26l_f1a1e01d-72dd-43eb-912c-dab63cca1cfb/init/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.247868 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-tl26l_f1a1e01d-72dd-43eb-912c-dab63cca1cfb/octavia-worker/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.279121 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dbd67003-7ac6-402a-84ff-91aa8e5c7b37/mysql-bootstrap/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.442808 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dbd67003-7ac6-402a-84ff-91aa8e5c7b37/mysql-bootstrap/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.511618 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dbd67003-7ac6-402a-84ff-91aa8e5c7b37/galera/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.600935 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_969a4b86-3398-4066-9515-34123347f863/mysql-bootstrap/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.802742 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_969a4b86-3398-4066-9515-34123347f863/mysql-bootstrap/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.822768 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_59b410c3-f27c-4aba-a63b-28285991c0f8/openstackclient/0.log" Dec 05 19:11:16 crc kubenswrapper[4756]: I1205 19:11:16.884268 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_969a4b86-3398-4066-9515-34123347f863/galera/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.040203 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-7drzk_484025cd-c3a0-46de-bc81-3508bbfdfd9e/ovn-controller/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.139822 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l5wr2_d7a1031b-d48b-4397-a5ff-63ee325b6e7a/openstack-network-exporter/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.399670 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8ljss_3bda222f-17e6-4b62-b1fe-2de61fa33be3/ovsdb-server-init/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.558446 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8ljss_3bda222f-17e6-4b62-b1fe-2de61fa33be3/ovs-vswitchd/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.570492 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8ljss_3bda222f-17e6-4b62-b1fe-2de61fa33be3/ovsdb-server/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.624037 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8ljss_3bda222f-17e6-4b62-b1fe-2de61fa33be3/ovsdb-server-init/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.825038 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_3772b5ca-1c29-49df-9dd8-6f15da9484a5/adoption/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.830876 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41/openstack-network-exporter/0.log" Dec 05 19:11:17 crc kubenswrapper[4756]: I1205 19:11:17.954904 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_94ce1a20-3ea1-40b7-aa72-2e25d6e4bd41/ovn-northd/0.log" Dec 05 19:11:18 crc kubenswrapper[4756]: I1205 19:11:18.096098 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-gtpb9_47d509cd-7752-45f8-b9f0-7dd6cf3e0311/ovn-openstack-openstack-cell1/0.log" Dec 05 19:11:18 crc kubenswrapper[4756]: I1205 19:11:18.197832 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2dfcfe1d-4220-400b-83fd-f35c88cf6fa0/openstack-network-exporter/0.log" Dec 05 19:11:18 crc kubenswrapper[4756]: I1205 19:11:18.305641 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2dfcfe1d-4220-400b-83fd-f35c88cf6fa0/ovsdbserver-nb/0.log" Dec 05 19:11:18 crc kubenswrapper[4756]: I1205 19:11:18.351080 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_81682d31-ae36-4120-ab44-7f69483622d1/openstack-network-exporter/0.log" Dec 05 19:11:18 crc kubenswrapper[4756]: I1205 19:11:18.426040 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_81682d31-ae36-4120-ab44-7f69483622d1/ovsdbserver-nb/0.log" Dec 05 19:11:18 crc kubenswrapper[4756]: I1205 19:11:18.574839 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f55cd61d-b3a2-456f-8b79-0d207405dd13/openstack-network-exporter/0.log" Dec 05 19:11:18 crc kubenswrapper[4756]: I1205 19:11:18.640434 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f55cd61d-b3a2-456f-8b79-0d207405dd13/ovsdbserver-nb/0.log" Dec 05 19:11:19 crc kubenswrapper[4756]: I1205 19:11:19.201241 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_554b5c45-8327-4229-8c3e-294407a96bed/openstack-network-exporter/0.log" Dec 05 19:11:19 crc kubenswrapper[4756]: I1205 19:11:19.243771 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_39409198-58b6-4300-a65c-911eb49c7455/openstack-network-exporter/0.log" Dec 05 19:11:19 crc kubenswrapper[4756]: I1205 19:11:19.287670 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_554b5c45-8327-4229-8c3e-294407a96bed/ovsdbserver-sb/0.log" Dec 05 19:11:19 crc kubenswrapper[4756]: I1205 19:11:19.503834 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_39409198-58b6-4300-a65c-911eb49c7455/ovsdbserver-sb/0.log" Dec 05 19:11:19 crc kubenswrapper[4756]: I1205 19:11:19.563656 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_b7dfdb26-c3f2-4176-a880-f526a052983c/openstack-network-exporter/0.log" Dec 05 19:11:19 crc kubenswrapper[4756]: I1205 19:11:19.572358 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_b7dfdb26-c3f2-4176-a880-f526a052983c/ovsdbserver-sb/0.log" Dec 05 19:11:19 crc kubenswrapper[4756]: I1205 19:11:19.813269 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-548487ccbd-m5m7w_f9ee4748-c8fd-47f2-95b2-18c5db21973f/placement-api/0.log" Dec 05 19:11:19 crc kubenswrapper[4756]: I1205 19:11:19.895856 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-548487ccbd-m5m7w_f9ee4748-c8fd-47f2-95b2-18c5db21973f/placement-log/0.log" Dec 05 19:11:20 crc kubenswrapper[4756]: I1205 19:11:20.182501 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c6swp4_5870dc85-1351-4505-b996-5d5b65eee0ee/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 05 19:11:20 crc kubenswrapper[4756]: I1205 19:11:20.234984 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_347732e6-8a29-4364-97a7-b6a44556fe88/init-config-reloader/0.log" Dec 05 19:11:21 crc kubenswrapper[4756]: I1205 19:11:21.437163 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_347732e6-8a29-4364-97a7-b6a44556fe88/init-config-reloader/0.log" Dec 05 19:11:21 crc kubenswrapper[4756]: I1205 19:11:21.449397 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_347732e6-8a29-4364-97a7-b6a44556fe88/prometheus/0.log" Dec 05 19:11:21 crc kubenswrapper[4756]: I1205 19:11:21.468174 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_347732e6-8a29-4364-97a7-b6a44556fe88/thanos-sidecar/0.log" Dec 05 19:11:21 crc kubenswrapper[4756]: 
I1205 19:11:21.515523 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_347732e6-8a29-4364-97a7-b6a44556fe88/config-reloader/0.log" Dec 05 19:11:21 crc kubenswrapper[4756]: I1205 19:11:21.740728 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f2f77107-b6d1-42d0-afbd-482dfefc3524/setup-container/0.log" Dec 05 19:11:21 crc kubenswrapper[4756]: I1205 19:11:21.878876 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f2f77107-b6d1-42d0-afbd-482dfefc3524/setup-container/0.log" Dec 05 19:11:21 crc kubenswrapper[4756]: I1205 19:11:21.900478 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f2f77107-b6d1-42d0-afbd-482dfefc3524/rabbitmq/0.log" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.019091 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc/setup-container/0.log" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.245627 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc/setup-container/0.log" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.294884 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.294937 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.294976 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.295671 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b79640ea6e442aff2813d6696ab699c40b5373571ba83575e20b9a20b3e543c"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.295725 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://5b79640ea6e442aff2813d6696ab699c40b5373571ba83575e20b9a20b3e543c" gracePeriod=600 Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.326049 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-hw8vv_4212fd48-503e-40ea-9f8b-efb0cc7d999f/reboot-os-openstack-openstack-cell1/0.log" Dec 05 19:11:22 crc kubenswrapper[4756]: E1205 19:11:22.517515 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2067f3da_c088_40d8_bc08_1eb971992c56.slice/crio-5b79640ea6e442aff2813d6696ab699c40b5373571ba83575e20b9a20b3e543c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2067f3da_c088_40d8_bc08_1eb971992c56.slice/crio-conmon-5b79640ea6e442aff2813d6696ab699c40b5373571ba83575e20b9a20b3e543c.scope\": RecentStats: unable to find data in memory cache]" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.562351 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-4bxkw_5b789592-0645-42b6-ae30-a9eb32ed067e/run-os-openstack-openstack-cell1/0.log" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.813326 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-hdbm2_06709cc8-4a67-48a2-906a-9388a9f0d56e/ssh-known-hosts-openstack/0.log" Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.925999 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="5b79640ea6e442aff2813d6696ab699c40b5373571ba83575e20b9a20b3e543c" exitCode=0 Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.926048 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"5b79640ea6e442aff2813d6696ab699c40b5373571ba83575e20b9a20b3e543c"} Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.926080 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547"} Dec 05 19:11:22 crc kubenswrapper[4756]: I1205 19:11:22.926094 4756 scope.go:117] "RemoveContainer" containerID="de32e6eef8a3a3e91095a3e009dc8ee00a9b2eb9eda008ae6ceb187ef7b761eb" Dec 05 19:11:23 crc kubenswrapper[4756]: I1205 19:11:23.049992 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-99cc989cd-9j2nn_37480e17-7f37-4df4-9587-3933a1fc902b/proxy-server/0.log" Dec 05 19:11:23 crc kubenswrapper[4756]: I1205 19:11:23.127034 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b3ed154-90e5-45a0-9d4d-c53aa7f6bcfc/rabbitmq/0.log" Dec 05 19:11:23 crc kubenswrapper[4756]: I1205 19:11:23.200829 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-99cc989cd-9j2nn_37480e17-7f37-4df4-9587-3933a1fc902b/proxy-httpd/0.log" Dec 05 19:11:23 crc kubenswrapper[4756]: I1205 19:11:23.246277 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2b7k7_7e02404d-5bef-4477-a4c9-b2d3992f0136/swift-ring-rebalance/0.log" Dec 05 19:11:23 crc kubenswrapper[4756]: I1205 19:11:23.375009 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-p9gcj_2ea99c30-bc6e-47b2-9ace-f7d82418e729/telemetry-openstack-openstack-cell1/0.log" Dec 05 19:11:23 crc kubenswrapper[4756]: I1205 19:11:23.586966 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-47j8v_9f633ab6-c240-4b57-8941-d8dc399f4a84/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 05 19:11:23 crc kubenswrapper[4756]: I1205 
19:11:23.595235 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-6cshr_70125fdc-4134-4348-9b99-63b2893674e1/validate-network-openstack-openstack-cell1/0.log" Dec 05 19:11:24 crc kubenswrapper[4756]: I1205 19:11:24.908591 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f5f97483-1670-469a-afec-002e63d0de2f/memcached/0.log" Dec 05 19:11:55 crc kubenswrapper[4756]: I1205 19:11:55.764260 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws_3356d953-464c-40f6-bc07-8534d8a8976e/util/0.log" Dec 05 19:11:55 crc kubenswrapper[4756]: I1205 19:11:55.947741 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws_3356d953-464c-40f6-bc07-8534d8a8976e/util/0.log" Dec 05 19:11:55 crc kubenswrapper[4756]: I1205 19:11:55.983775 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws_3356d953-464c-40f6-bc07-8534d8a8976e/pull/0.log" Dec 05 19:11:55 crc kubenswrapper[4756]: I1205 19:11:55.999740 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws_3356d953-464c-40f6-bc07-8534d8a8976e/pull/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.186109 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws_3356d953-464c-40f6-bc07-8534d8a8976e/extract/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.186725 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws_3356d953-464c-40f6-bc07-8534d8a8976e/pull/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.188962 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eaftmvws_3356d953-464c-40f6-bc07-8534d8a8976e/util/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.423113 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-r6xbc_6013b2f1-3b04-47f0-bb2c-f741587c8efc/kube-rbac-proxy/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.439426 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-r7qh7_9354b938-780a-4431-a060-48ad59f1bbec/kube-rbac-proxy/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.502208 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-r6xbc_6013b2f1-3b04-47f0-bb2c-f741587c8efc/manager/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.643526 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-f65kp_ced54d49-8a17-4bba-a783-c0f69ed32a7a/kube-rbac-proxy/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.672455 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-r7qh7_9354b938-780a-4431-a060-48ad59f1bbec/manager/0.log" Dec 05 19:11:56 crc 
kubenswrapper[4756]: I1205 19:11:56.750804 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-f65kp_ced54d49-8a17-4bba-a783-c0f69ed32a7a/manager/0.log" Dec 05 19:11:56 crc kubenswrapper[4756]: I1205 19:11:56.852527 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-7qxl4_626bcac2-4a19-4f34-83be-1f7261f44161/kube-rbac-proxy/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.010683 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-7qxl4_626bcac2-4a19-4f34-83be-1f7261f44161/manager/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.083713 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rm4xp_bc274847-f26a-4043-86f6-60af2577a321/manager/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.104520 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rm4xp_bc274847-f26a-4043-86f6-60af2577a321/kube-rbac-proxy/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.193961 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-njt8n_b7a067e2-8f42-4eec-9b62-623498c27f31/kube-rbac-proxy/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.315572 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-njt8n_b7a067e2-8f42-4eec-9b62-623498c27f31/manager/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.449434 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-6xbpf_b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38/kube-rbac-proxy/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.495188 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-kbg5m_2afcdd94-bf46-446a-b9d1-ff49260594d9/kube-rbac-proxy/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.648522 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-kbg5m_2afcdd94-bf46-446a-b9d1-ff49260594d9/manager/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.733216 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fcwhr_95cbe112-4bfb-463a-99d3-15283473d2c6/kube-rbac-proxy/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.834152 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-6xbpf_b8e9cd2a-e3bd-4fd4-b4d7-f84806772b38/manager/0.log" Dec 05 19:11:57 crc kubenswrapper[4756]: I1205 19:11:57.979022 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-m8qwl_ccaea4ee-e7e9-451f-a1e3-fef7402f356b/kube-rbac-proxy/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.025019 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fcwhr_95cbe112-4bfb-463a-99d3-15283473d2c6/manager/0.log" 
Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.067318 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-m8qwl_ccaea4ee-e7e9-451f-a1e3-fef7402f356b/manager/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.179159 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-9lzlc_482f7c65-7376-4794-9308-062ed6209299/kube-rbac-proxy/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.232715 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-9lzlc_482f7c65-7376-4794-9308-062ed6209299/manager/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.384792 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-25dl4_156f45a8-a698-4f99-93c3-d8cc8a40d14f/kube-rbac-proxy/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.492665 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-25dl4_156f45a8-a698-4f99-93c3-d8cc8a40d14f/manager/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.535033 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-pvn24_79ee86d5-861a-4a03-b57a-9b382be05361/kube-rbac-proxy/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.730714 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-qlbvt_a6494f1e-bf6a-469e-a9ab-fdc9211668c9/kube-rbac-proxy/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.796648 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-qlbvt_a6494f1e-bf6a-469e-a9ab-fdc9211668c9/manager/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.811957 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-pvn24_79ee86d5-861a-4a03-b57a-9b382be05361/manager/0.log" Dec 05 19:11:58 crc kubenswrapper[4756]: I1205 19:11:58.949015 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f5k5nb8_94e40b0c-5c93-4c26-b9c1-3c9240a8a44e/kube-rbac-proxy/0.log" Dec 05 19:11:59 crc kubenswrapper[4756]: I1205 19:11:58.999974 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55c85496f5k5nb8_94e40b0c-5c93-4c26-b9c1-3c9240a8a44e/manager/0.log" Dec 05 19:11:59 crc kubenswrapper[4756]: I1205 19:11:59.377724 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55b6fb9447-djrcs_fa646207-9ada-44e6-a096-f15872c59f7c/operator/0.log" Dec 05 19:11:59 crc kubenswrapper[4756]: I1205 19:11:59.455850 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-dnrr5_6953530f-9e26-4c39-b015-bdadcf81aca2/kube-rbac-proxy/0.log" Dec 05 19:11:59 crc kubenswrapper[4756]: I1205 19:11:59.486508 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-mq7cd_7402cf0a-a645-47dc-8f6f-6d9c833a9f8b/registry-server/0.log" Dec 05 19:11:59 crc kubenswrapper[4756]: I1205 19:11:59.642344 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-qrkfp_e9fe3d7c-713c-4b55-9775-c557846cae41/kube-rbac-proxy/0.log" Dec 05 19:11:59 crc kubenswrapper[4756]: I1205 19:11:59.696100 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-dnrr5_6953530f-9e26-4c39-b015-bdadcf81aca2/manager/0.log" Dec 05 19:11:59 crc kubenswrapper[4756]: I1205 19:11:59.781391 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-qrkfp_e9fe3d7c-713c-4b55-9775-c557846cae41/manager/0.log" Dec 05 19:11:59 crc kubenswrapper[4756]: I1205 19:11:59.921652 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-l5gdd_dd84c3d4-9a84-4418-8950-c550a63f018a/operator/0.log" Dec 05 19:12:00 crc kubenswrapper[4756]: I1205 19:12:00.002041 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-c8h5n_fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d/kube-rbac-proxy/0.log" Dec 05 19:12:00 crc kubenswrapper[4756]: I1205 19:12:00.069691 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-c8h5n_fd94fbcc-a2d3-423f-87fd-57b6e09d9b3d/manager/0.log" Dec 05 19:12:00 crc kubenswrapper[4756]: I1205 19:12:00.126149 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2x8ln_fd6cc312-becc-437b-bfd6-e461f3ae0912/kube-rbac-proxy/0.log" Dec 05 19:12:00 crc kubenswrapper[4756]: I1205 19:12:00.331876 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-22zvw_76126169-0993-4057-9698-56bb61a8872f/kube-rbac-proxy/0.log" Dec 05 19:12:00 crc kubenswrapper[4756]: I1205 19:12:00.452911 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-22zvw_76126169-0993-4057-9698-56bb61a8872f/manager/0.log" Dec 05 19:12:00 crc kubenswrapper[4756]: I1205 19:12:00.501882 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2x8ln_fd6cc312-becc-437b-bfd6-e461f3ae0912/manager/0.log" Dec 05 19:12:00 crc kubenswrapper[4756]: I1205 19:12:00.558625 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-rdp64_e92435ef-34fe-44e9-bf45-7900fed5f199/kube-rbac-proxy/0.log" Dec 05 19:12:00 crc kubenswrapper[4756]: I1205 19:12:00.666644 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-rdp64_e92435ef-34fe-44e9-bf45-7900fed5f199/manager/0.log" Dec 05 19:12:01 crc kubenswrapper[4756]: I1205 19:12:01.462650 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54bdf956c4-b8ghg_f5709380-2e15-4b5b-94a0-196fcdfc73d3/manager/0.log" Dec 05 19:12:21 crc kubenswrapper[4756]: I1205 19:12:21.842349 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-67x7d_996b39af-143d-4f4a-affc-b726debbfb0c/control-plane-machine-set-operator/0.log" Dec 05 19:12:22 crc kubenswrapper[4756]: I1205 19:12:22.005988 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mlthg_cba515e6-fcd1-4310-9630-70ca7c90c213/kube-rbac-proxy/0.log" Dec 05 19:12:22 crc kubenswrapper[4756]: I1205 19:12:22.036020 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mlthg_cba515e6-fcd1-4310-9630-70ca7c90c213/machine-api-operator/0.log" Dec 05 19:12:38 crc kubenswrapper[4756]: I1205 19:12:38.187517 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-674nf_9cfc35ef-5ac8-4f3e-9713-7fdabcca35d6/cert-manager-controller/0.log" Dec 05 19:12:38 crc kubenswrapper[4756]: I1205 19:12:38.332946 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-vv2kx_dab2fb4f-8a5a-473b-b211-4459f623785f/cert-manager-cainjector/0.log" Dec 05 19:12:38 crc kubenswrapper[4756]: I1205 19:12:38.358001 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-f455g_bd258cbd-151e-4165-b19d-9c5cc22e9f97/cert-manager-webhook/0.log" Dec 05 19:12:53 crc kubenswrapper[4756]: I1205 19:12:53.787760 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-pqqkr_f2ec05fa-c742-4ba4-830e-571af72430c0/nmstate-console-plugin/0.log" Dec 05 19:12:54 crc kubenswrapper[4756]: I1205 19:12:54.003816 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sfl86_c9f8a111-e6d4-440a-a3d4-4f3010304ba9/nmstate-handler/0.log" Dec 05 19:12:54 crc kubenswrapper[4756]: I1205 19:12:54.008318 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-jbg86_92193823-fb9a-4291-a9cd-5d25f4a566cc/kube-rbac-proxy/0.log" Dec 05 19:12:54 crc kubenswrapper[4756]: I1205 19:12:54.123817 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-jbg86_92193823-fb9a-4291-a9cd-5d25f4a566cc/nmstate-metrics/0.log" Dec 05 19:12:54 crc kubenswrapper[4756]: I1205 19:12:54.213711 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-jpztz_2c336167-a213-4d0d-a396-512a7a80deda/nmstate-operator/0.log" Dec 05 19:12:54 crc kubenswrapper[4756]: I1205 19:12:54.317198 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-z2lv4_81bd8cf2-cb7b-4039-9a4d-fa57e5200d7a/nmstate-webhook/0.log" Dec 05 19:13:11 crc kubenswrapper[4756]: I1205 19:13:11.912243 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-65sww_7ed26790-e827-4c04-a6c1-3a6befe3f875/kube-rbac-proxy/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.158408 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-frr-files/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.268536 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-65sww_7ed26790-e827-4c04-a6c1-3a6befe3f875/controller/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 
19:13:12.291070 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-frr-files/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.358801 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-reloader/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.371968 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-metrics/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.464909 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-reloader/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.665542 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-frr-files/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.696544 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-reloader/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.704336 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-metrics/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.707260 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-metrics/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.884816 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-frr-files/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.918720 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-reloader/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.940585 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/controller/0.log" Dec 05 19:13:12 crc kubenswrapper[4756]: I1205 19:13:12.981391 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/cp-metrics/0.log" Dec 05 19:13:13 crc kubenswrapper[4756]: I1205 19:13:13.121547 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/frr-metrics/0.log" Dec 05 19:13:13 crc kubenswrapper[4756]: I1205 19:13:13.241251 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/kube-rbac-proxy/0.log" Dec 05 19:13:13 crc kubenswrapper[4756]: I1205 19:13:13.271699 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/kube-rbac-proxy-frr/0.log" Dec 05 19:13:13 crc kubenswrapper[4756]: I1205 19:13:13.333713 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/reloader/0.log" Dec 05 19:13:13 crc kubenswrapper[4756]: I1205 19:13:13.512638 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-kzhn8_c64b6bd4-0f30-4b95-8bac-8c302e466540/frr-k8s-webhook-server/0.log" Dec 05 19:13:13 crc kubenswrapper[4756]: I1205 19:13:13.745743 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-55964f8cbb-lhn6h_bd7e4822-2466-41c9-9544-7748e0f123c5/manager/0.log" Dec 05 19:13:13 crc kubenswrapper[4756]: I1205 19:13:13.823046 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-77b74c6747-tsvjw_9a4d60c6-7374-4425-aa47-cfe2b7265c65/webhook-server/0.log" Dec 05 19:13:14 crc kubenswrapper[4756]: I1205 19:13:14.018882 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mwwdp_9a2f231e-11fb-4b75-ba6b-35f59554bcbe/kube-rbac-proxy/0.log" Dec 05 19:13:15 crc kubenswrapper[4756]: I1205 19:13:15.079168 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mwwdp_9a2f231e-11fb-4b75-ba6b-35f59554bcbe/speaker/0.log" Dec 05 19:13:16 crc kubenswrapper[4756]: I1205 19:13:16.729782 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nnqf2_03cce283-77b4-4dbb-995a-a28cc39ae376/frr/0.log" Dec 05 19:13:22 crc kubenswrapper[4756]: I1205 19:13:22.295910 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:13:22 crc kubenswrapper[4756]: I1205 19:13:22.296583 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.029276 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67_31e5042c-37e3-444e-adfb-3c7f50096886/util/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.174301 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67_31e5042c-37e3-444e-adfb-3c7f50096886/util/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.230369 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67_31e5042c-37e3-444e-adfb-3c7f50096886/pull/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.238995 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67_31e5042c-37e3-444e-adfb-3c7f50096886/pull/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.415574 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67_31e5042c-37e3-444e-adfb-3c7f50096886/extract/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.446893 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67_31e5042c-37e3-444e-adfb-3c7f50096886/pull/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.464248 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a89r67_31e5042c-37e3-444e-adfb-3c7f50096886/util/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.603104 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5_68caf581-c3d0-49bb-9fff-c574269dbcf6/util/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.774493 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5_68caf581-c3d0-49bb-9fff-c574269dbcf6/pull/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.800276 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5_68caf581-c3d0-49bb-9fff-c574269dbcf6/util/0.log" Dec 05 19:13:31 crc kubenswrapper[4756]: I1205 19:13:31.832215 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5_68caf581-c3d0-49bb-9fff-c574269dbcf6/pull/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.065619 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5_68caf581-c3d0-49bb-9fff-c574269dbcf6/extract/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.080586 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5_68caf581-c3d0-49bb-9fff-c574269dbcf6/util/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.086634 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8qpr5_68caf581-c3d0-49bb-9fff-c574269dbcf6/pull/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.256200 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts_894452cc-5e36-4dd4-a985-fb42bab5ca7b/util/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.416866 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts_894452cc-5e36-4dd4-a985-fb42bab5ca7b/util/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.454669 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts_894452cc-5e36-4dd4-a985-fb42bab5ca7b/pull/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.478092 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts_894452cc-5e36-4dd4-a985-fb42bab5ca7b/pull/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.630076 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts_894452cc-5e36-4dd4-a985-fb42bab5ca7b/util/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.652865 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts_894452cc-5e36-4dd4-a985-fb42bab5ca7b/pull/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.715431 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210slqts_894452cc-5e36-4dd4-a985-fb42bab5ca7b/extract/0.log" Dec 05 19:13:32 crc kubenswrapper[4756]: I1205 19:13:32.803863 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk_c1e72e52-3661-4e92-93a2-a23afb4f2560/util/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.075914 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk_c1e72e52-3661-4e92-93a2-a23afb4f2560/util/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.091648 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk_c1e72e52-3661-4e92-93a2-a23afb4f2560/pull/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.097647 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk_c1e72e52-3661-4e92-93a2-a23afb4f2560/pull/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.292780 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk_c1e72e52-3661-4e92-93a2-a23afb4f2560/util/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.294725 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk_c1e72e52-3661-4e92-93a2-a23afb4f2560/extract/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.297371 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hfwkk_c1e72e52-3661-4e92-93a2-a23afb4f2560/pull/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.459276 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9shz_bf4e82d2-e343-4e98-8e73-80e04c22e903/extract-utilities/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.619850 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9shz_bf4e82d2-e343-4e98-8e73-80e04c22e903/extract-utilities/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.638511 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9shz_bf4e82d2-e343-4e98-8e73-80e04c22e903/extract-content/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.711087 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9shz_bf4e82d2-e343-4e98-8e73-80e04c22e903/extract-content/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.851033 4756 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9shz_bf4e82d2-e343-4e98-8e73-80e04c22e903/extract-utilities/0.log" Dec 05 19:13:33 crc kubenswrapper[4756]: I1205 19:13:33.860564 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9shz_bf4e82d2-e343-4e98-8e73-80e04c22e903/extract-content/0.log" Dec 05 19:13:34 crc kubenswrapper[4756]: I1205 19:13:34.097518 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f4fqj_7abe2354-3116-4515-82b1-fd87eb9368fd/extract-utilities/0.log" Dec 05 19:13:34 crc kubenswrapper[4756]: I1205 19:13:34.403795 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f4fqj_7abe2354-3116-4515-82b1-fd87eb9368fd/extract-utilities/0.log" Dec 05 19:13:34 crc kubenswrapper[4756]: I1205 19:13:34.439904 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f4fqj_7abe2354-3116-4515-82b1-fd87eb9368fd/extract-content/0.log" Dec 05 19:13:34 crc kubenswrapper[4756]: I1205 19:13:34.460540 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f4fqj_7abe2354-3116-4515-82b1-fd87eb9368fd/extract-content/0.log" Dec 05 19:13:34 crc kubenswrapper[4756]: I1205 19:13:34.746678 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f4fqj_7abe2354-3116-4515-82b1-fd87eb9368fd/extract-utilities/0.log" Dec 05 19:13:34 crc kubenswrapper[4756]: I1205 19:13:34.763839 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f4fqj_7abe2354-3116-4515-82b1-fd87eb9368fd/extract-content/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.110888 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9shz_bf4e82d2-e343-4e98-8e73-80e04c22e903/registry-server/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.113388 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fpgmr_6c445575-d157-46b0-a496-79e9dfc47c17/marketplace-operator/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.230107 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8tg4_1e18238d-9140-4421-a912-377c9c626b83/extract-utilities/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.363450 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f4fqj_7abe2354-3116-4515-82b1-fd87eb9368fd/registry-server/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.387863 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8tg4_1e18238d-9140-4421-a912-377c9c626b83/extract-utilities/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.417928 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8tg4_1e18238d-9140-4421-a912-377c9c626b83/extract-content/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.426658 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8tg4_1e18238d-9140-4421-a912-377c9c626b83/extract-content/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.618224 4756 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8tg4_1e18238d-9140-4421-a912-377c9c626b83/extract-content/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.655177 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8tg4_1e18238d-9140-4421-a912-377c9c626b83/extract-utilities/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.731028 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-trs6s_fd51e2e3-9449-49c9-9ac0-a8496f61265d/extract-utilities/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.879687 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-trs6s_fd51e2e3-9449-49c9-9ac0-a8496f61265d/extract-content/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.920648 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-trs6s_fd51e2e3-9449-49c9-9ac0-a8496f61265d/extract-content/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.922138 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8tg4_1e18238d-9140-4421-a912-377c9c626b83/registry-server/0.log" Dec 05 19:13:35 crc kubenswrapper[4756]: I1205 19:13:35.984746 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-trs6s_fd51e2e3-9449-49c9-9ac0-a8496f61265d/extract-utilities/0.log" Dec 05 19:13:36 crc kubenswrapper[4756]: I1205 19:13:36.147464 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-trs6s_fd51e2e3-9449-49c9-9ac0-a8496f61265d/extract-utilities/0.log" Dec 05 19:13:36 crc kubenswrapper[4756]: I1205 19:13:36.202495 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-trs6s_fd51e2e3-9449-49c9-9ac0-a8496f61265d/extract-content/0.log" Dec 05 19:13:37 crc kubenswrapper[4756]: I1205 19:13:37.377441 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-trs6s_fd51e2e3-9449-49c9-9ac0-a8496f61265d/registry-server/0.log" Dec 05 19:13:52 crc kubenswrapper[4756]: I1205 19:13:52.295497 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:13:52 crc kubenswrapper[4756]: I1205 19:13:52.296050 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:13:53 crc kubenswrapper[4756]: I1205 19:13:53.745550 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-4d4k8_f9df7e1a-4c29-43b1-b5fb-0a74fcd9acd7/prometheus-operator/0.log" Dec 05 19:13:53 crc kubenswrapper[4756]: I1205 19:13:53.959045 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bdc588f9-44cdj_d6b206ae-0b75-402a-aea8-f3d6eaa6a7e7/prometheus-operator-admission-webhook/0.log" Dec 05 19:13:53 crc 
kubenswrapper[4756]: I1205 19:13:53.990222 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bdc588f9-522g7_5399ecc6-c8a6-4fb3-812b-ac3fd080d6b2/prometheus-operator-admission-webhook/0.log" Dec 05 19:13:54 crc kubenswrapper[4756]: I1205 19:13:54.191540 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-mxdgp_a6174fba-31f8-4814-9c46-1ef40e855c39/perses-operator/0.log" Dec 05 19:13:54 crc kubenswrapper[4756]: I1205 19:13:54.197652 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-mxr46_2304ca15-1116-4f4e-ab8e-ab04484010a3/operator/0.log" Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.295673 4756 patch_prober.go:28] interesting pod/machine-config-daemon-zrvcn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.296384 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.296479 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.297734 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547"} pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.297841 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" containerName="machine-config-daemon" containerID="cri-o://dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" gracePeriod=600 Dec 05 19:14:22 crc kubenswrapper[4756]: E1205 19:14:22.427960 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.954671 4756 generic.go:334] "Generic (PLEG): container finished" podID="2067f3da-c088-40d8-bc08-1eb971992c56" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" exitCode=0 Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.954714 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
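The 19:14:22 sequence above is the third consecutive liveness failure (after 19:13:22 and 19:13:52, consistent with a 30-second probe period and a failure threshold of 3), at which point kubelet kills the container; because it keeps failing, restarts are then throttled by CrashLoopBackOff with the 5m0s cap named in the repeated "Error syncing pod" entries. A sketch of that delay schedule, assuming the documented kubelet defaults (10s initial backoff, doubling per restart, capped at five minutes), not code from kubelet itself:

    // Sketch of the CrashLoopBackOff delay schedule implied by
    // "back-off 5m0s": start at 10s, double per restart, cap at 5m.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second
        const maxDelay = 5 * time.Minute
        for restart := 1; restart <= 8; restart++ {
            fmt.Printf("restart %d: wait %v\n", restart, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay // prints 10s, 20s, 40s, 1m20s, 2m40s, then 5m0s repeating
            }
        }
    }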
event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerDied","Data":"dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547"} Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.955060 4756 scope.go:117] "RemoveContainer" containerID="5b79640ea6e442aff2813d6696ab699c40b5373571ba83575e20b9a20b3e543c" Dec 05 19:14:22 crc kubenswrapper[4756]: I1205 19:14:22.955969 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:14:22 crc kubenswrapper[4756]: E1205 19:14:22.956549 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:14:23 crc kubenswrapper[4756]: I1205 19:14:23.901257 4756 scope.go:117] "RemoveContainer" containerID="1abe787243546af159cf93c0d32003925e5d13038606d51f1771bf149624fd06" Dec 05 19:14:23 crc kubenswrapper[4756]: I1205 19:14:23.926426 4756 scope.go:117] "RemoveContainer" containerID="dfb3a8d403316fd57636a34bb44aa9287dd23e8df782d947e6bb5354d7b72ccd" Dec 05 19:14:24 crc kubenswrapper[4756]: I1205 19:14:24.579783 4756 scope.go:117] "RemoveContainer" containerID="262f2951748338577e925ae1c17d7e7877b89fa5e4fef32d1544b351f7bc3ac9" Dec 05 19:14:35 crc kubenswrapper[4756]: I1205 19:14:35.051584 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:14:35 crc kubenswrapper[4756]: E1205 19:14:35.052762 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:14:47 crc kubenswrapper[4756]: I1205 19:14:47.050470 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:14:47 crc kubenswrapper[4756]: E1205 19:14:47.051743 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.200472 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4"] Dec 05 19:15:00 crc kubenswrapper[4756]: E1205 19:15:00.201750 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerName="extract-utilities" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.201772 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerName="extract-utilities" Dec 05 19:15:00 crc kubenswrapper[4756]: E1205 19:15:00.201820 4756 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerName="registry-server" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.201834 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerName="registry-server" Dec 05 19:15:00 crc kubenswrapper[4756]: E1205 19:15:00.201871 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerName="extract-utilities" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.201885 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerName="extract-utilities" Dec 05 19:15:00 crc kubenswrapper[4756]: E1205 19:15:00.201918 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerName="registry-server" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.201931 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerName="registry-server" Dec 05 19:15:00 crc kubenswrapper[4756]: E1205 19:15:00.201969 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerName="extract-content" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.201982 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerName="extract-content" Dec 05 19:15:00 crc kubenswrapper[4756]: E1205 19:15:00.202006 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerName="extract-content" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.202019 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerName="extract-content" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.202454 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b75c6bf-d2f3-4f8c-972f-790f20da62b4" containerName="registry-server" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.202479 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d1c0bb-35e4-461d-bfc4-b52c43cd8cf5" containerName="registry-server" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.249786 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.261012 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.261136 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4"] Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.261206 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.281106 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8b4c8e-de00-4289-80b0-3f5469905f81-config-volume\") pod \"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.281309 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwrw2\" (UniqueName: \"kubernetes.io/projected/3f8b4c8e-de00-4289-80b0-3f5469905f81-kube-api-access-vwrw2\") pod \"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.281434 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8b4c8e-de00-4289-80b0-3f5469905f81-secret-volume\") pod \"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.383066 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwrw2\" (UniqueName: \"kubernetes.io/projected/3f8b4c8e-de00-4289-80b0-3f5469905f81-kube-api-access-vwrw2\") pod \"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.383437 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8b4c8e-de00-4289-80b0-3f5469905f81-secret-volume\") pod \"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.383610 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8b4c8e-de00-4289-80b0-3f5469905f81-config-volume\") pod \"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.384689 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8b4c8e-de00-4289-80b0-3f5469905f81-config-volume\") pod 
\"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.394941 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8b4c8e-de00-4289-80b0-3f5469905f81-secret-volume\") pod \"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.403588 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwrw2\" (UniqueName: \"kubernetes.io/projected/3f8b4c8e-de00-4289-80b0-3f5469905f81-kube-api-access-vwrw2\") pod \"collect-profiles-29416035-slwt4\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:00 crc kubenswrapper[4756]: I1205 19:15:00.592103 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:01 crc kubenswrapper[4756]: I1205 19:15:01.051695 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:15:01 crc kubenswrapper[4756]: E1205 19:15:01.052340 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:15:01 crc kubenswrapper[4756]: I1205 19:15:01.132458 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4"] Dec 05 19:15:01 crc kubenswrapper[4756]: W1205 19:15:01.143879 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f8b4c8e_de00_4289_80b0_3f5469905f81.slice/crio-b05de62580af97b86dbcdcb5b45e21d05ee104133419808aaab57cbcb6cbf995 WatchSource:0}: Error finding container b05de62580af97b86dbcdcb5b45e21d05ee104133419808aaab57cbcb6cbf995: Status 404 returned error can't find the container with id b05de62580af97b86dbcdcb5b45e21d05ee104133419808aaab57cbcb6cbf995 Dec 05 19:15:01 crc kubenswrapper[4756]: I1205 19:15:01.465630 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" event={"ID":"3f8b4c8e-de00-4289-80b0-3f5469905f81","Type":"ContainerStarted","Data":"b3995d2b67e9bd25cfcbdec0033fb87420e7481e34699c5436d7bbdd858a807b"} Dec 05 19:15:01 crc kubenswrapper[4756]: I1205 19:15:01.465902 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" event={"ID":"3f8b4c8e-de00-4289-80b0-3f5469905f81","Type":"ContainerStarted","Data":"b05de62580af97b86dbcdcb5b45e21d05ee104133419808aaab57cbcb6cbf995"} Dec 05 19:15:01 crc kubenswrapper[4756]: I1205 19:15:01.495795 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" podStartSLOduration=1.495772487 
podStartE2EDuration="1.495772487s" podCreationTimestamp="2025-12-05 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 19:15:01.482395497 +0000 UTC m=+10761.721068999" watchObservedRunningTime="2025-12-05 19:15:01.495772487 +0000 UTC m=+10761.734446019" Dec 05 19:15:02 crc kubenswrapper[4756]: I1205 19:15:02.480982 4756 generic.go:334] "Generic (PLEG): container finished" podID="3f8b4c8e-de00-4289-80b0-3f5469905f81" containerID="b3995d2b67e9bd25cfcbdec0033fb87420e7481e34699c5436d7bbdd858a807b" exitCode=0 Dec 05 19:15:02 crc kubenswrapper[4756]: I1205 19:15:02.481251 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" event={"ID":"3f8b4c8e-de00-4289-80b0-3f5469905f81","Type":"ContainerDied","Data":"b3995d2b67e9bd25cfcbdec0033fb87420e7481e34699c5436d7bbdd858a807b"} Dec 05 19:15:03 crc kubenswrapper[4756]: I1205 19:15:03.935285 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.085929 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8b4c8e-de00-4289-80b0-3f5469905f81-secret-volume\") pod \"3f8b4c8e-de00-4289-80b0-3f5469905f81\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.086042 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwrw2\" (UniqueName: \"kubernetes.io/projected/3f8b4c8e-de00-4289-80b0-3f5469905f81-kube-api-access-vwrw2\") pod \"3f8b4c8e-de00-4289-80b0-3f5469905f81\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.086128 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8b4c8e-de00-4289-80b0-3f5469905f81-config-volume\") pod \"3f8b4c8e-de00-4289-80b0-3f5469905f81\" (UID: \"3f8b4c8e-de00-4289-80b0-3f5469905f81\") " Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.087261 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f8b4c8e-de00-4289-80b0-3f5469905f81-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f8b4c8e-de00-4289-80b0-3f5469905f81" (UID: "3f8b4c8e-de00-4289-80b0-3f5469905f81"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.088624 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f8b4c8e-de00-4289-80b0-3f5469905f81-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.095840 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8b4c8e-de00-4289-80b0-3f5469905f81-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f8b4c8e-de00-4289-80b0-3f5469905f81" (UID: "3f8b4c8e-de00-4289-80b0-3f5469905f81"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.099274 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8b4c8e-de00-4289-80b0-3f5469905f81-kube-api-access-vwrw2" (OuterVolumeSpecName: "kube-api-access-vwrw2") pod "3f8b4c8e-de00-4289-80b0-3f5469905f81" (UID: "3f8b4c8e-de00-4289-80b0-3f5469905f81"). InnerVolumeSpecName "kube-api-access-vwrw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.191659 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f8b4c8e-de00-4289-80b0-3f5469905f81-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.191724 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwrw2\" (UniqueName: \"kubernetes.io/projected/3f8b4c8e-de00-4289-80b0-3f5469905f81-kube-api-access-vwrw2\") on node \"crc\" DevicePath \"\"" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.552520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" event={"ID":"3f8b4c8e-de00-4289-80b0-3f5469905f81","Type":"ContainerDied","Data":"b05de62580af97b86dbcdcb5b45e21d05ee104133419808aaab57cbcb6cbf995"} Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.552579 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b05de62580af97b86dbcdcb5b45e21d05ee104133419808aaab57cbcb6cbf995" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.552672 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416035-slwt4" Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.596923 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"] Dec 05 19:15:04 crc kubenswrapper[4756]: I1205 19:15:04.606945 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415990-jpzpq"] Dec 05 19:15:06 crc kubenswrapper[4756]: I1205 19:15:06.071618 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8deec6-9788-4d17-967a-cfbd1cb7f0b6" path="/var/lib/kubelet/pods/7d8deec6-9788-4d17-967a-cfbd1cb7f0b6/volumes" Dec 05 19:15:13 crc kubenswrapper[4756]: I1205 19:15:13.050059 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:15:13 crc kubenswrapper[4756]: E1205 19:15:13.051068 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:15:17 crc kubenswrapper[4756]: I1205 19:15:17.054871 4756 trace.go:236] Trace[606851216]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (05-Dec-2025 19:15:16.041) (total time: 1013ms): Dec 05 19:15:17 crc kubenswrapper[4756]: Trace[606851216]: [1.013057604s] [1.013057604s] END Dec 05 19:15:24 crc kubenswrapper[4756]: I1205 19:15:24.667573 4756 
scope.go:117] "RemoveContainer" containerID="5a3911457dfbebec3b32ce1a290c71fdfe7f9d45b34d2a3c99eba29454c23f19" Dec 05 19:15:28 crc kubenswrapper[4756]: I1205 19:15:28.051922 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:15:28 crc kubenswrapper[4756]: E1205 19:15:28.053060 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:15:42 crc kubenswrapper[4756]: I1205 19:15:42.053491 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:15:42 crc kubenswrapper[4756]: E1205 19:15:42.054684 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:15:53 crc kubenswrapper[4756]: I1205 19:15:53.284356 4756 generic.go:334] "Generic (PLEG): container finished" podID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerID="f79cec67d91fe39767030ff5e224f289188866c4bbdb07e0444862c45c5c3d89" exitCode=0 Dec 05 19:15:53 crc kubenswrapper[4756]: I1205 19:15:53.284638 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zfcqk/must-gather-qk6sq" event={"ID":"fe24f4b8-5eb6-4027-9445-6c57b390876e","Type":"ContainerDied","Data":"f79cec67d91fe39767030ff5e224f289188866c4bbdb07e0444862c45c5c3d89"} Dec 05 19:15:53 crc kubenswrapper[4756]: I1205 19:15:53.286838 4756 scope.go:117] "RemoveContainer" containerID="f79cec67d91fe39767030ff5e224f289188866c4bbdb07e0444862c45c5c3d89" Dec 05 19:15:54 crc kubenswrapper[4756]: I1205 19:15:54.315062 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zfcqk_must-gather-qk6sq_fe24f4b8-5eb6-4027-9445-6c57b390876e/gather/0.log" Dec 05 19:15:56 crc kubenswrapper[4756]: I1205 19:15:56.049435 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:15:56 crc kubenswrapper[4756]: E1205 19:15:56.050223 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:16:02 crc kubenswrapper[4756]: I1205 19:16:02.512315 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zfcqk/must-gather-qk6sq"] Dec 05 19:16:02 crc kubenswrapper[4756]: I1205 19:16:02.513833 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zfcqk/must-gather-qk6sq" podUID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerName="copy" 
containerID="cri-o://5f7b75922371ffef2dc56a71d59daa7bacbc06c86a493d49a422f23b0c2db981" gracePeriod=2 Dec 05 19:16:02 crc kubenswrapper[4756]: I1205 19:16:02.529969 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zfcqk/must-gather-qk6sq"] Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.423104 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zfcqk_must-gather-qk6sq_fe24f4b8-5eb6-4027-9445-6c57b390876e/copy/0.log" Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.423948 4756 generic.go:334] "Generic (PLEG): container finished" podID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerID="5f7b75922371ffef2dc56a71d59daa7bacbc06c86a493d49a422f23b0c2db981" exitCode=143 Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.666870 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zfcqk_must-gather-qk6sq_fe24f4b8-5eb6-4027-9445-6c57b390876e/copy/0.log" Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.667490 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/must-gather-qk6sq" Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.687602 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe24f4b8-5eb6-4027-9445-6c57b390876e-must-gather-output\") pod \"fe24f4b8-5eb6-4027-9445-6c57b390876e\" (UID: \"fe24f4b8-5eb6-4027-9445-6c57b390876e\") " Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.688083 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rd6h\" (UniqueName: \"kubernetes.io/projected/fe24f4b8-5eb6-4027-9445-6c57b390876e-kube-api-access-2rd6h\") pod \"fe24f4b8-5eb6-4027-9445-6c57b390876e\" (UID: \"fe24f4b8-5eb6-4027-9445-6c57b390876e\") " Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.708134 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe24f4b8-5eb6-4027-9445-6c57b390876e-kube-api-access-2rd6h" (OuterVolumeSpecName: "kube-api-access-2rd6h") pod "fe24f4b8-5eb6-4027-9445-6c57b390876e" (UID: "fe24f4b8-5eb6-4027-9445-6c57b390876e"). InnerVolumeSpecName "kube-api-access-2rd6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.791604 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rd6h\" (UniqueName: \"kubernetes.io/projected/fe24f4b8-5eb6-4027-9445-6c57b390876e-kube-api-access-2rd6h\") on node \"crc\" DevicePath \"\"" Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.876133 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe24f4b8-5eb6-4027-9445-6c57b390876e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fe24f4b8-5eb6-4027-9445-6c57b390876e" (UID: "fe24f4b8-5eb6-4027-9445-6c57b390876e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:16:03 crc kubenswrapper[4756]: I1205 19:16:03.894027 4756 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe24f4b8-5eb6-4027-9445-6c57b390876e-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 19:16:04 crc kubenswrapper[4756]: I1205 19:16:04.091933 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe24f4b8-5eb6-4027-9445-6c57b390876e" path="/var/lib/kubelet/pods/fe24f4b8-5eb6-4027-9445-6c57b390876e/volumes" Dec 05 19:16:04 crc kubenswrapper[4756]: I1205 19:16:04.436573 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zfcqk_must-gather-qk6sq_fe24f4b8-5eb6-4027-9445-6c57b390876e/copy/0.log" Dec 05 19:16:04 crc kubenswrapper[4756]: I1205 19:16:04.437261 4756 scope.go:117] "RemoveContainer" containerID="5f7b75922371ffef2dc56a71d59daa7bacbc06c86a493d49a422f23b0c2db981" Dec 05 19:16:04 crc kubenswrapper[4756]: I1205 19:16:04.437350 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zfcqk/must-gather-qk6sq" Dec 05 19:16:04 crc kubenswrapper[4756]: I1205 19:16:04.458401 4756 scope.go:117] "RemoveContainer" containerID="f79cec67d91fe39767030ff5e224f289188866c4bbdb07e0444862c45c5c3d89" Dec 05 19:16:07 crc kubenswrapper[4756]: I1205 19:16:07.050260 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:16:07 crc kubenswrapper[4756]: E1205 19:16:07.050929 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.075108 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gh5bf"] Dec 05 19:16:15 crc kubenswrapper[4756]: E1205 19:16:15.076485 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerName="gather" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.076506 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerName="gather" Dec 05 19:16:15 crc kubenswrapper[4756]: E1205 19:16:15.076537 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8b4c8e-de00-4289-80b0-3f5469905f81" containerName="collect-profiles" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.076550 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8b4c8e-de00-4289-80b0-3f5469905f81" containerName="collect-profiles" Dec 05 19:16:15 crc kubenswrapper[4756]: E1205 19:16:15.076581 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerName="copy" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.076595 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerName="copy" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.077038 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8b4c8e-de00-4289-80b0-3f5469905f81" 
containerName="collect-profiles" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.077059 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerName="gather" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.077082 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe24f4b8-5eb6-4027-9445-6c57b390876e" containerName="copy" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.084505 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.104297 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gh5bf"] Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.265034 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-utilities\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.265111 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bml6\" (UniqueName: \"kubernetes.io/projected/7e801001-cf18-4068-9071-9894bca043c7-kube-api-access-4bml6\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.265188 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-catalog-content\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.366656 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-utilities\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.366736 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bml6\" (UniqueName: \"kubernetes.io/projected/7e801001-cf18-4068-9071-9894bca043c7-kube-api-access-4bml6\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.366836 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-catalog-content\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.367288 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-utilities\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " 
pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.367292 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-catalog-content\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.388700 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bml6\" (UniqueName: \"kubernetes.io/projected/7e801001-cf18-4068-9071-9894bca043c7-kube-api-access-4bml6\") pod \"community-operators-gh5bf\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.438406 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:15 crc kubenswrapper[4756]: I1205 19:16:15.973309 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gh5bf"] Dec 05 19:16:16 crc kubenswrapper[4756]: I1205 19:16:16.598855 4756 generic.go:334] "Generic (PLEG): container finished" podID="7e801001-cf18-4068-9071-9894bca043c7" containerID="07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606" exitCode=0 Dec 05 19:16:16 crc kubenswrapper[4756]: I1205 19:16:16.598912 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gh5bf" event={"ID":"7e801001-cf18-4068-9071-9894bca043c7","Type":"ContainerDied","Data":"07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606"} Dec 05 19:16:16 crc kubenswrapper[4756]: I1205 19:16:16.599126 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gh5bf" event={"ID":"7e801001-cf18-4068-9071-9894bca043c7","Type":"ContainerStarted","Data":"36b2468a10cff3557cddeb9ea32d26bf7af0b6f1847b48521182d8578f8be479"} Dec 05 19:16:16 crc kubenswrapper[4756]: I1205 19:16:16.603636 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 19:16:18 crc kubenswrapper[4756]: I1205 19:16:18.639717 4756 generic.go:334] "Generic (PLEG): container finished" podID="7e801001-cf18-4068-9071-9894bca043c7" containerID="fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2" exitCode=0 Dec 05 19:16:18 crc kubenswrapper[4756]: I1205 19:16:18.639813 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gh5bf" event={"ID":"7e801001-cf18-4068-9071-9894bca043c7","Type":"ContainerDied","Data":"fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2"} Dec 05 19:16:19 crc kubenswrapper[4756]: I1205 19:16:19.050420 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:16:19 crc kubenswrapper[4756]: E1205 19:16:19.050726 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:16:19 crc 
kubenswrapper[4756]: I1205 19:16:19.651666 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gh5bf" event={"ID":"7e801001-cf18-4068-9071-9894bca043c7","Type":"ContainerStarted","Data":"e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998"} Dec 05 19:16:19 crc kubenswrapper[4756]: I1205 19:16:19.677507 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gh5bf" podStartSLOduration=2.237953756 podStartE2EDuration="4.67749328s" podCreationTimestamp="2025-12-05 19:16:15 +0000 UTC" firstStartedPulling="2025-12-05 19:16:16.603302419 +0000 UTC m=+10836.841975931" lastFinishedPulling="2025-12-05 19:16:19.042841903 +0000 UTC m=+10839.281515455" observedRunningTime="2025-12-05 19:16:19.674026029 +0000 UTC m=+10839.912699551" watchObservedRunningTime="2025-12-05 19:16:19.67749328 +0000 UTC m=+10839.916166802" Dec 05 19:16:25 crc kubenswrapper[4756]: I1205 19:16:25.439716 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:25 crc kubenswrapper[4756]: I1205 19:16:25.440378 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:25 crc kubenswrapper[4756]: I1205 19:16:25.809084 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:25 crc kubenswrapper[4756]: I1205 19:16:25.872040 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:26 crc kubenswrapper[4756]: I1205 19:16:26.069732 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gh5bf"] Dec 05 19:16:27 crc kubenswrapper[4756]: I1205 19:16:27.753142 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gh5bf" podUID="7e801001-cf18-4068-9071-9894bca043c7" containerName="registry-server" containerID="cri-o://e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998" gracePeriod=2 Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.282440 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.397641 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bml6\" (UniqueName: \"kubernetes.io/projected/7e801001-cf18-4068-9071-9894bca043c7-kube-api-access-4bml6\") pod \"7e801001-cf18-4068-9071-9894bca043c7\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.397801 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-catalog-content\") pod \"7e801001-cf18-4068-9071-9894bca043c7\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.398027 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-utilities\") pod \"7e801001-cf18-4068-9071-9894bca043c7\" (UID: \"7e801001-cf18-4068-9071-9894bca043c7\") " Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.399321 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-utilities" (OuterVolumeSpecName: "utilities") pod "7e801001-cf18-4068-9071-9894bca043c7" (UID: "7e801001-cf18-4068-9071-9894bca043c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.403849 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e801001-cf18-4068-9071-9894bca043c7-kube-api-access-4bml6" (OuterVolumeSpecName: "kube-api-access-4bml6") pod "7e801001-cf18-4068-9071-9894bca043c7" (UID: "7e801001-cf18-4068-9071-9894bca043c7"). InnerVolumeSpecName "kube-api-access-4bml6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.473588 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e801001-cf18-4068-9071-9894bca043c7" (UID: "7e801001-cf18-4068-9071-9894bca043c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.500368 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.500697 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bml6\" (UniqueName: \"kubernetes.io/projected/7e801001-cf18-4068-9071-9894bca043c7-kube-api-access-4bml6\") on node \"crc\" DevicePath \"\"" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.500797 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e801001-cf18-4068-9071-9894bca043c7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.769734 4756 generic.go:334] "Generic (PLEG): container finished" podID="7e801001-cf18-4068-9071-9894bca043c7" containerID="e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998" exitCode=0 Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.769774 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gh5bf" event={"ID":"7e801001-cf18-4068-9071-9894bca043c7","Type":"ContainerDied","Data":"e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998"} Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.771128 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gh5bf" event={"ID":"7e801001-cf18-4068-9071-9894bca043c7","Type":"ContainerDied","Data":"36b2468a10cff3557cddeb9ea32d26bf7af0b6f1847b48521182d8578f8be479"} Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.771198 4756 scope.go:117] "RemoveContainer" containerID="e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.769827 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gh5bf" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.821730 4756 scope.go:117] "RemoveContainer" containerID="fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.836302 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gh5bf"] Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.846651 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gh5bf"] Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.853841 4756 scope.go:117] "RemoveContainer" containerID="07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.925969 4756 scope.go:117] "RemoveContainer" containerID="e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998" Dec 05 19:16:28 crc kubenswrapper[4756]: E1205 19:16:28.929439 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998\": container with ID starting with e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998 not found: ID does not exist" containerID="e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.929484 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998"} err="failed to get container status \"e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998\": rpc error: code = NotFound desc = could not find container \"e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998\": container with ID starting with e922353565c41d298f9ebaf00053d338617ee69d19888491cc48dfebeabc4998 not found: ID does not exist" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.929511 4756 scope.go:117] "RemoveContainer" containerID="fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2" Dec 05 19:16:28 crc kubenswrapper[4756]: E1205 19:16:28.929973 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2\": container with ID starting with fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2 not found: ID does not exist" containerID="fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.930018 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2"} err="failed to get container status \"fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2\": rpc error: code = NotFound desc = could not find container \"fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2\": container with ID starting with fe36162b61c13b4afe3e1e65a51f7798f80eca3cca9877aac4e0ed1bb338e8b2 not found: ID does not exist" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.930048 4756 scope.go:117] "RemoveContainer" containerID="07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606" Dec 05 19:16:28 crc kubenswrapper[4756]: E1205 19:16:28.930338 4756 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606\": container with ID starting with 07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606 not found: ID does not exist" containerID="07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606" Dec 05 19:16:28 crc kubenswrapper[4756]: I1205 19:16:28.930357 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606"} err="failed to get container status \"07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606\": rpc error: code = NotFound desc = could not find container \"07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606\": container with ID starting with 07db30d695d6b32b33fd905f339946fea0abe958f945c3cf794ba86a02546606 not found: ID does not exist" Dec 05 19:16:30 crc kubenswrapper[4756]: I1205 19:16:30.072044 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e801001-cf18-4068-9071-9894bca043c7" path="/var/lib/kubelet/pods/7e801001-cf18-4068-9071-9894bca043c7/volumes" Dec 05 19:16:33 crc kubenswrapper[4756]: I1205 19:16:33.050478 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:16:33 crc kubenswrapper[4756]: E1205 19:16:33.052750 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:16:44 crc kubenswrapper[4756]: I1205 19:16:44.049934 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:16:44 crc kubenswrapper[4756]: E1205 19:16:44.050745 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:16:57 crc kubenswrapper[4756]: I1205 19:16:57.050705 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:16:57 crc kubenswrapper[4756]: E1205 19:16:57.051898 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:17:09 crc kubenswrapper[4756]: I1205 19:17:09.050970 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:17:09 crc kubenswrapper[4756]: E1205 19:17:09.052267 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:17:22 crc kubenswrapper[4756]: I1205 19:17:22.051871 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:17:22 crc kubenswrapper[4756]: E1205 19:17:22.052719 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:17:36 crc kubenswrapper[4756]: I1205 19:17:36.049598 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:17:36 crc kubenswrapper[4756]: E1205 19:17:36.050488 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:17:48 crc kubenswrapper[4756]: I1205 19:17:48.233615 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:17:48 crc kubenswrapper[4756]: E1205 19:17:48.235628 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:18:02 crc kubenswrapper[4756]: I1205 19:18:02.050026 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:18:02 crc kubenswrapper[4756]: E1205 19:18:02.051639 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:18:13 crc kubenswrapper[4756]: I1205 19:18:13.050237 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:18:13 crc kubenswrapper[4756]: E1205 19:18:13.050972 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:18:24 crc kubenswrapper[4756]: I1205 19:18:24.049894 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:18:24 crc kubenswrapper[4756]: E1205 19:18:24.051025 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:18:38 crc kubenswrapper[4756]: I1205 19:18:38.050278 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:18:38 crc kubenswrapper[4756]: E1205 19:18:38.051404 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:18:52 crc kubenswrapper[4756]: I1205 19:18:52.050710 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:18:52 crc kubenswrapper[4756]: E1205 19:18:52.051523 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:19:05 crc kubenswrapper[4756]: I1205 19:19:05.054644 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:19:05 crc kubenswrapper[4756]: E1205 19:19:05.055737 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:19:19 crc kubenswrapper[4756]: I1205 19:19:19.049960 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:19:19 crc kubenswrapper[4756]: E1205 19:19:19.051231 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zrvcn_openshift-machine-config-operator(2067f3da-c088-40d8-bc08-1eb971992c56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" 
podUID="2067f3da-c088-40d8-bc08-1eb971992c56" Dec 05 19:19:31 crc kubenswrapper[4756]: I1205 19:19:31.051216 4756 scope.go:117] "RemoveContainer" containerID="dd8fda0746006088fd09caede5ac5d34d70606c3815a94d2eee4a6e84a8ae547" Dec 05 19:19:32 crc kubenswrapper[4756]: I1205 19:19:32.269197 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zrvcn" event={"ID":"2067f3da-c088-40d8-bc08-1eb971992c56","Type":"ContainerStarted","Data":"ecf3d8ff7d3db9641266f3cacf2cb3a180e4ebc6c1eb82c4c367c0d456df32ed"}